Graphics Performance Comparison

With the background and context of the benchmark covered, we can now dig into the data and see what DirectX 12 game performance has in store. The benchmark ships with preconfigured batch files that launch the utility at one of three presets: 3840x2160 (4K) at ultra settings, 1920x1080 (1080p) also at ultra, or 1280x720 (720p) at low settings better suited to integrated graphics.
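
As a side note on methodology, running each preset several times and averaging helps iron out run-to-run variance. The sketch below shows one way to automate that in Python; the batch file names are hypothetical placeholders to illustrate the idea, not the actual files shipped with the benchmark.

    # Automating repeated runs of the three presets (a sketch; the batch
    # file names below are assumed, not the benchmark's actual file names).
    import subprocess
    import time

    PRESETS = {
        "4K Ultra":    "Benchmark_3840x2160_Ultra.bat",  # hypothetical name
        "1080p Ultra": "Benchmark_1920x1080_Ultra.bat",  # hypothetical name
        "720p Low":    "Benchmark_1280x720_Low.bat",     # hypothetical name
    }

    RUNS_PER_PRESET = 3  # multiple passes smooth out run-to-run variance

    for label, batch_file in PRESETS.items():
        for run in range(1, RUNS_PER_PRESET + 1):
            print(f"{label}: run {run} of {RUNS_PER_PRESET}")
            subprocess.run(["cmd", "/c", batch_file], check=True)
            time.sleep(5)  # let GPU clocks and temperatures settle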

[Charts: Fable Legends Beta, 3840x2160 Ultra, on Core i7, Core i5, and Core i3]

At 3840x2160, the GTX 980 Ti holds a single-digit percentage lead over the AMD Fury X, but both stay above the bare minimum of 30 FPS regardless of the CPU used.

[Charts: Fable Legends Beta, 1920x1080 Ultra, on Core i7, Core i5, and Core i3]

With the i5 and i7 at 1920x1080 ultra settings, the GTX 980 Ti keeps that single-digit percentage lead, but at Core i3 levels of CPU performance the gap shrinks to next to nothing, suggesting we are CPU-limited at this point even though the frame rate difference from i3 to i5 is minimal. Looking at the range of cards under the Core i7, the interesting result is that the GTX 970 just about hits the 60 FPS mark, while some of the older generation cards (7970/GTX 680) would require compromises in the settings to push past the 60 FPS barrier at this resolution. The GTX 750 Ti doesn't come anywhere close, suggesting that this game (under these settings) is targeting upper-mainstream to lower high-end cards. It would be interesting to see if there is a single overriding game setting that ends up crippling this tier of GPU.
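
For clarity, the CPU-limited inference works like this: at a fixed CPU, if a much faster GPU barely improves the frame rate, the bottleneck is not the GPU. A minimal sketch of that rule of thumb, with the FPS values invented purely for illustration:

    # Rule of thumb behind the CPU-limited call above: compare two GPUs on
    # the same CPU; if the gap is tiny, the GPU is not the bottleneck.
    # The FPS values in the example are made up for illustration only.

    def likely_bottleneck(fps_fast_gpu: float, fps_slow_gpu: float,
                          threshold: float = 0.05) -> str:
        gap = abs(fps_fast_gpu - fps_slow_gpu) / max(fps_fast_gpu, fps_slow_gpu)
        return "CPU-bound" if gap < threshold else "GPU-bound"

    # Hypothetical Core i3 pairing: two GPU tiers nearly tied
    print(likely_bottleneck(62.0, 60.5))  # -> CPU-bound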

[Charts: Fable Legends Beta, 1280x720 Low, on Core i7, Core i5, and Core i3]

At 720p low settings, the Core i7 pushes everything above 60 FPS, but you need at least an AMD 7970/GTX 960 to start chasing 120 FPS for high refresh rate panels. We are likely being held back by CPU performance here, as illustrated by the GTX 970 and GTX 980 Ti being practically tied and the R9 290X stepping ahead of the pack. This makes integrated graphics an interesting case, which we might test in a later article. It is worth noting that at this low resolution the R9 290X and Fury X pull out a minor lead over the NVIDIA cards, and the Fury X extends that lead with the i5 and i3 configurations, just rolling over into double-digit percentage gains.
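
To put those refresh-rate targets in frame-time terms: the budget per frame is simply 1000 ms divided by the target FPS, which is what makes 120 FPS so much harder to sustain than 60. A quick illustration:

    # Frame-time budgets implied by the FPS targets discussed above.
    def frame_budget_ms(target_fps: float) -> float:
        return 1000.0 / target_fps  # milliseconds available per frame

    for fps in (30, 60, 120):
        print(f"{fps:>3} FPS -> {frame_budget_ms(fps):5.2f} ms per frame")
    # 30 FPS -> 33.33 ms; 60 FPS -> 16.67 ms; 120 FPS -> 8.33 ms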

Comments

  • tackle70 - Thursday, September 24, 2015

    Nice article. Maybe tech forums can now stop with the "AMD will be vastly superior to Nvidia in DX12" nonsense.
  • cmdrdredd - Thursday, September 24, 2015

    Leads me to believe more and more that Stardock is up to shenanigans just a bit, or that not every game will use certain features that DX12 offers, and Nvidia is not held back in those games.
  • Jtaylor1986 - Thursday, September 24, 2015

    I'd say Ashes is a far more representative benchmark. What is the point of doing a landscape simulator benchmark? This demo isn't even trying to replicate real-world performance.
  • cmdrdredd - Thursday, September 24, 2015

    Are you nuts or what? This is a benchmark of the game engine used for Fable Legends. It's as good a benchmark as any when trying to determine performance in a specific game engine.
  • Jtaylor1986 - Thursday, September 24, 2015

    Except it's completely unrepresentative of actual gameplay, unless the game is a grass-growing simulator.
  • Jtaylor1986 - Thursday, September 24, 2015

    "The benchmark provided is more of a graphics showpiece than a representation of the gameplay, in order to show off the capabilities of the engine and the DX12 implementation. Unfortunately we didn't get to see any gameplay in this benchmark as a result, which would seem to focus more on combat."
  • LukaP - Thursday, September 24, 2015

    You don't need gameplay in a benchmark. You need the benchmark to display the common geometry, lighting, effects, and physics of an engine/backend that drives certain games, and this benchmark does that. If you want to see gameplay, there are many terrific YouTubers who focus on that, namely Markiplier, NerdCubed, TotalBiscuit and others.
  • Mr Perfect - Thursday, September 24, 2015

    Actual gameplay is still important in benchmarking, mainly because that's when framerates usually tank. An empty level can get fantastic FPS, but drop a dozen players having an intense fight into that level and performance goes to hell pretty fast. That's the situation where we hope to see DX12 outshine DX11.
  • Stuka87 - Thursday, September 24, 2015

    Wrong, a benchmark without gameplay is worthless. Look at Battlefield 4 as an example. Its built-in benchmarks are worthless. Once you join a 64-player server, everything changes.

    This benchmark shows how a raw engine runs, but is not indicative of how the game will run at all.

    Plus it's super early in development with drivers that still need work; as the article states, AMD's driver arrived too late.
  • inighthawki - Thursday, September 24, 2015

    Yes, but when the goal is to show improvements in rendering performance, throwing someone into a 64-player match completely skews the results. The CPU overhead of handling a 64-player multiplayer match will far outweigh the small changes in CPU overhead from a new rendering API.
