Graphics Performance Comparison

With the background and context of the benchmark covered, we can now dig into the data and see what DirectX 12 game performance has in store. The benchmark ships with preconfigured batch files that launch the utility at either 3840x2160 (4K) with ultra settings, 1920x1080 (1080p) also at ultra, or 1280x720 (720p) with low settings better suited to integrated graphics.
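For readers scripting their own runs, the three presets reduce to three command lines. A minimal sketch follows; the executable name and flags here are hypothetical stand-ins for whatever the shipped batch files actually invoke, so treat this as an illustration of the preset matrix rather than the real launch syntax:

```python
# The three presets the batch files wrap: resolution plus quality level.
# "FableBenchmark.exe" and its flags are hypothetical, not the real CLI.
PRESETS = {
    "4K Ultra":    {"width": 3840, "height": 2160, "quality": "ultra"},
    "1080p Ultra": {"width": 1920, "height": 1080, "quality": "ultra"},
    "720p Low":    {"width": 1280, "height": 720,  "quality": "low"},
}

def build_command(preset_name):
    """Return the launch command for one preset as an argument list."""
    p = PRESETS[preset_name]
    return [
        "FableBenchmark.exe",
        f"-width={p['width']}",
        f"-height={p['height']}",
        f"-quality={p['quality']}",
    ]

if __name__ == "__main__":
    for name in PRESETS:
        print(name, "->", " ".join(build_command(name)))
```

Wrapping each `build_command` result in a batch file is all the preconfigured launchers are doing.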

Fable Legends Beta: 3840x2160 Ultra, Core i7

Fable Legends Beta: 3840x2160 Ultra, Core i5

Fable Legends Beta: 3840x2160 Ultra, Core i3

At 3840x2160, the GTX 980 Ti holds a single-digit percentage lead over the AMD Fury X, but both stay above the bare minimum of 30 FPS regardless of which CPU is used.

Fable Legends Beta: 1920x1080 Ultra, Core i7

Fable Legends Beta: 1920x1080 Ultra, Core i5

Fable Legends Beta: 1920x1080 Ultra, Core i3

With the i5 and i7 at 1920x1080 ultra settings, the GTX 980 Ti keeps that single-digit percentage lead, but at Core i3 levels of CPU power the difference shrinks to next to zero, suggesting a CPU limit even though the frame rate difference from i3 to i5 is minimal. Looking at the range of cards under the Core i7, the interesting result is that the GTX 970 just about hits the 60 FPS mark, while some of the older generation cards (7970/GTX 680) would require compromises in settings to clear 60 FPS at this resolution. The GTX 750 Ti doesn't come anywhere close, suggesting that this game (under these settings) targets upper-mainstream to lower high-end cards. It would be interesting to see whether a single overriding game setting is what cripples this class of GPU.

Fable Legends Beta: 1280x720 Low, Core i7

Fable Legends Beta: 1280x720 Low, Core i5

Fable Legends Beta: 1280x720 Low, Core i3

At 720p low settings, the Core i7 pushes everything above 60 FPS, but you need at least an AMD 7970/GTX 960 to approach 120 FPS for high refresh rate panels. We are likely held back by CPU performance, as illustrated by the GTX 970 and GTX 980 Ti being practically tied while the R9 290X steps ahead of the pack. This makes integrated graphics an interesting prospect, which we may test in a later article. It is worth noting that at this low resolution, the R9 290X and Fury X pull out a minor lead over the NVIDIA cards, and the Fury X extends that lead with the i5 and i3 configurations, just crossing into double-digit percentage gains.
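The "single digit" and "double digit" leads discussed throughout are simple relative deltas between average frame rates. A quick sketch of the calculation, using illustrative FPS figures rather than our measured results:

```python
def percent_lead(fps_a, fps_b):
    """Percentage by which card A leads card B in average FPS."""
    return (fps_a / fps_b - 1.0) * 100.0

def lead_class(pct):
    """Bucket a lead the way the text describes it."""
    pct = abs(pct)
    if pct < 1.0:
        return "next to zero"
    if pct < 10.0:
        return "single digit"
    return "double digit"

# Illustrative numbers only: a ~7% gap is a "single digit" lead,
# while two cards within a frame of each other hint at a CPU limit.
lead = percent_lead(64.2, 60.0)
print(round(lead, 1), lead_class(lead))  # → 7.0 single digit
```

Two cards landing in the "next to zero" bucket across multiple CPUs is the pattern we read as a CPU bottleneck above.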

Fable Legends Early Preview: DirectX 12 Benchmark Analysis CPU Scaling

  • TheJian - Saturday, September 26, 2015 - link

    "There is a big caveat to remember, though. In power consumption tests, our GPU test rig pulled 449W at the wall socket when equipped with an R9 390X, versus 282W with a GTX 980. The delta between the R9 390 and GTX 970 was similar, at 121W. "

You seem to see through rose-colored glasses. At these kinds of watt differences you SHOULD dominate everything...LOL. Meanwhile NV guys have plenty of watts to OC and laugh. You're completely ignoring the cost of watts these days; we're talking a 100W bulb running for hours on end over the 3-7yrs many of us keep our cards. You're also forgetting that most cards can hit Strix speeds anyway, right? NOBODY buys stock when you can buy an OC version from all vendors for not much more.

    "Early tests have shown that the scheduling hardware in AMD's graphics chips tends to handle async compute much more gracefully than Nvidia's chips do. That may be an advantage AMD carries over into the DX12 generation of games. However, Nvidia says its Maxwell chips can support async compute in hardware—it's just not enabled yet. We'll have to see how well async compute works on newer GeForces once Nvidia turns on its hardware support."

You also seem to ignore that in your own link (techreport), they state NV has async turned off for now. I'm guessing they're just waiting for all the DX12 stuff to hit, see if AMD can catch them, then boom, hello more perf...LOL.
    "Thanks in part to that humongous cooler, the Strix has easily the highest default clock speeds of any card in this group, with a 1216MHz base and 1317MHz boost"
A little less than you say, but yes, NV gives you free room to run at WHATEVER your card can do within the allowed limit. Unlike AMD's "UP TO" crap, with NV you get GUARANTEED X, and more if available. I prefer the latter. $669 at Amazon for the STRIX, so for $20 I'll take the massive gain in perf (cheapest 980 Ti at Newegg is $650). I'll get it back in watts saved on electricity in no time. You completely ignore Total Cost of Ownership, not to mention DRIVERS and how RARE AMD driver drops are. NV puts out a WHQL driver monthly or more.
    Any time you offer me ~15% perf for 3% cost I'll take it. If you tell me electric costs mean nothing, in the same sentence I'll tell you $20 means nothing then, on the price of card most live with for years.

Frostbite is NOT brand agnostic. Cough, Mantle, $8mil funding, cough... The fact that MANY games run better in DX11 for NV is just DRIVERS and time spent with DEVS (Witcher 3, Project Cars etc.; the devs said this). This should be no surprise when R&D has been down for 4yrs at AMD while the reverse is true at NV (who now spends more on R&D than AMD, who has a larger product line).

Shocker that ASHES looks good for AMD when it was a MANTLE engine game...ROFL. Jeez guy... Even funnier that once NV optimized for Star Swarm they had massive DX12 improvements and BEAT AMD in it, not to mention the massive DX11 improvement too (which AMD ignored). Gamers should look at who has the funding to keep up in DX11 for a while too, correct? AMD seems to have moved on to DX12 (not good for those poor gamers who can't afford new stuff, right?). You only seem to see the arguments for YOUR side. As near as I can tell, NV looks good except where I will not play (1280x720, or crap CPUs). Also, you're basing all your conclusions on BETA games and the current state of drivers before any of this stuff is real...LOL. You can call the Unreal 4 engine unrealistic, but I'll remind you the Unreal engine has been used in TONS of games over the last two decades, so AMD had better be good here at some point. You can't repeatedly lose in one of the most prolific engines on the planet, right? You can't just claim "that engine is biased" and ignore the REALITY that it will be used a LOT. If all engines were BIASED towards AMD, I would buy AMD no matter what NV put out, if AMD wins everything...ROFL. I don't care about the engine, I care about the result of the cards running the games I play. If NV pays off every engine designer, I'll buy NV because...well, DUH. You can whine all you want, but GAMERS are buying 82% NV for a reason. I bought an Intel i7 for a REASON. I don't care if they cheat, pay someone off, use proprietary tech etc., as long as they win, I'll buy it. I might complain about the cheating, but if it wins, I'll buy it anyway...LOL.

    IE, I don't have to LIKE Donald Trump to understand he knows how to MAKE money, unlike most of congress/Potus. He's pretty famous for FIRING people too, which again, congress/potus have no idea how to get done apparently. They also have no idea how to manage a budget, which again, TRUMP does. They have no idea how to protect the border, despite claiming they'll do it for a decade or two. I'll take that WALL please trump (which works in israel, china, etc), no matter how much it costs compared to decades of welfare losses, education dropping, medical going to illegals etc. The wall is CHEAP (like an NV card over 3-7yrs of usage at 120w+ or more savings as your link shows). I can hate trump (or Intel, or NV) and still recognize the value of his business skills, negotiation skills, firing skills, budget skills etc. Get it? If ZEN doesn't BURY Intel in perf, I'll buy another i7 for my dad...LOL.
Even AnandTech hit Strix speeds with a reference card. 250MHz of free core clock on top of 1000MHz? OK, sign me up. Four months later, likely everything does this or more, as manufacturing only improves over time. All of NV's cards OC well except for the bottom rungs. Call me when AMD wins where most gamers play (above 720p and with good CPUs). Yes, DX12 bodes well for poor people and AMD's crap CPUs. But I'm neither. Hopefully ZEN fixes the CPU side so I can buy AMD again. They still have a shot at my die-shrunk GPU purchase next year too, but not if they completely ignore DX11, keep failing to put out game-ready drivers, lose the watt war, etc. ZEN's success (or not) will probably influence my GPU purchase too. If ZEN benchmarks suck there will probably be no profits to make their GPU drivers better etc. Think BIGGER.
  • anubis44 - Friday, October 30, 2015 - link

As already mentioned, nVidia pulled out the seats, the parachutes and anything else they could unscrew and threw them out of the airplane to lighten the load. Maxwell's low power usage comes at a price, like no hardware-based scheduler, and DX12 games will now frequently make use of one for context switching and dynamic reallocation of shaders between rendering and compute. Why? Because the XBOX One and the PS4, having AMD Radeon GCN cores, can do this. So in the interest of getting power usage down, nVidia left out a hardware feature even the PS4 and XBOX One GPUs have. Does that sound smart? It's called 'marketing': "Hey look! Our card uses LESS POWER than the Radeon! It's because we're using super-duper, secret technologies!" No, you're leaving stuff off the die. No wonder it uses less power.
  • RussianSensation - Thursday, September 24, 2015 - link

The 925MHz HD 7970 is beating the GTX 960 by 32%. The R9 280X currently sells for $190 on Newegg and has another 13.5% increase in GPU clocks, which implies it would beat the 960 by a whopping 40-45%!

The R9 290X beating the 970 by 13% in a UE4 engine is extremely uncharacteristic; I can't recall this ever happening. Also, other sites are showing the $280 R9 390 on the heels of the $450 GTX 980.

That's an extremely bad showing for NV in every competing price segment except for the 980 Ti. And because UE4 has significantly favoured NV's cards under DX11, this is actually a game engine that should have favoured NV's Maxwell as much as possible. Now imagine DX12 in a brand-agnostic game engine like CryEngine or Frostbite.

In the end it's not going to matter to gamers who upgrade every 2 years, but budget gamers who cannot afford to do so should pay attention.
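As an editorial aside, the clock-scaling arithmetic in the comment above can be sanity-checked. If performance scaled perfectly with clock speed (it rarely does), a 32% lead compounded with a 13.5% clock increase would come to roughly 50%, so the quoted 40-45% implicitly assumes well under linear scaling:

```python
def compounded_lead(base_lead_pct, clock_gain_pct, scaling=1.0):
    """Lead after a clock bump, assuming `scaling` is the fraction of
    linear performance-per-MHz scaling achieved (1.0 = perfectly linear)."""
    return ((1 + base_lead_pct / 100)
            * (1 + clock_gain_pct / 100 * scaling) - 1) * 100

# Perfectly linear scaling would put the card ~50% ahead, not 40-45%:
print(round(compounded_lead(32, 13.5), 1))       # → 49.8 (upper bound)
print(round(compounded_lead(32, 13.5, 0.6), 1))  # → 42.7 (~60% scaling)
```

At around 60% scaling efficiency the result lands inside the commenter's 40-45% window, which is a plausible real-world figure for a clock-only bump.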
  • CiccioB - Friday, September 25, 2015 - link

    925mhz HD7970 is beating GTX960 by 32%

Ahahahah... and that should prove what? That a chip twice as big, consuming twice the energy, can perform 32% better than another?
Oh, sorry, you were speaking about prices... yes... so you are just pointing out that that power-sucking beast has a hard time selling, while the winning micro hero filling nvidia's pockets leaves the competitor to be moved only through stock-clearing operations?
I can't really understand these kinds of comparisons. The GTX 960 runs against the Radeon 285, or now the 380. It performs fantastically for the size of its die and the power it draws, and has pretty much cornered AMD's margins on boards that mount a beefy GPU like Tahiti or Tonga.
The only hope for AMD to come out of this pitiful situation is that with the next generation and a new process node the performance/die-space ratios come closer to the competition, or they won't gain a single cent from the graphics division for a few more years.
  • The_Countess - Friday, September 25, 2015 - link

Ya, you seem to have forgotten that the HD 7970 is 3+ years old while the GTX 960 was released this year, and it has only ~43% more transistors (~4.3 billion vs ~3 billion).

And the only reason nvidia's power consumption is better is that they cut double-precision performance on all their cards down to nothing.
  • MapRef41N93W - Saturday, September 26, 2015 - link

    So wrong it's not even funny. Maybe you aren't aware of this, but small die Kepler already had DP cut. Only GK100/GK110 had full DP with Kepler. That has nothing to do with why GM204/206 have such low power draw. The Maxwell architecture is the main reason.
  • Azix - Saturday, September 26, 2015 - link

    cut hardware scheduler?
  • Asomething - Sunday, September 27, 2015 - link

Sorry to burst your bubble, but nvidia didn't cut DP completely on small Kepler; they cut down some from Fermi but disabled the rest so they could keep DP on their Quadro series, and there were softmods to unlock that DP. For Maxwell they did actually completely cut DP to save on die space and power consumption. AMD did the same for GCN 1.2's Fiji in order to fit it on 28nm.
  • CiccioB - Monday, September 28, 2015 - link

I don't really care how old Tahiti is. I know it was used as a comparison with a chip that is half its size and power consumption ON THE SAME process node. So its age doesn't really matter: same process, so what matters is how good each architecture is.
What counts is that AMD has not done anything radical to improve its architecture. It replaced Tahiti with a similarly beefy GPU, Tonga, which didn't really stand a chance against Maxwell. Those were the new proposals of both companies, Maxwell vs GCN 1.2. See the results.
So again, go and look at how big GM206 is and how much power it draws. Then compare it with Tonga, and the only similarity you can find is the price. nvidia's solution beats AMD's in every respect, bringing AMD's margins to nothing, even though nvidia is still selling its GPU at a higher price than it really deserves.
In reality one should compare Tahiti/Tonga with GM204 on size and power consumption. The results simply put the AMD GCN architecture into the toilet. The only reasonable move was to lower the price so much that they could sell a higher-tier GPU in a lower series of boards.
Performance relative to die space and power consumption doesn't make GCN a hero at anything; it has worsened AMD's position even further compared to the old VLIW architecture, where AMD fought with similar performance but smaller dies (and power consumption).
  • CiccioB - Monday, September 28, 2015 - link

I forgot... about double precision... I still don't care about it. Do you use it in your everyday life? How many professional boards is AMD selling that justify the DP units in such GPUs?
Just for numbers on a well-painted box? DP is a non-necessity for 99% of users.

And apart from that silly point, nvidia's DP units were not present on GK104/GK106 either, so the big efficiency gain was made by improving the architecture (from Kepler to Maxwell), while AMD just moved from GCN 1.0 to GCN 1.2 with almost no efficiency gains.
The problem is not whether DP units are present. It is that AMD could not make its already struggling architecture better in absolute terms than the old version. And with Fiji they demonstrated that they could do even worse, if anyone had any doubts.
