Fable Legends Early Preview: DirectX 12 Benchmark Analysis
by Ryan Smith, Ian Cutress & Daniel Williams on September 24, 2015 9:00 AM EST
Update 2016/03/07: Well, so much for that. Fable Legends has been canceled, so ultimately another game will get to claim the title of the first Unreal Engine 4 based DX12 game.
DirectX 12 is now out in the wild as part of Windows 10 and the updated driver model, WDDM 2.0, that comes with it. Unlike DX11, there are no major gaming titles at launch - we are now waiting for games to take advantage of DX12 and to see what difference it makes to the game-playing experience. One of the main focal points of DX12 is draw calls: leveraging multiple processor cores to dispatch GPU workloads, rather than the previous model of a single core doing most of the work. DX12 brings about a lot of changes with the goal of increasing performance and offering an even more immersive experience, but it does shift some of the support requirements, such as SLI or Crossfire, onto the engine developers. We tackled two synthetic tests earlier this year, Star Swarm and 3DMark, but due to timing and other industry events we are waiting for a better time to test the Ashes of the Singularity benchmark as that game nears completion. In the meantime, a PR team got in contact with us regarding the upcoming Fable Legends title, built on Unreal Engine 4, along with an early access preview benchmark. Here are our results so far.
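To make that multi-core draw call model concrete, below is a minimal sketch of multi-threaded command list recording against the stock Direct3D 12 API. This is purely illustrative rather than code from Fable Legends or Unreal Engine 4: RecordChunk is a hypothetical helper, and the device, queue, allocators, and command lists are assumed to have been created elsewhere.

```cpp
// Minimal sketch: each worker thread records its slice of the scene into
// its own command list, and the results are submitted with a single call.
// Hypothetical illustration; gQueue, allocators, and lists exist elsewhere.
#include <d3d12.h>
#include <thread>
#include <vector>

extern ID3D12CommandQueue* gQueue;  // direct queue, created at startup

// Hypothetical per-thread recording function.
void RecordChunk(ID3D12CommandAllocator* alloc, ID3D12GraphicsCommandList* list)
{
    list->Reset(alloc, nullptr);  // begin recording with this allocator
    // ... record this thread's share of the frame's draw calls ...
    list->Close();                // finish recording
}

void SubmitFrame(std::vector<ID3D12CommandAllocator*>& allocs,
                 std::vector<ID3D12GraphicsCommandList*>& lists)
{
    // Recording, the expensive part under DX11, now spreads across cores.
    std::vector<std::thread> workers;
    for (size_t i = 0; i < lists.size(); ++i)
        workers.emplace_back(RecordChunk, allocs[i], lists[i]);
    for (auto& w : workers)
        w.join();

    // Submission itself is one cheap call on a single thread.
    gQueue->ExecuteCommandLists(
        static_cast<UINT>(lists.size()),
        reinterpret_cast<ID3D12CommandList* const*>(lists.data()));
}
```

The point of the pattern is that the costly work of recording draw calls parallelizes across however many cores are available, while final submission remains a single inexpensive call.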
Fable Legends is an Xbox One/Windows 10 exclusive free-to-play title built by Lionhead Studios in Unreal Engine 4. The game, styled as a ‘cooperative action RPG’, consists of asymmetrical multiplayer matches, with attackers trying to raid a base and the defender playing more of a tower-defense role.
The benchmark provided is more of a graphics showpiece than a representation of the gameplay, in order to show off the capabilities of the engine and the DX12 implementation. Unfortunately, as a result, we didn't get to see any of the actual gameplay, which would seem to focus more on combat. This is one of the first DirectX 12 benchmarks available - Ashes of the Singularity by Stardock was released just before IDF, but due to scheduling we have not had a chance to dig into that one yet. This is therefore our first look at a DirectX 12 game engine with a game attached.
This benchmark pans through several outdoor scenes in a fashion similar to the Unigine Valley benchmark, focusing more on landscapes, distance drawing, and tessellation than on an up-front first-person perspective. Graphical effects such as dynamic global illumination are computed on the fly, producing subtle differences in the lighting, and the day/night cycle is shown accelerated, similar to the large Grand Theft Auto benchmark. The engine itself draws on DX12 explicit features such as ‘asynchronous compute, manual resource barrier tracking, and explicit memory management’, which allow the application to take better advantage of the available hardware and give developers finer control over multi-threaded workloads and GPU memory resources. The updated engine has had several additions to implement these visual effects, and the promise is that the use of DirectX 12 will help to improve both the experience and performance.
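As a hypothetical example of what 'manual resource barrier tracking' means for engine developers, the sketch below issues the kind of explicit state transition that the driver would have inferred automatically under DX11; cmdList and sceneColor stand in for objects the engine would own.

```cpp
// Sketch of an explicit D3D12 resource barrier: the application, not the
// driver, declares when a texture changes usage. Illustrative only;
// cmdList and sceneColor are assumed to be created elsewhere.
#include <d3d12.h>

void TransitionToShaderRead(ID3D12GraphicsCommandList* cmdList,
                            ID3D12Resource* sceneColor)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type                   = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = sceneColor;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
}
```

Getting such transitions right is now the application's responsibility rather than the driver's, which is part of why per-title driver and engine updates matter so much more under DX12.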
The software provided to us is a pre-release version of Fable Legends with early drivers, so the performance at this point is most likely not representative of the game at launch and should improve before release. What we will see here is more of a broad picture of how different GPUs scale when DX12 features are thrown into the mix. In fact, AMD sent us a note that there is a new driver available specifically for this benchmark which should improve the scores on the Fury X, although it arrived too late for this pre-release look at Fable Legends (Ryan did the testing but is covering Samsung’s 950 Pro launch in Korea at this time). This underscores just how early in the game and driver development cycle DirectX 12 is for all players. But as with most important titles, we expect drivers and software updates to continue to drive performance forward as developers and engineers come to understand how the new version of DirectX works.
With that being said, there do not appear to be any stability issues with the benchmark as it stands, and we have had time to test graphics cards going back a few generations for both AMD and NVIDIA. Our pre-release package came with three test standards at 1280x720, 1920x1080, and 4K. We also tested a number of these combinations under multiple CPU core and thread count configurations in order to emulate a number of popular CPUs on the market.
| Component | Configuration |
|---|---|
| CPU | Intel Core i7-4960X in three modes: 'Core i7' (6 cores, 12 threads at 4.2 GHz); 'Core i5' (4 cores, 4 threads at 3.8 GHz); 'Core i3' (2 cores, 4 threads at 3.8 GHz) |
| Motherboard | ASRock Fatal1ty X79 Professional |
| Power Supply | Corsair AX1200i |
| Hard Disk | Samsung SSD 840 EVO (750GB) |
| Memory | G.Skill RipjawZ DDR3-1866, 4 x 8GB (9-10-9-26) |
| Case | NZXT Phantom 630 Windowed Edition |
| Video Cards | AMD Radeon R9 Fury X, R9 290X, R9 285, HD 7970; NVIDIA GeForce GTX 980 Ti, GTX 970 (EVGA), GTX 960, GTX 680, GTX 750 Ti |
| Video Drivers | NVIDIA Release 355.82; AMD Catalyst 15.201.1102 |
All the results in this piece are on discrete GPUs. The benchmark outputs a score, which is merely the average frame rate multiplied by a hundred, but it also dumps an extensive data log tracking over 186 different elements of the system every frame, such as the compute time for various effects in each frame. Our testing takes on three roles: a direct GPU comparison of average frame rates at 1080p and 720p in our i7-4960X mode, CPU scaling at each resolution with the GTX 980 Ti and AMD Fury X, and then a deep analysis of the percentile data of these two graphics cards at each resolution and each CPU configuration.
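To make the scoring arithmetic concrete, the short sketch below shows how a score (average FPS multiplied by one hundred) and a percentile frame time can be derived from per-frame data. It is our illustration of the method described, not the benchmark's own code, and the frame times are made-up values.

```cpp
// Illustrative computation of a benchmark score (average FPS x 100) and a
// 99th-percentile frame time from a per-frame log; the values are invented.
#include <algorithm>
#include <cstdio>
#include <vector>

int main()
{
    // Example frame times in milliseconds, one entry per rendered frame.
    std::vector<double> frameMs = { 16.2, 17.1, 15.8, 33.4, 16.5, 16.0 };

    double totalMs = 0.0;
    for (double t : frameMs) totalMs += t;
    const double avgFps = 1000.0 * frameMs.size() / totalMs;
    const double score  = avgFps * 100.0;  // reported score = avg FPS x 100

    // Percentiles come from sorting the frame times and indexing the tail.
    std::sort(frameMs.begin(), frameMs.end());
    const size_t idx = static_cast<size_t>(0.99 * (frameMs.size() - 1));
    std::printf("Score: %.0f, 99th percentile frame time: %.1f ms\n",
                score, frameMs[idx]);
    return 0;
}
```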
tackle70 - Thursday, September 24, 2015
Nice article. Maybe tech forums can now stop with the "AMD will be vastly superior to Nvidia in DX12" nonsense.
cmdrdredd - Thursday, September 24, 2015
Leads me to believe more and more that Stardock is up to shenanigans just a bit, or that not every game will use certain features that DX12 can perform, and Nvidia is not held back in those games.
Jtaylor1986 - Thursday, September 24, 2015
I'd say Ashes is a far more representative benchmark. What is the point of doing a landscape simulator benchmark? This demo isn't even trying to replicate real-world performance.
cmdrdredd - Thursday, September 24, 2015
Are you nuts or what? This is a benchmark of the game engine used for Fable Legends. It's as good a benchmark as any when trying to determine performance in a specific game engine.
Jtaylor1986 - Thursday, September 24, 2015
Except it's completely unrepresentative of actual gameplay, unless this is a grass-growing simulator.
Jtaylor1986 - Thursday, September 24, 2015
"The benchmark provided is more of a graphics showpiece than a representation of the gameplay, in order to show off the capabilities of the engine and the DX12 implementation. Unfortunately, as a result, we didn't get to see any of the actual gameplay, which would seem to focus more on combat."
LukaP - Thursday, September 24, 2015
You don't need gameplay in a benchmark. You need the benchmark to display the common geometry, lighting, effects, and physics of an engine/backend that drives certain games, and this benchmark does that. If you want to see gameplay, there are many terrific YouTubers who focus on that, namely Markiplier, NerdCubed, TotalBiscuit, and others.
Mr Perfect - Thursday, September 24, 2015
Actual gameplay is still important in benchmarking, mainly because that's when framerates usually tank. An empty level can get fantastic FPS, but drop a dozen players having an intense fight into that level and performance goes to hell pretty fast. That's the situation where we hope to see DX12 outshine DX11.
Stuka87 - Thursday, September 24, 2015
Wrong, a benchmark without gameplay is worthless. Look at Battlefield 4 as an example: its built-in benchmarks are worthless. Once you join a 64-player server, everything changes.
This benchmark shows how a raw engine runs, but is not indicative of how the game will run at all.
Plus, it's super early in development, with drivers that still need work; as the article states, AMD's driver arrived too late.
inighthawki - Thursday, September 24, 2015
Yes, but when the goal is to show improvements in rendering performance, throwing someone into a 64-player match completely skews the results. The CPU overhead of handling a 64-player multiplayer match will far outweigh the small changes in CPU overhead from a new rendering API.