Gaming Performance 2015

The issue of FCLK settings might play a big role here. At launch, the default frequency for the communication buffer between the CPU and the PCIe stack was 800 MHz, even though Intel recommended 1000 MHz; the lower default was a consequence of early firmware limitations on Intel's side. Since then, firmware to enable 1000 MHz has been released and most motherboard manufacturers have incorporated it, but it is unclear whether any given motherboard will default to 1000 MHz, and the behavior may vary from one BIOS version to the next. As we test at default settings, our numbers are only ever snapshots in time, but this leads to some interesting differences in discrete GPU performance.

Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top-10/top-25 lists and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine that includes dynamic sound effects and should be fully multi-core enabled.

For low-end graphics, we test at 720p with Ultra settings, whereas for mid- and high-range graphics we bump this up to 1080p, taking the average frame rate as our marker with a scripted version of the built-in benchmark.
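As a rough illustration of what "a scripted version of the built-in benchmark" involves (the executable path, command-line flags, and log format below are hypothetical stand-ins, not the game's actual interface), a minimal Python wrapper might launch the benchmark and pull the average frame rate out of a results log:

```python
import re
import subprocess
from pathlib import Path

# Hypothetical paths: the real binary and log locations depend on the
# install and on the harness actually used for testing.
BENCHMARK_EXE = Path(r"C:\Games\AlienIsolation\AI.exe")
RESULTS_LOG = Path(r"C:\Games\AlienIsolation\benchmark_results.txt")

def run_benchmark(resolution: str, preset: str) -> float:
    """Launch the built-in benchmark and return the reported average FPS.

    Assumes (hypothetically) that the game accepts resolution/preset
    flags and writes a line like 'Average FPS: 57.3' to a results log.
    """
    subprocess.run(
        [str(BENCHMARK_EXE), "-benchmark",
         "-resolution", resolution, "-preset", preset],
        check=True,
    )
    match = re.search(r"Average FPS:\s*([\d.]+)", RESULTS_LOG.read_text())
    if match is None:
        raise RuntimeError("No average FPS line found in benchmark log")
    return float(match.group(1))

if __name__ == "__main__":
    # 720p for low-end cards, 1080p for mid- and high-range cards,
    # Ultra settings in both cases, mirroring the text above.
    for res in ("1280x720", "1920x1080"):
        print(res, run_benchmark(res, "Ultra"))
```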

Alien: Isolation on NVIDIA GTX 770 2GB ($245)

Alien: Isolation on NVIDIA GTX 980 4GB ($560)

Total War: Attila

The Total War franchise moves on to Attila, another development from The Creative Assembly, and a standalone strategy title set in 395 AD where the main storyline puts the gamer in control of the leader of the Huns with the goal of conquering parts of the world. Graphically, the game can render hundreds or thousands of units on screen at once, each with its own individual actions, and it can put some of the big cards to task.

For low-end graphics, we test at 720p with performance settings, recording the average frame rate. With mid- and high-range graphics, we test at 1080p with the quality setting. In both cases, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on AMD R7 240 DDR3 2GB ($70)

Total War: Attila on NVIDIA GTX 770 2GB ($245)

Total War: Attila on NVIDIA GTX 980 4GB ($560)

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit PC shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn't provide graphical presets, but instead opens the options up to users and extends the boundaries, pushing even the most capable systems to their limit with Rockstar's Advanced Game Engine. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, the game creates stunning visuals when cranked up to maximum, along with hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, using only the final part, which combines a flight scene with an in-city drive-by followed by a tanker explosion. For low-end systems we test at 720p on the lowest settings, whereas mid- and high-end graphics play at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS, i.e. frames that take longer than 16.6 ms to render.
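To make the 60 FPS cut-off concrete: a frame rendered in more than 1000/60 ≈ 16.6 ms is, by definition, an under-60 FPS frame. The short Python sketch below (our own illustration of the arithmetic, not the actual test harness) shows how both reported metrics fall out of a list of per-frame times:

```python
def frame_metrics(frame_times_ms: list[float]) -> tuple[float, float]:
    """Return (average FPS, percentage of frames under 60 FPS).

    A frame at exactly 60 FPS takes 1000/60 ~= 16.6 ms, so any frame
    time above that threshold counts as an under-60 FPS frame.
    """
    total_ms = sum(frame_times_ms)
    avg_fps = 1000.0 * len(frame_times_ms) / total_ms
    slow = sum(1 for t in frame_times_ms if t > 1000.0 / 60.0)
    pct_under_60 = 100.0 * slow / len(frame_times_ms)
    return avg_fps, pct_under_60

# Example: a mostly smooth run with two frame-time spikes.
times = [14.2, 15.8, 16.1, 33.0, 15.0, 16.4, 41.5, 15.5]
avg, pct = frame_metrics(times)
print(f"Average: {avg:.1f} FPS, {pct:.1f}% of frames under 60 FPS")
```

Note that a healthy average can hide a sizeable under-60 FPS percentage, which is exactly why we report both numbers.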

Grand Theft Auto V on AMD R7 240 DDR3 2GB ($70)

Grand Theft Auto V on NVIDIA GTX 770 2GB ($245)

Grand Theft Auto V on NVIDIA GTX 980 4GB ($560)

GRID: Autosport

No graphics test suite is complete without some input from Codemasters and the EGO engine, which means for this round of testing we point towards GRID: Autosport, the next iteration in the GRID racing franchise. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making 'authenticity' a main focal point for this version.

GRID's benchmark mode is very flexible, and as a result we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, usually finishing second or third. For low-end graphics we test at 1080p with medium settings, whereas mid- and high-end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.

GRID: Autosport on AMD R7 240 DDR3 2GB ($70)

GRID: Autosport on NVIDIA GTX 770 2GB ($245)

GRID: Autosport on NVIDIA GTX 980 4GB ($560)

Middle-Earth: Shadow of Mordor

The final title in our testing is another system performance battle in the form of the open-world action-adventure title Shadow of Mordor. Produced by Monolith using the LithTech Jupiter EX engine and numerous detail add-ons, SoM goes for detail and complexity to a large extent, despite having been cut down from the original plans. The main story was written by the same writer as Red Dead Redemption, and the game received Zero Punctuation's Game of the Year for 2014.

For testing purposes, SoM provides a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor. As a result, we get several tests out of the in-game benchmark. For low-end graphics we test at 720p with low settings, whereas mid- and high-end graphics get 1080p Ultra. The top graphics test is repeated at 3840x2160, also with Ultra settings, and we test two cards at 4K where possible.
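As a quick sketch of the arithmetic behind that setting (our own illustration; the game may express the scale as a percentage rather than a multiplier), a render scale maps the monitor's output resolution to a larger internal resolution that is then downsampled:

```python
def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and linear scale.

    A scale of 2.0 on a 1920x1080 panel renders internally at 3840x2160
    before downsampling, which is how 4K-class results can be produced
    on a 1080p monitor.
    """
    return round(out_w * scale), round(out_h * scale)

print(render_resolution(1920, 1080, 2.0))  # (3840, 2160)
```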

Shadow of Mordor on AMD R7 240 DDR3 2GB ($70)

Shadow of Mordor on NVIDIA GTX 770 2GB ($245)

Shadow of Mordor on NVIDIA GTX 980 4GB ($560)

31 Comments

  • SetiroN - Monday, October 17, 2016 - link

    There is only one thing that's worse than camo: pixelized camo.

    I honestly fail to understand who in the world would ever buy a socket 1151 Xeon solution instead of socket 2011.
  • dave_the_nerd - Monday, October 17, 2016 - link

    1) Digital camo has been standard-issue in the military for a while now.

    2) Anybody who only needs a 4c/8t system, but is otherwise doing "workstation" or server-grade work. (Uptime requirements, longevity requirements, need ECC ram for data crunching, need virtualization features, etc.)
  • zepi - Monday, October 17, 2016 - link

    4c/8t LGA2011 solution hardly costs much more, especially since this board is approaching the pricing of workstation mobos...
  • Einy0 - Monday, October 17, 2016 - link

    2) The supposed advantages are 95% marketing. Uptime is more about your OS, provided you select quality components to go with the CPU. Longevity, seriously??? I can show you desktops built 30+ years ago that run today the same as they did then. How many CPUs actually die? I've personally had one die, and it was 7 years old. Virtualization: again, no more features on the 1151 Xeon versus the i7. ECC, that's the one feature an 1151 Xeon has over a similar i7. Now when we start talking multi-socket and whatnot, well, that's obvious. I've had these conversations in the past with engineers and developers at work. Everyone just assumes that when Intel says they need a Xeon to do something, there is a reason. Yup, there is a reason: they can make more money from the same chip with a Xeon badge on it.
  • HollyDOL - Tuesday, October 18, 2016 - link

    Yep, you can show 30-year-old desktops still working, but how many of them were running 24/7? None.
  • mkaibear - Tuesday, October 18, 2016 - link

    Up until very recently I had a desktop of about that vintage running SCO Unix. That ran 24/7. In fact we were scared to turn it off because it ran chunks of the factory...
  • devol - Saturday, October 22, 2016 - link

    There are more differences than just ECC memory. For instance, i7 CPUs don't support hugetlb/hugepages and several other 'server'-focused virtualization extensions. Until Skylake, though, the PCH had basically no support for the features needed for SR-IOV.
  • bigboxes - Monday, October 17, 2016 - link

    I'm sorry. I can't see the motherboard. Where is it in the picture?
  • stardude82 - Friday, November 18, 2016 - link

    I think it's generally acknowledged now that the digital camouflage was a failure.
    https://en.wikipedia.org/wiki/MultiCam#United_Stat...
  • BrokenCrayons - Monday, October 17, 2016 - link

    Yeah, it's really off-putting to see camo. I think they're going for some kind of military/tactical thing, but Gigabyte's failed to realize that camo just makes a product look trashy and redneck to people in the US these days.
