Gaming Performance

This is the real measure by which the Alienware X51 will be judged. Alienware seems to have crammed as much GPU power as possible into the X51, but I do have to wonder whether AMD's (admittedly now outdated) Radeon HD 6850 might have been a better choice. The HD 6850 certainly fits within the X51's power envelope, but then again, the end user would lose the benefit of NVIDIA's Optimus power-saving technology. That said, the benefits of that technology on the desktop are a little foggier than they are in a notebook. It's a tough call either way, depending on your usage model.

Unfortunately, where our charts come up a bit short is in comparisons. Because we recently revamped our gaming benchmark suite, there's only one system we can compare the X51 to: the recently reviewed AVADirect Silent Gaming PC. That isn't necessarily a fair fight, either, with AVADirect's system costing 2.5 times as much and enjoying a GeForce GTX 580. Still, one data point is better than none; just try to maintain perspective: Alienware is targeting 1080p gaming, our desktop gaming suite is brutal, and the GTX 580 is roughly twice as powerful in hardware.

In case you missed it, note that we're again using the same selection of games as our laptop reviews, only running at 1080p with our Mainstream and Enthusiast settings. Since we don't have all of the previously reviewed systems available, we've included the only 1080p Mainstream results we have right now in one chart.

[Benchmark charts: Batman: Arkham City, Battlefield 3, DiRT 3, The Elder Scrolls V: Skyrim, Portal 2, Total War: Shogun 2]

What's impressive is that the X51 posts fairly strong numbers in our gaming tests. Anti-aliasing is going to be out of the question in many cases (e.g. the Enthusiast results), but Alienware has largely succeeded in what they set out to do: the GTX 555 version can definitely handle our Mainstream 1080p gaming suite.

Where things get a little foggier is Optimus support. Total War: Shogun 2 flat-out refused to run while the IGP was enabled; we had to connect the monitor directly to the video card instead of the IGP's HDMI port to get the game to work. We reported this bug to NVIDIA and Alienware, and since the X51 uses standard NVIDIA drivers, it should hopefully be fixed in the future.

We also discovered a minor hiccup involving the monitor we use for testing desktops: the Acer HN274H can incorrectly report the resolutions it supports, regardless of whether it's connected via HDMI, DVI, or VGA, and this bug reared its ugly head in Civilization V testing on both the X51 and the AVADirect system. Unfortunately, despite working with NVIDIA on the issue, we didn't figure out the monitor was the culprit until the X51 exhibited the same behavior (refusing to benchmark at 1080p and knocking the resolution down to 1680x1050), so we don't have results for AVADirect's tower. That said, the X51 was able to produce over 30 fps in Civilization V (34.7 fps at Enthusiast and 43.6 fps at Mainstream 1080p, to be exact).

Comments

  • Anonymous Blowhard - Friday, February 17, 2012 - link

    "With such a compact design one would expect the X51 to be both loud and hot, but surprisingly this isn't the case. Quite the opposite actually; the X51 is cooler and quieter at both idle and load than the first-generation Xbox 360 was."

    I'm pretty sure I've heard quieter power tools than a first-gen 360. That's not exactly shooting for the moon there.

    From how far away is that 40dB measurement being taken? That makes the difference between "gaming capable HTPC" and "banned from the living room."
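The commenter's question matters more than it might seem: under the idealized free-field assumption, sound pressure level falls off about 6 dB per doubling of distance from a point source, so the same PC can read very differently at 0.5 m versus a 2 m couch. A rough sketch of that conversion (the distances below are illustrative, not from the review, and real rooms with reflections will measure louder):

```python
import math

def spl_at_distance(spl_ref: float, d_ref: float, d_new: float) -> float:
    """Estimate SPL (dB) at d_new given a reading spl_ref taken at d_ref.

    Assumes an idealized point source in a free field (inverse-square law,
    roughly a 6 dB drop per doubling of distance).
    """
    return spl_ref - 20 * math.log10(d_new / d_ref)

# A 40 dB reading taken at 0.5 m works out to roughly 28 dB at 2 m:
print(round(spl_at_distance(40.0, 0.5, 2.0), 1))  # → 28.0
```

This is why noise figures without a stated measurement distance are hard to compare between reviews.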
  • haukionkannel - Friday, February 17, 2012 - link

    This is something like a paragon of "the best you can get" when thinking about next-generation consoles.
    The consoles are most probably even more crippled by power consumption, and this would be too expensive, so they would also require cheaper parts...
    Nice to see, when the Xbox 720 comes out, how it compares to this...
  • A5 - Saturday, February 18, 2012 - link

    Take this and replace the GPU with something with DX11.1 support and similar thermals (a 6850 with DX11.1 features added seems reasonable instead of a 7770), and you're probably in the ballpark.

    Good-looking console games come from the incredible amount of optimization possible due to a single hardware configuration, not from the power of the hardware.
  • A5 - Saturday, February 18, 2012 - link

    You'd also replace the CPU with some kind of PPC variant if the rumors are to be believed.
  • tipoo - Saturday, February 18, 2012 - link

    The first-revision 360 had a 200W maximum power draw; this has a 172W draw. I think they could do it, but I think Microsoft at least, and probably Sony too, will rethink the selling-at-a-loss strategy this round, as it took them a long time to recoup losses. There's a rumor the Nextbox will use a 6670-like card, but I think (and hope) that is false, as the original 360 dev kits used an old X800 graphics card before they finally came with the X1900-like chip in the 360.
  • Traciatim - Friday, February 17, 2012 - link

    It's really unfortunate that you couldn't have done the gaming benchmarks with the i3, i5, and i7 models to see how much of a difference each step makes in a variety of games.
  • Wolfpup - Friday, February 17, 2012 - link

    The answer is power gating, not switchable graphics. Until power gating improves, we need the GPU acting as a GPU.

    These articles keep acting like it's fine, and in practice, it's one person after another getting blue screens, driver weirdness, difficulty installing Nvidia or AMD's drivers, etc., that you just don't see on most systems without switchable graphics.

    Articles like this that keep promoting it have casual users trying to buy stuff confused, when you've got 10 people on a forum trying to talk them out of it.

    I'm used to Anandtech being dead on with everything, so this Optimus push of the last few years is BIZARRE.
  • TrackSmart - Friday, February 17, 2012 - link

    Switchable graphics makes a lot of sense for a mobile system, where an extra couple of watts of power draw can mean an extra hour or two of battery life. I'm already amazed at how little energy *very powerful* modern graphics cards use when idling. How much lower do you think they can realistically go? Until they can get within range of their mobile parts at idle, switchable graphics will continue to be a compelling feature for keeping laptops running longer.

    If you are talking specifically about desktop computers, then I agree that the benefits are minimal, aside from access to Quick Sync for the few people who would use it.
  • JarredWalton - Friday, February 17, 2012 - link

    "...in practice, it's one person after another getting blue screens, driver weirdness, difficulty installing Nvidia or AMD's drivers, etc., that you just don't see on most systems without switchable graphics..."

    I disagree. I've had very few BSODs, taking all of the Optimus laptops I've tested/used together over the past few years. I'm sure there are probably exceptions, but certainly within the last 18 months I've had no complaints that I can think of with Optimus on my personal laptops.

    I don't think Optimus fills a major need for a desktop, but posts like yours claiming that Optimus is essentially driver hell and problems are, in my experience, the rantings of someone who either had one bad experience or simply hasn't used it.

    But let's put it another way: what specific laptops have you used/tested with Optimus where there were clear problems with Optimus working properly, where drivers couldn't be updated, etc.?
  • TrackSmart - Friday, February 17, 2012 - link

    Gamers are the target audience, yet a marginally bigger case would have allowed for a more powerful GPU. Or a similarly powerful GPU for a lot less money. This is not a mobile system where every square cm of space counts, so why force the consumer to make such large compromises in price:performance?

    Obviously I'm not the target audience. Just like I will never own an "all in one" desktop computer that has the performance of a laptop. It just doesn't make sense unless you have absurd space limitations.
