Idle Power

In all of these tests you're going to see three charts. The first shows total platform power, measured at the battery, taking into account everything from the SoC to the display. The second shows power measured at the CPU power delivery circuit, and the third shows power measured at the GPU power delivery circuit. All values are in watts and are reported at 15ms intervals (I sampled at 1kHz, then averaged down to 15ms).
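To make the methodology concrete, here's a minimal sketch of the downsampling step: reducing 1kHz samples (one reading per millisecond) to the 15ms averaged intervals shown in the charts. The function name and the data are illustrative, not from the actual test harness.

```python
def downsample(samples_mw, window=15):
    """Average consecutive `window` samples (1 kHz -> 15 ms intervals).

    Any incomplete window at the tail is dropped.
    """
    n = len(samples_mw) // window
    return [
        sum(samples_mw[i * window:(i + 1) * window]) / window
        for i in range(n)
    ]

# 45 ms of made-up 1 kHz readings: 30 ms at 100 mW, then 15 ms at 160 mW
raw = [100.0] * 30 + [160.0] * 15
print(downsample(raw))  # -> [100.0, 100.0, 160.0]
```

Averaging rather than decimating preserves the energy information in each 15ms window, so short spikes still contribute to the plotted value instead of being dropped.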

For our first set of tests I simply wanted to get a feel for idle power. Both systems had all background syncing suspended and WiFi connected, and we sat at the Windows RT/8 Start Screen until the tablets reached a truly idle state. Note that idle under Windows RT/8 technically doesn't happen until the live tiles stop updating, which you'll see denoted by a drop in idle power consumption in the graphs below.

First up is total platform power consumption:

Surface RT draws more power at idle, around 28% more on average, than Acer's W510. The last half of the graph shows the tablets hitting true idle as the live tiles stop animating.
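A figure like that ~28% average delta falls out of a simple comparison of mean power over the truly-idle region of each trace. A hedged sketch, using made-up readings purely to illustrate the arithmetic:

```python
def pct_higher(a_mw, b_mw):
    """Percentage by which the mean of `a_mw` exceeds the mean of `b_mw`."""
    mean_a = sum(a_mw) / len(a_mw)
    mean_b = sum(b_mw) / len(b_mw)
    return (mean_a / mean_b - 1.0) * 100.0

# Hypothetical idle platform power samples in mW (not actual test data)
surface_idle = [1280.0, 1290.0, 1270.0]
w510_idle = [1000.0, 1005.0, 995.0]
print(round(pct_higher(surface_idle, w510_idle), 1))  # -> 28.0
```

Comparing means over the same window keeps the number robust against the brief live-tile activity spikes visible earlier in the traces.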

A look at the CPU chart gives us some more granularity, with Tegra 3 ramping up to higher peak power consumption during all of the periods of activity. Here the Atom Z2760 cores average 36.4mW at idle compared to 70.2mW for Tegra 3.

The GPU-specific data is pretty interesting: Tegra 3's GPU power rail shows much higher power consumption than the Z2760's. As I didn't design Tegra 3, I don't know what else is powered by this rail, although you'd assume that anything not in use would be power gated. Imagination Technologies' PowerVR SGX 545 in the Z2760 does appear to be quite power efficient here, averaging 155mW while rendering the Start Screen.

I wasn't happy with the peaks we were seeing when nothing was happening on the systems, so to confirm that nothing funny was going on I threw both tablets into airplane mode and waited for full idle. Check out the tail end of the platform power diagram:


That's much better. Without the AP talking to each tablet's WiFi radio constantly, idle becomes truly idle. If you're curious, the power savings are around 47.8mW (average) for the W510 in airplane mode when fully idle.

The GPU rail feeding the Atom Z2760 appears to hit a lower idle power when compared to NVIDIA's Tegra 3. Advantages in idle power consumption are key to delivering good battery life overall.


  • mrdude - Friday, December 28, 2012 - link

    On the same Clover Trail Atom that has trouble keeping a fluid scrolling motion in Metro? The same Atom that's only 32-bit with 2GB RAM on the devices? The same Atom that can't run your legacy productivity software any better than the old netbooks?

    The x86 legacy requirements are going to be significantly higher than the ARM parts due to the software the x86 chips are required to run. If you can't run CAD or Photoshop or Blender, or any other useful productivity application on your tablet/notebook, then you likely don't need x86 in the first place. All of the other applications that don't require that much horsepower already have alternatives, and often better alternatives, in the iOS and Android app stores.

    If I can't run the same games and demanding software on an x86 tablet, then do I really need an x86 tablet?

    That's the dilemma Intel and Microsoft both face. Currently, the sales figures of these x86 tablets are less than 1% of all Win8 devices thus far sold. People aren't going to pay a higher price just because it says Microsoft or Intel on it. Given the robustness of Google Play and iOS App stores and the market penetration of those respective devices, the majority of the same crowd that's buying tablets is likely to be just as familiar with Android and iOS as they are with Windows. In fact, perhaps even more so given the dramatic UI changes to Win8. If people need Office they won't have to wait long because Microsoft announced it was offering an Android and iOS version of Microsoft Office in the Spring of 2013.

    x86 compatibility looks great until you realize just how much you're missing out on with respect to the Android and iOS app stores. And if I'm going to buy a tablet for work then I'm sure as hell going to demand that it actually has enough power to run my software. Otherwise what's the point? A weak x86 tablet that can't run productivity software and games is just a tablet that's missing out on both ends.
  • CeriseCogburn - Friday, January 25, 2013 - link

    Worse than that, who wants to switch to the awful faded out rectal-boxed sick pastels of msft 8 with their crappy harder to identify the "icon box" from heck UI?
  • pugster - Thursday, December 27, 2012 - link

Intel probably got their best CPU against ARM's worst CPU, in an operating system that might not be optimized for ARM. I personally would like to see Intel start outfitting these CPUs in Android tablets, against more optimized ARM CPUs like the Cortex A7/A15.
  • ddriver - Friday, December 28, 2012 - link

I bet Intel ordered and paid for this article, potentially providing guidelines on how, and against which product, to test.

    It is shameful for Intel to miss out on mobile platforms. Surely, low profit margin is not Intel's territory to begin with, but still, the rise of popularity of ARM devices represents a substantial blow to the reputation and monopoly of Intel - the world has seen there is more to the market than Intel (and poor perpetual runner up AMD). It is a long term threat, as indirect as it may be.

    A15 is pretty much here and a dual A15 is about twice as fast as a quad A9. The Atom is competitive to current aging ARM designs, but it will slip back as soon as A15 designs flood the market.

    x86 and CISC in general are something that should have died out a long time ago, but its life was artificially prolonged, because no one in the industry really gives a damn about efficiency. Efficiency is the enemy of profit, and profit is the sole motivation for everything being done nowadays.

    Don't get me wrong, the overheads of CISC are insignificant, and Intel will probably be able to make up for it with better in-house manufacturing process. And with such a hefty profit margin on their middle to high end, they can afford to give low power CPUs for free to big players, just to kill a potential long term threat to their business. It won't be the first time Intel will pay to cripple its competition.

x86 is crippled by licensing, while anyone is free to license ARM and do its own spin - big companies like Samsung have the capacity to design and manufacture an ARM based CPU according to their needs, which is something Intel won't be bothered with. Selling entire early batches exclusively to Apple to put in their useless toys first is an entirely different matter from doing customizations to the actual hardware and production lines just to please a client.

    The sooner the world moves away from x86 the sooner we will see some actual competition, instead of this MONOPOLY in the disguise of a duopoly we have today. I do run an Intel rig, because I need the performance, but I am no fan of Intel, or of the amount of money I had to pay for it because Intel has no competition and does whatever it wants. I'd happily watch the company burn to the ground, the world doesn't need a greedy monopolistic entity - it is that entity that needs the world to suck the wealth out of it.
  • nofumble62 - Saturday, December 29, 2012 - link

    skip that software incompatibility nonsense.
  • ddriver - Saturday, December 29, 2012 - link

Due to the horrendous (at least compared to ARM cores) power consumption, this is only possible with a significantly bigger battery, which would make the tablet experience almost the same as if it were a stone tablet.

    Software incompatibility - you can thank Microsoft for this, they have pushed their vendor and platform limited development tools for so long. That is why I ditched any MS related technology, I still use windows, but I don't code against the MS APIs, instead I use Qt - the same code runs on Windows, Mac, Linux and very soon iOS and Android will be officially supported too (right now those are only unofficially supported).

    Big companies like Adobe and Autodesk have already embraced Qt in their workflow, but it will still take some time to shake off the MS legacy crap from their code base.

    Sure, you can go for something lame and inefficient like Java or HTML+JavaScript, but nothing beats the performance and memory efficiency of native machine code.
  • Pazz - Saturday, December 29, 2012 - link

    The fundamental point which should be highlighted throughout this entire analysis is that Clovertrail is significantly newer tech than Tegra 3.

    The Tegra 3 inside the MS Surface had availability as soon as Q4 2011.

Microsoft's choice to implement the Tegra 3 SoC was a matter of timing. It was the best option available to the Surface team at the time. The extensive development of a new product, particularly one as important as Surface given the current market and the timing with Win8, always tends to result in older tech being included. More time is invested in other non-SoC specific areas.
  • theSuede - Sunday, December 30, 2012 - link

    A very well executed run-through, but:
Wouldn't it be possible to do a floating average over five measurement samples or something? The charts are close to unreadable, and the graphical presentation fools the eye into averaging sample points incorrectly.
Spikes in graphs are only averageable by human vision in point charts; in line charts the eye puts far too much weight on local deviations.

    The same goes for most storage/HD performance graphs at AnandTech. Just my 5c on statistics presentation interpretation...
  • casper.bang - Tuesday, January 1, 2013 - link

I'm a little confused about what this article is trying to tell us; it looks to me as if Anand is comparing a next-gen Intel Atom with a current-gen ARM A9 and, by doing so, arrives at the *shocking* conclusion that the former is more performance/power efficient... and adds a sensational headline to top it off?!

I honestly expected a bit more; say, a focus on Cortex A15 vs. Haswell. I'm also surprised at the lack of discussion about how Intel is possibly lowering wattage, considering they are infamous for solving problems in the past decade by throwing on clunky, power-hungry caches.

Clearly Intel has some brilliant engineers and world-class manufacturing, but should we not wait to compare apples to apples before declaring that "x86's high power consumption is a myth"? Come again when routers, picture frames, media centers, tablets, phones, etc. based on x86 are mainstream rather than a wet dream in Otellini's head.
  • Kidster3001 - Friday, January 4, 2013 - link

    You mean the 5 year old Clovertrail core design... designed before Tegra3? THAT next-gen Atom?
