Battery Life

At this point, it goes without saying that battery life can make or break the experience of a smartphone. The anxiety of running out of battery is one of the worst parts of using one, which is why good battery life matters so much. In theory, a phone should never run out of battery in a single day no matter the use case, but battery life is a complex issue to address. It’s common to see people assume that battery capacity and battery life are closely correlated, but this completely ignores total system power draw. Last year, one of the best examples of this was the One M8 compared against the Galaxy S5: despite its smaller battery, the One M8 held a slight edge in battery life.

The Galaxy S6 and S6 edge are in a similarly peculiar situation. For the past few years, battery capacity has reliably increased from one generation to the next, but for the first time Samsung has gone backwards in this regard. The Galaxy S6 has a 2550 mAh, 3.85V battery, which is 91% of the Galaxy S5’s capacity. If we looked at this metric alone, it would be trivial to write off the Galaxy S6 as worse than the S5 in battery life. As previously mentioned, though, this is a simplistic view of the situation and ignores the other half of the battery life equation.
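For reference, the capacity math works out as follows. This is a quick sketch: the Galaxy S5’s 2800 mAh figure comes from its spec sheet, and we’re assuming the same 3.85V nominal cell voltage for both packs.

```python
# Battery energy comparison: Galaxy S6 vs. Galaxy S5.
# Energy (Wh) = capacity (Ah) x nominal voltage (V).
s6_wh = 2.550 * 3.85  # ~9.82 Wh
s5_wh = 2.800 * 3.85  # ~10.78 Wh, assuming the same nominal voltage

print(f"S6: {s6_wh:.2f} Wh, S5: {s5_wh:.2f} Wh")
print(f"Capacity ratio: {s6_wh / s5_wh:.0%}")  # ~91%, the figure cited above
```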

As a result, we must test battery life holistically, which is far from trivial. One of the first problems in testing battery life is display brightness, which can significantly affect results. Although it’s common to standardize at 50% brightness, that setting can be as low as 80 nits or as high as 250 nits depending upon the brightness curve the OEM sets. In order to avoid this pitfall, we test battery life with the display set to 200 nits when displaying a white image. In addition, it’s necessary to have tests that cover the full curve of performance and power, ranging from a display-bound web browsing use case to sustained, intense CPU and GPU loads.

As with most reviews, our first battery life test is the standard web browsing workload, which loads a set of webpages at a fixed interval, with sufficient time between each page load to ensure that the modem and SoC can reach an idle state. This ensures that faster SoCs aren’t penalized for finishing their work sooner. The test doesn’t exactly match real-world browsing patterns, but it gives a good idea of relative efficiency at a constant level of performance.
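Although our harness isn’t public, the structure of such a rundown test is simple to sketch. In the outline below, load_page and battery_percent are hypothetical device-control helpers, and the page set and interval are placeholders:

```python
# Minimal sketch of a fixed-interval web browsing rundown test.
import time

PAGES = ["https://example.com/a", "https://example.com/b"]  # placeholder page set
INTERVAL_S = 60  # long enough for the SoC and modem to return to idle

def run_rundown(load_page, battery_percent, cutoff=1):
    """Cycle pages until the battery gauge hits the cutoff."""
    while battery_percent() > cutoff:
        for url in PAGES:
            start = time.monotonic()
            load_page(url)  # the load itself finishes quickly on a fast SoC...
            # ...and the device idles for the rest of the interval, so fast and
            # slow SoCs do the same total work per hour.
            time.sleep(max(0.0, INTERVAL_S - (time.monotonic() - start)))
```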

Web Browsing Battery Life (WiFi)

In web browsing, the Galaxy S6 manages to keep battery life on WiFi at approximately the same level as the Galaxy S5. It’s likely that a combination of the newer Broadcom BCM4358, the upgraded AMOLED display, and the Exynos 7420 helped to keep power consumption relatively constant here, which, given the smaller battery, represents a 10-15% overall power efficiency increase in this test. It’s likely that we’re mostly looking at differences in display efficiency when comparing the 1440p panel of the S6 to the 1080p panel of the S5. It’s definitely impressive that Samsung has pulled this off, but I do wonder what the result would be if Samsung had stayed at 1080p.
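That efficiency figure falls out of the capacity math from earlier: roughly equal runtime on 91% of the energy implies about a 10% gain, and slightly longer runtime pushes it toward 15%. A quick check:

```python
# If runtime is unchanged on ~91% of the energy, efficiency (runtime per Wh)
# must have improved by roughly the inverse of the capacity ratio.
capacity_ratio = 2550 / 2800  # S6 vs. S5, ~0.91
runtime_ratio = 1.0           # approximately equal runtimes in this test

efficiency_gain = runtime_ratio / capacity_ratio - 1
print(f"Implied efficiency gain: {efficiency_gain:.0%}")  # ~10% at equal runtime
```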

Web Browsing Battery Life (4G LTE)

On LTE, we see a pattern that seems to generally mirror devices like the iPhone 6 with an external MDM9x25 Gobi modem. The Shannon 333 modem and Samsung’s other RF front-end components seem to be competitive with Qualcomm’s implementations, but given just how close WiFi and LTE battery life were in the Snapdragon 801 generation, I suspect Qualcomm still holds an edge in average RF system power. The difference isn’t massive here, and it could simply be the difference between an external and an integrated modem, but we’ll have to do a deeper investigation on power to be sure.

While web browsing is one of the crucial use cases, battery life can look quite different at other points on the power/performance curve. In order to get a better idea of battery life in less display-bound use cases, we’ll look at PCMark’s Work Battery Life test. Unlike our web browsing test, it isn’t a fixed workload per unit time, but it avoids strongly emphasizing display power, runs at more mixed APL scenarios, and tends to be considerably more CPU and GPU intensive.

PCMark - Work Battery Life

In this test, the Galaxy S6’s runtime is pretty close to the One M8 and One M9, but the major point of differentiation is that its performance score throughout the test is significantly higher. It’s also important to note that the reported battery temperature stays much lower on the Galaxy S6 than on the One M9, which means the SoC stayed in a more efficient mode of operation throughout the test; power consumption rises with die temperature because semiconductor leakage current increases as the silicon heats up.
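To illustrate why temperature matters, the toy model below assumes leakage power roughly doubles every 20C, a common rule of thumb; the constants are illustrative, not measured values for either SoC.

```python
# Toy model: leakage power grows roughly exponentially with die temperature.
def leakage_power(temp_c, p_leak_25c=0.3, doubling_deg=20.0):
    """Watts of leakage, assuming it doubles every `doubling_deg` degrees C."""
    return p_leak_25c * 2 ** ((temp_c - 25.0) / doubling_deg)

for t in (35, 45, 55, 65):
    print(f"{t} C: ~{leakage_power(t):.2f} W of leakage")
# A SoC held at 45 C leaks half as much power as one at 65 C, leaving more of
# the thermal budget for useful work, which is what the score gap reflects.
```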

Now that we have a good idea of battery life in a display-bound and balanced workload, we can now look at SoC-bound workloads which include GFXBench and Basemark OS II. These tests are almost always limited by how much heat the case of the phone can carry away, and can often reveal weaknesses in how OEMs control throttling if a phone or tablet fails to complete either test. We’ll start by looking at GFXBench, which strongly stresses the GPU similar to an intense 3D game.

GFXBench 3.0 Battery Life

GFXBench 3.0 Performance Degradation

The Galaxy S6 ends up performing at around the same level as the One M9 in terms of overall runtime, and the sustained frame rate ends up relatively similar as well. However, it’s critical to add context here, as the Galaxy S6 is running the test at 1440p while the One M9 is rendering at 1080p. This means the Mali T760 of the Galaxy S6 is sustaining a higher level of performance than the Adreno 430 of the One M9 in this workload, even if that performance is “wasted” on rendering more pixels per frame. The one major issue visible in the FPS vs. time graph is that Samsung continues to struggle with graceful throttling: the GPU always targets maximum performance, so the frame rate rises and falls sharply as the thermal governor cycles the GPU between high and low clock speeds.
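The sawtooth behavior is easy to reproduce with a toy thermal model. The sketch below contrasts a greedy governor that always requests peak clocks with one that settles at a sustainable clock; all of the constants are illustrative:

```python
# Toy simulation of two GPU throttling strategies under a fixed thermal budget.
def simulate(sustainable=0.6, hi=1.0, lo=0.3, steps=12, greedy=True):
    temp, freqs = 0.0, []
    freq = hi
    for _ in range(steps):
        temp += freq - sustainable         # heat added minus heat the chassis sheds
        if greedy:
            freq = lo if temp > 0 else hi  # slam between extremes at the limit
        else:
            freq = sustainable             # settle at what the chassis can shed
        freqs.append(round(freq, 2))
    return freqs

print("greedy:  ", simulate(greedy=True))   # oscillates, like the S6's FPS curve
print("graceful:", simulate(greedy=False))  # flat, with steadier frame pacing
```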

BaseMark OS II Battery Life

BaseMark OS II Battery Score

The final battery life test is Basemark OS II’s sustained CPU load test. Although it appears that the Galaxy S6 is comparable to the One M9 in this test, logging CPU frequencies over time reveals that the Exynos 7420 manages to keep the A57 cluster online throughout the course of the test at around 1.2 GHz, while the One M9 is forced to shut off the A57 cluster completely as the phone reaches skin temperature limits. Although both are kept at similar levels of normalized CPU load and run through the test for similar amounts of time, the Galaxy S6 manages to keep the CPU at a significantly higher performance level throughout the test. In general, it’s likely that the Exynos 7420 will be able to sustain overdrive frequencies for longer periods of time due to the massive process node advantages that come with Samsung’s 14LPE process.
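For reference, this kind of frequency logging can be done over adb on most devices. A rough sketch follows; cpu4 is typically the first A57 core on a 4+4 big.LITTLE part, but the exact sysfs path is an assumption that varies by kernel:

```python
# Sketch: poll big-cluster frequency during a rundown test via adb.
import subprocess, time

FREQ_NODE = "/sys/devices/system/cpu/cpu4/cpufreq/scaling_cur_freq"

def read_big_freq_khz():
    out = subprocess.run(["adb", "shell", "cat", FREQ_NODE],
                         capture_output=True, text=True)
    return int(out.stdout.strip())

while True:
    try:
        print(time.time(), read_big_freq_khz())
    except ValueError:
        # The node reads as empty/invalid when the cluster is hotplugged
        # offline, e.g. at a skin temperature limit as on the One M9.
        print(time.time(), "big cluster offline")
    time.sleep(5)
```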

Charge Time

Much of the discourse around battery life centers on time away from the charger. We can talk about how many hours of screen-on time or total time a phone can run on a battery, but charging time is often just as critical to maintaining mobility. Removable batteries might help with this problem, but if it’s easy to forget to charge a phone overnight, it’s just as easy to forget to charge a spare battery. Charge rate is crucial for this reason, which is why we attempt to test it: we measure the time it takes to charge from a fully discharged battery to 100%, either measured at the wall or as indicated by the charging LED. The Galaxy S6 retains the same fast charge protocol as the Note 4, which appears to be Quick Charge 2.0, as the AC adapter negotiates fast charging with phones like the LG G Flex 2 and One M8.
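A charge curve can also be logged in software once the phone has enough charge to boot. The sketch below polls the standard `dumpsys battery` fuel gauge output over adb; it’s a rough alternative to measuring at the wall, and the one-minute polling cadence is arbitrary:

```python
# Sketch: log time-to-full by polling the Android fuel gauge over adb.
import re, subprocess, time

def battery_level():
    out = subprocess.run(["adb", "shell", "dumpsys", "battery"],
                         capture_output=True, text=True).stdout
    return int(re.search(r"level: (\d+)", out).group(1))

start = time.time()
while battery_level() < 100:
    time.sleep(60)  # arbitrary polling interval
print(f"Charge time: {(time.time() - start) / 60:.0f} minutes")
```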

Charge Time

When using the included USB charger, the Galaxy S6 charges incredibly quickly. However, the wireless charger is noticeably slower than the wired charger, due to the inefficiencies associated with wireless charging and its rather limited charge rate, which appears to be 1.5 amps at 5 volts. It’s a bit surprising that there’s no option to disable fast charging the way there was with the Note 4, given that the battery is now sealed and rather difficult to replace, but I suspect most won’t notice a difference in battery lifetime unless the phone is used for more than 2-3 years.

Comments

  • Andrei Frumusanu - Friday, April 17, 2015 - link

    Math is hard, corrected, thank you.
  • Arbie - Friday, April 17, 2015 - link

    SD Police here: no microSD, no sale. The reasons have been hashed over endlessly but I know what I want.
  • mayankleoboy1 - Friday, April 17, 2015 - link

    I'm very puzzled by the large differences between the stock browser and Chrome. They are both based on the Blink engine and use V8 for JavaScript execution.
    This definitely points to "optimizations" done in the stock browser for these benchmarks.
    Could you do some other benchmarks on the phones?
  • JoshHo - Friday, April 17, 2015 - link

    Basemark OS II and PCMark use the internal WebView engine and the 7420 doesn't do nearly as poorly in those browsing benchmarks as it does on Chrome.

    It's likely that Samsung Mobile has some work to do when it comes to optimizing against Chrome.
  • lilmoe - Saturday, April 18, 2015 - link

    Or, it's the other way around. Google needs to do a LOT of work optimizing Chrome for the various hardware out there, especially the most popular ones.
    Chrome isn't getting the highest marks in optimization, you know, especially on the desktop. I thought that was a well known and understood issue?
  • Bob-o - Saturday, April 18, 2015 - link

    Can someone explain to me why an application needs to be optimized for certain hardware? Isn't it just using libraries for rendering (OGL, whatever), and those libs have already been optimized for the GPU? And the non-rendering part of the app should be byte-compiled appropriately?

    Back in the 1980's I used to optimize apps for certain hardware. . . in assembly code. What are they doing these days? And why is it necessary? Poor abstractions?
  • mayankleoboy1 - Sunday, April 19, 2015 - link

    These optimizations are not for specific hardware, but for the specific BENCHMARK. They can easily tweak parameters inside the JavaScript engine to give a higher score on specific benchmarks like Octane and Kraken. These optimizations would negatively affect common web JS workloads, but would give a higher benchmark score.
    Google/Mozilla wouldn't do such shenanigans, as they don't prioritize for a specific benchmark unless it also improves general JS workloads.
  • bji - Friday, April 17, 2015 - link

    I have a big problem with the way their camera module juts out from the back of the device. I have a Galaxy S5 Active (my first smartphone) and the camera broke within about 2 months of ownership. I believe it's because it juts out and is a focal point of stresses as a result (pressure while in pocket, pressure when laid on a flat surface, etc), and the very weak glass they use to cover the lens is subject to breaking. I've read many comments from others that this happened to them, and it happened to me. Now the camera is useless.

    I could put a big ugly case on the thing to protect the camera, sure, but that's why I bought the Active - because I didn't want to put a case on my phone.

    I see that Samsung continues with this horrid camera module design. I won't be buying another Samsung with this characteristic.
  • name99 - Friday, April 17, 2015 - link

    I suspect Samsung would do well to copy Apple in one more respect --- making cases a big part of the user experience.
    Something that critics of the iPhone 6 (in particular of the "slippery sides" and "too much sacrificed for thinness") don't seem to get is that, IMHO, Apple sees cases as a significant part of the iPhone experience. Which is why they provide their own --- expensive but very nice --- high end cases, and are willing to accept the inevitable leaks we see from case makers in advance of new products.

    Once you accept that a case is part of the story
    - the thinness makes more sense, because you're going to be adding a few mm via the case
    - likewise the camera bulge, while less than ideal, is not such a big issue
    - likewise complaints about the fragility of glass backs, etc.
    Cases also allow for a dramatic level of customization without Apple having to stock a zillion SKUs. You could even argue that the aWatch band proliferation is Apple having learned from the size of the case market for iPhones and iPads, and arranging things so that they get the bulk of the high-end money that's available in this space.

    Every other phone manufacturer is in a much weaker position than Apple because they don't have the massive range of cases available. But they could at least try to improve the situation by providing their own cases --- maybe at least a high end leather model, a low-end plastic model, and an "I'm paranoid I'm going to drop my phone" model. They should also call out the cases during the big press reveal of each phone (like Apple does) and ship some cases along with each review unit (not sure if Apple does this, but they should).

    All of which makes the Edge, IMHO, even more of a gimmick (in spite of Samsung claiming they will no longer do gimmicks). You get a much more expensive manufacturing process to provide something whose real functionality could probably be provided with a few colored LEDs, and you dramatically reduce the design space available for cases.

    Oh well. Stay tuned for the next Samsung model which (don't tell me, let me guess) will feature as its big new feature a haptic (don't call it Taptic!) engine and which, with any luck, will manage to ship in at least one country before the iPhone 6S, so that Samsung can claim (and have the true believers accept) that this was their plan all along, that they were in no way influenced by Apple's obvious [based on aWatch and MacBook] next big UI element.
  • akdj - Sunday, April 26, 2015 - link

    Hi name99. Wish there was an up vote;). Well said. As an owner of the iPhone 6+ (& each iteration before it), I've 'finally' found the Apple iPhone case:). Lol. I bought some Platinum Incipio Pro kickstand crap, a really lame Spec case (& I love their laptop shells on my MBP) before I finally made a trip down to the Apple Store and picked up the simple, brown leather iPhone 'Apple' case (I don't remember it being expensive though, seems like 39, maybe 49 bucks? Seems like the standard pricing regardless of manufacturer out of the gate).
    I'm embarrassed to say since 2007, I've never had the Apple case. Always bought third party and typically Mophies starting with the iPhone 4/4s.

    Sorry, TL/dr -- not in defense of Android OEM lack of third party peripherals as its true but this last year, 18 months has changed some. The S-View case specific to the 's' and 'note' brands are pretty sweet. I use one on my Note 4 and like the Apple cam/case combo the S-Case also protects the camera protrusion while adding even more functionality. It's magnet sensing for turn on/off by open/close and the small maybe 2" x 2" 'S-View' (small window on front) allows answering of calls, quick text/tweet/FB/email/whatever-u-set-up response capability, notifications and time (customize faces and information on clock), weather and 'maps', settings, and more. It's slick and it's protective.

    But you're right. The Apple iOS cases kick ass. I own the 'smart' cases (not covers, they suck) on our iPads too. Be nice if they quit changing the dimensions ever so slightly each iteration ala iPhone. Usually get two generations of the iPhone outta one case. Single on an iPad. Oh well. Keep em longer too I suppose).

    Good to see another avid iOS user. I love both and have since the original 4GB, non subsidized $500 2G iPhone and the Xoom/S1 ...and to date I'm undecided. Don't play with the new Androids. They're very nice as well. It's too dangerous now with AT&T/Verizon, even Best Buy, etc. just pick what you want. The color. The capacity. No money down and NEXT fools ya. Before you know it, you've got iPads for everyone in the family. A pair of Nexus 7s you're trying to figure out what to do with, iPhones and Notes... Just 'try' the M9, or the G3/(4 coming?) what the heck, can't hurt. Before you know it you've got a dozen devices all accessing your data, exponentially increasing bandwidth used on wifi and LTE for updates and the ilk. And a $700 'phone' bill. Lol. Too cool.

    Doesn't matter which way you go, iPhone 5s/6/6+ or S5/Note4/G3/M8 or 9, Note 4 or this bad boy. They're ALL 'computers' in our pocket. Across the board faster and more energy efficient than computers we used last decade. The storage. The connectivity. The processing and RAM, controllers (micro); accelerometer, barometer, proximity and Bluetooth 4.1, wireless AC and 2x2 antenna arrangement ...without... an antenna (those of us in our forties, probably mid to late thirties remember those, right? ...other than the sweet 'bands' on my 6+;) of course hidden by the earlier-dissed Apple iPhone case. iPad cases. They're sweet. Kinda like their trackpads in comparison to EVERY other OEM. They work. All. The. Time. They NEVER don't. WTH can't Windows get an OEM partner to nail the trackpad? Perhaps that's why they decided on 'touch'? :-)
