While this has been a background issue ever since Android OEMs started releasing devices with display pixel densities above the 300-400 PPI "retina" range, recent events have sparked a broader discussion of whether the PPI race between Android OEMs is worth pursuing. Within this discussion, the key points of contention tend to center on the various tradeoffs that come with increasing resolution, and on whether an increase in pixels per inch (PPI) will actually produce a perceivable improvement.

If there is any single number that people point to for resolution, it is the 1 arcminute value that Apple uses to define a "Retina Display". This corresponds to around 300 PPI for a display held 10-12 inches from the eye, or about 60 pixels per degree (PPD). Pixels per degree is a way of accounting for both viewing distance and display resolution, which means the discussion here is not limited to smartphone displays; it applies universally to any type of display.

While 60 PPD is a generally reasonable value to work with, the complexity of the human eye and brain with regard to image perception makes any single number rather nebulous. For example, human vision can determine whether two lines are aligned extremely well, with a resolution of around two arcseconds. This translates into an effective 1800 PPD. For reference, a 5" display with a 2560x1440 resolution viewed at 12 inches would have only 123 PPD. Further muddying the waters, the theoretical ideal resolution of the eye is somewhere around 0.4 arcminutes, or 150 PPD. Finally, the minimum separable acuity, the smallest separation at which two lines can be perceived as distinct, is around 0.5 arcminutes under ideal laboratory conditions, or 120 PPD.

While these resolution values seem to contradict one another, the explanation is that the brain is responsible for interpreting the received image. Even when a difference in angular size is far below what the eye can unambiguously resolve, the brain is able to interpolate to accurately determine the position of the object in question. The brain is constantly processing vision in this way, illustrated most strikingly by cases such as using a flashlight to alter the shadowing of the blood vessels that sit on top of the retina. Such occlusions, along with the various optical aberrations and other defects present in the image formed on the retina, are processed away by the brain to present a clean image as the end result.
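The conversions among arcminutes, PPD, and PPI above all follow from simple small-angle geometry. A quick sketch of the math (the helper function names here are our own, purely for illustration):

```python
import math

def ppi_for_diagonal(width_px, height_px, diagonal_in):
    """Pixel density (PPI) from a display's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def ppd(ppi, distance_in):
    """Pixels per degree for a display viewed from `distance_in` inches.

    One degree of visual angle spans 2 * d * tan(0.5 deg) inches at the display.
    """
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# A "retina" 300 PPI panel at 12 inches is about 60 PPD:
print(round(ppd(300, 12)))                                 # 63

# The 5" 2560x1440 example from the text:
print(round(ppd(ppi_for_diagonal(2560, 1440, 5.0), 12)))   # 123
```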

Snellen chart used to test eyesight. The width of the lines determines the angle subtended.

While all of these resolution values are achievable by human vision, in practice they are highly unlikely. The Snellen eye test, the well-known chart of lines of high-contrast letters in progressively smaller sizes seen above, gives a reasonable value of around 1 arcminute (60 PPD) for adults and around 0.8 arcminutes (75 PPD) for children. It's also well worth noting that these tests are conducted under ideal conditions: high contrast and well-lit rooms.

After going through these possible resolutions, the most reasonable upper bound for human vision is the 0.5 arcminute value: while there is a clear increase in detail going from ~300 PPI to ~400 PPI in mobile displays, it is highly unlikely that any display manufacturer could mass-produce a relatively large display corresponding to 1800 PPD at 12 inches. At a distance of 12 inches from the eye, the 0.5 arcminute value works out to a pixel density of around 600 PPI. Of course, there would be no debate if the answer were that easy to reach. Realistically, humans only achieve a practical resolution of around 0.8 to 1 arcminute, so while getting to 600 PPI would mean near zero noticeable pixelation for the vast majority of edge cases, the returns diminish after passing the 1 arcminute point. For smartphones around 4.7 to 5 inches in diagonal, this effectively frames the argument around a few reasonable display resolutions, with PPI ranging from 300 to 600.

For both OLED and LCD displays, pushing higher pixel densities incurs a cost in the form of greater power consumption for a given luminance value. Going from around 330 PPI to 470 PPI on an IPS LCD increases the display's power draw by around 20%, which can be offset by a more efficient SoC, a larger battery, or improved RF subsystem power draw. Such increases can also be offset by improvements in panel technology, as has consistently been the case with Samsung's OLED development, but regardless of these improvements, power draw still rises compared to an equivalent-technology display with lower pixel density.
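The ~600 PPI figure can be checked with the same small-angle geometry: for a target angular resolution in arcminutes, the required pixel density at a given viewing distance works out as below. This is a sketch with our own helper name; the exact result for 0.5 arcminutes at 12 inches is closer to 573 PPI, which the text rounds up to ~600.

```python
import math

def ppi_for_arcmin(arcmin, distance_in):
    """PPI required for one pixel to subtend `arcmin` arcminutes
    when viewed from `distance_in` inches."""
    pixel_pitch_in = 2 * distance_in * math.tan(math.radians(arcmin / 60.0 / 2))
    return 1.0 / pixel_pitch_in

print(round(ppi_for_arcmin(1.0, 12)))  # 286 -- the ~300 PPI "retina" threshold
print(round(ppi_for_arcmin(0.5, 12)))  # 573 -- the ~600 PPI upper bound
```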
In the case of LCDs, a stronger backlight must be used because at higher pixel densities the transistors around each liquid crystal cell take up a larger proportion of the display area. The same is true of OLED panels, except there the issue is that smaller areas of organic phosphor must be driven at higher voltages to maintain the same luminance. An example of this can be seen in the photos below of an LCD display and its transistors, with the second photo front-lit to illuminate the TFTs.

Example photo of the TFTs in a display, contrasted with the luminous area

Thus, there are multiple tradeoffs that come with increased resolution. While getting closer to the 0.5 arcminute value means getting closer to nearly imperceptible pixels, there is a loss in power efficiency and, by the same token, a loss in peak luminance for a given level of power consumption, which implies reduced outdoor visibility if an OEM artificially clamps the upper bound of display brightness. A focus on resolution also means the increased cost of producing higher-resolution displays may be recouped elsewhere, since it's much harder to market lower reflectance, higher color accuracy, and other aspects of display performance that require a more nuanced understanding of the underlying technology. Higher resolution also places a greater processing load on the SoC: UI fluidity can suffer greatly with an insufficient GPU, and a greater need to lean on the GPU for drawing operations can also reduce battery life in the long run.

Of course, reaching 120 PPD may be completely doable with little sacrifice in any other aspect of a device, but the closer OEMs get to that value, the less likely it is that anyone will be able to distinguish a change in resolution between a higher pixel density and lower pixel density display, and diminishing returns definitely set in after the 60 PPD point. The real question is what point between 60 and 120 PPD is the right place to stop. Current 1080p smartphones are at the 90-100 PPD mark, and it seems likely that staying at that mark could be the right compromise to make.
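The 90-100 PPD claim for current 1080p phones checks out for a typical 5-inch panel viewed at 12 inches (same small-angle geometry as earlier in the article, repeated here so the snippet stands alone):

```python
import math

# A typical 5" 1920x1080 panel viewed at 12 inches:
ppi = math.hypot(1920, 1080) / 5.0                  # ~441 PPI
ppd = ppi * 2 * 12 * math.tan(math.radians(0.5))    # inches-per-degree scaling
print(round(ppd))                                   # 92
```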

An example of a display with RGB stripe, 468 PPI.

But all of this assumes that the display uses an RGB stripe, as seen above. Samsung's various alternative subpixel layouts exist to deal with idiosyncrasies of the organic phosphors, chiefly the uneven aging of the red, green, and blue subpixels: blue ages the fastest, followed by green and red. This is most obvious on well-used demo units, where extensive runtime shows how dramatically the white point drops once a display has been used for the equivalent of a smartphone's service lifetime. For an RGBG pixel layout as seen below, a theoretical 2560x1440 display with a 5-inch diagonal would give only 415.4 subpixels per inch (SPPI) for the red and blue subpixels; only the green subpixels would actually reach the full 587 SPPI. While the higher number of green subpixels is a way of hiding the lower resolution, exploiting the human eye's greater sensitivity to wavelengths corresponding to green, it is without question still possible to notice the different subpixel pattern, and the edges of high-contrast detail are where such issues are most visible. Therefore, in order to reach the 587 SPPI mark for the red and blue subpixels, a resolution of around 3616x2034 would be needed, which works out to roughly 830 PPI. Clearly, at such great resolutions, achieving the necessary SPPI with RGBG pixel layouts would be effectively untenable with any SoC launching in 2014, possibly even 2015.
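The RGBG subpixel arithmetic is just a factor of sqrt(2), since red and blue subpixels occur at half the pixel count in that layout, so their linear density falls by sqrt(2). A sketch (helper names are ours):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def rgbg_subpixel_ppi(full_ppi):
    """Red and blue subpixels occur at half the pixel count in an RGBG
    layout, so their linear density falls by a factor of sqrt(2)."""
    return full_ppi / math.sqrt(2)

full = ppi(2560, 1440, 5.0)                        # ~587 PPI
print(round(rgbg_subpixel_ppi(full), 1))           # 415.4 SPPI for red/blue

# To bring red/blue back up to ~587 SPPI, scale both axes by sqrt(2):
print(round(2560 * math.sqrt(2)), round(1440 * math.sqrt(2)))  # 3620 2036
print(round(ppi(3616, 2034, 5.0)))                 # 830
```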

Example of a non-RGB layout. Note the larger area blue pixels due to their square shape.

While pushing PPD as far as possible makes sense for applications where power is no object, the mobile space is strongly driven by the need to balance performance and power efficiency, and since the display is the single largest consumer of battery in any smartphone, it is the most obvious place to look for battery life gains. While 1440p will undoubtedly make sense for certain cases, it seems hard to justify such a high resolution within the confines of a phone, and that's before 4K displays enter the equation. While no one can really say that reaching 600 PPI is purely for the sake of marketing, going any further is almost guaranteed to be.

Source: Capability of the Human Visual System




  • BMNify - Monday, February 10, 2014 - link

    Joshua, why didn't you mention the fact that UHD-1 and later UHD-2 are in fact about more than just pixel density? It's about providing the consumer with the real Rec. 2020 10-bit color space rather than the antiquated 8-bit pseudocolor of today's Rec. 709 (HDTV and below) panels.
    "Rec. 2020 allows for RGB and YCbCr signal formats with 4:4:4, 4:2:2, and 4:2:0 chroma subsampling.[1] Rec. 2020 specifies that if a luma (Y') signal is made that it uses the R’G’B’ coefficients 0.2627 for red, 0.6780 for green, and 0.0593 for blue.[1]"

    As regards using more power, that's what the latest quantum dot displays are for: light is supplied on demand, which enables new, more efficient displays and allows for mobile devices with longer battery lives.

    "Colloidal quantum dots irradiated with a UV light. Different sized quantum dots emit different color light due to quantum confinement."
  • JoshHo - Monday, February 10, 2014 - link

    That's not so much about pixel density as it is about 24-bit vs 30-bit color. I'm not too well educated on the HDTV specs, but sRGB is 24-bit and is the industry standard, while Adobe RGB is 30-bit but is effectively limited to content creation purposes, as only applications like Photoshop are really color space aware.
  • JoshHo - Monday, February 10, 2014 - link

    I think I conflated color depth with color space, so my foot is currently firmly in my mouth; they are independent of each other.

    But for the most part, sRGB is the standard, and within that limited gamut, 8-bit color is the standard. In the future 10-bit color or greater will happen, but it doesn't seem likely for now.
  • BMNify - Monday, February 10, 2014 - link

    :) that's ok, just remember to always ask and answer this question in all future reviews: "is this [UHD] panel International Telecommunication Union (ITU) Rec. 2020 10-bit real color space ready out of the box?"
  • JoshHo - Monday, February 10, 2014 - link

    For sure, but I think that mostly applies to TV applications. For smartphone and desktop displays, it seems that sRGB will remain the standard, although there may be some movement towards wider color gamuts some day.
  • BMNify - Monday, February 10, 2014 - link

    oh and just to save looking it up
    "In coverage of the CIE 1931 color space
    the Rec. 2020 color space covers 75.8%,
    the digital cinema reference projector color space covers 53.6%,
    the Adobe RGB color space covers 52.1%,
    and the Rec. 709 color space covers 35.9%"

    "The NHK [and the BBC] measured contrast sensitivity for the Rec. 2020 color space using Barten's equation which had previously been used to determine the bit depth for digital cinema.
    11-bits per sample for the Rec. 2020 color space is below the visual modulation threshold, the ability to discern a one value difference in luminance, for the entire luminance range.
    The NHK [and the BBC] is planning for their UHDTV system, Super Hi-Vision [UHD-2] , to use 12-bits per sample RGB"
  • MrSpadge - Monday, February 10, 2014 - link

    Higher resolution always reduces power efficiency of the system. As the article says: if there are other improvements to power efficiency to offset this loss, they could always be used to reduce power at lower resolutions. Our choice.

    Regarding quantum dots: it's hard to pump them electrically; quantum wells are much better at this. But all of them pretty much require expensive III-V or II-VI compound semiconductors (like LEDs and many lasers), which doesn't bode well for large-area applications. That's why they're going for OLEDs instead.

    And about pumping them optically: well, you don't want to have to put on sun screen in order to look at your phone for longer than a few minutes, do you? Anyway, UV light sources are neither cheap nor efficient. A cure worse than the disease, so to say.
  • victorson - Monday, February 10, 2014 - link

    Good to see some deeper analysis on display resolution, and congratulations for this well-written article. The Snellen eye test is probably a good enough measure for a Westerner, but people tend to forget there are other languages out there. Namely, Chinese and Japanese (and I would guess the Arabic languages) readers are the ones who benefit the most from higher pixel densities, as a complex Chinese character can be an extremely complicated drawing that fits in the space of almost a single letter. My guess would be that pixel densities at or above 500 PPI would actually make for a tangible improvement in the reading experience, but it'd be interesting to see more research on this.
  • jerrylzy - Monday, February 10, 2014 - link

    While I am interested in what a 600-PPI display would look like, I tend to favor a phone with lower resolution and PPI. Since the difference may not be very noticeable, I put more weight on battery life and overall smoothness, both of which a higher resolution display would definitely impact.
  • piroroadkill - Tuesday, February 11, 2014 - link

    I agree. I'm loving the fact the Moto X came with a 1280x720 screen, and the Xperia Z1 Compact too.

    Let's bring some goddamn sanity back into things.

    Lower cost and higher battery life is what I care about over jamming my retinas against the glass and complaining if I can see a discernible element.
