Retina Display

The Retina Display used in Apple’s iPhone 4 is claimed to be the sharpest, most vibrant, highest-resolution smartphone display currently on the market. The pixels are just 78 micrometers wide and there are 614,400 (960×640) of them packed into a 3.5-inch In-Plane Switching (IPS) TFT LCD. The resolution is 326 pixels per inch (PPI). IPS display technology features wide viewing angles of up to almost 180 degrees with very low shifts in color and contrast at off angles. The high-resolution Retina Display used in the iPhone 4 makes text, photos and videos look significantly better than they do on previous iPhones, as well as on every other smartphone on the market today.
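As a quick sanity check, here is a minimal Python sketch that reproduces the arithmetic in the paragraph above: the total pixel count of a 960×640 panel and the PPI implied by a 78-micrometer pixel pitch. The variable names are mine, not Apple’s.

```python
# Reproduce the Retina Display arithmetic from the paragraph above.
# Assumption: PPI is derived directly from the quoted 78-micrometer pixel pitch.

PIXELS_WIDE, PIXELS_HIGH = 960, 640
PIXEL_PITCH_MM = 0.078          # 78 micrometers, as quoted
MM_PER_INCH = 25.4

total_pixels = PIXELS_WIDE * PIXELS_HIGH          # 614,400 pixels
ppi = MM_PER_INCH / PIXEL_PITCH_MM                # pixels that fit in one inch

print(f"total pixels: {total_pixels:,}")          # 614,400
print(f"pixels per inch: {ppi:.0f}")              # ~326 PPI
```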

INFERIORITY COMPLEX: Samsung disagrees and points to “more accurate colors,” of all things, according to Electronista. There are at least two reasons why Samsung opted to purchase Clairvoyante and gain its pioneering work on non-RGB sub-pixel structured displays called PenTile Matrix (read Nexus One PenTile Matrix OLED Display). First, OLED displays consume considerable power when the pixels are not turned off. Only when the pixels are off (black) or near black do OLED displays consume less power than LCDs. The brighter the pixels get, the more power they consume, and more power is consumed with every additional pixel in an OLED display. Second, it is extremely difficult (impossible?) to pack the same number of pixels into an equivalent-sized OLED display as LCD technology allows. Clairvoyante’s PenTile Matrix comes very close to completely solving both challenges. Please refer to a series of reports, produced in collaboration with DisplayMate, that took a very close look at the AMOLED display used in the Nexus One (read Display Showdown Part Ia: Nexus One). The result: the Nexus One exhibited colors that were blown out and terribly inaccurate. I wonder where Samsung got the idea that its PenTile Matrix OLED displays in smartphones were color accurate. For a truly accurate apples-to-apples comparison, I would like to challenge Samsung to volume-produce a 3.5-inch RGB sub-pixel structured AMOLED sporting a pixel format of 960×640. My guess is that it is an impossibility at the moment. Back to the Retina Display.
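To make the manufacturing argument concrete, here is a rough sketch of the subpixel math, assuming a conventional RGB-stripe layout (three subpixels per pixel) versus the common PenTile RGBG layout (two subpixels per pixel). The layouts and counts are my own illustrative assumptions, not figures from Samsung or Clairvoyante.

```python
# Rough subpixel-count comparison for a 960x640 panel.
# Assumption: RGB stripe uses 3 subpixels per pixel; PenTile RGBG uses 2.

PIXELS = 960 * 640                       # 614,400 logical pixels

rgb_stripe_subpixels = PIXELS * 3        # 1,843,200 subpixels to pattern
pentile_rgbg_subpixels = PIXELS * 2      # 1,228,800 subpixels to pattern

savings = 1 - pentile_rgbg_subpixels / rgb_stripe_subpixels
print(f"RGB stripe:   {rgb_stripe_subpixels:,} subpixels")
print(f"PenTile RGBG: {pentile_rgbg_subpixels:,} subpixels")
print(f"PenTile needs {savings:.0%} fewer subpixels at the same pixel format")
```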

LIMIT TO HUMAN VISUAL ACUITY: There seem to be some questions about Steve Jobs’ remarks regarding the 300 PPI limit of our visual system. This is what Steve Jobs said about Apple’s Retina Display during his keynote at WWDC 2010, at around the 36-minute mark:

It turns out that there’s a magic number right around 300 pixels per inch that when you hold something around 10 or 12 inches away from your eyes is the limit of the human retina to differentiate the pixels. And so they’re so close together when you get at this 300 pixels per inch threshold that all the sudden things start to look like continuous curves. Like text looks like you’ve seen it in a fine printed book. Unlike you’ve ever seen on an electronic display before. And at 326 pixels per inch we are comfortably over that limit and it’s extraordinary.

And this is what Wired’s Brian Chen wrote:

The iPhone 4’s screen may be the best mobile display yet, but its resolution does not exceed the human retina, as Steve Jobs claims.

I know I’m being picky here, but Steve Jobs never claimed the resolution of the Retina Display used in the iPhone 4 exceeded the human retina. Chen is comparing apples to oranges. He probably meant to compare the 326 PPI of the iPhone 4’s 3.5-inch display to the maximum resolution the human retina is capable of viewing, deciphering, resolving. To me there is a big difference, and I had to point it out. The reason I am searching for words to describe what our visual system is doing is that, in my opinion, it is not simply a matter of measuring the discrete biological capability of sight.*

When Jobs mentioned “the human retina to differentiate the pixels” he was referring to visual acuity, which in turn means the ability to distinguish fine detail. The biological components that enable visual acuity are the cones, and acuity is usually measured in cycles per degree (CPD). CPD measures angular resolution, which is the ability to differentiate one object from another in terms of visual angles. You can read a lot more about visual acuity on Wikipedia. Visual acuity differs from person to person, but the theoretical maximum resolution is reportedly 50 CPD, which translates into 1.2 arcminutes per line pair, or a 0.35-mm line pair at one meter. Visual acuity is also limited by diffraction, aberrations, refractive error, illumination, contrast, the location of the retina being stimulated, pupil size and photoreceptor density in the eye. How does this relate to pixels per inch?
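The conversion from cycles per degree to a physical line-pair size is straightforward trigonometry; the short Python sketch below reproduces the 50 CPD → 1.2 arcminutes → 0.35 mm figures quoted above (my own variable names, standard math only).

```python
import math

# Convert the quoted 50 cycles-per-degree figure into a line-pair size at 1 meter.
CPD = 50                                  # theoretical maximum visual acuity
ARCMIN_PER_DEGREE = 60

arcmin_per_line_pair = ARCMIN_PER_DEGREE / CPD        # 1.2 arcminutes per cycle
angle_rad = math.radians(arcmin_per_line_pair / 60)   # back to degrees, then radians
line_pair_mm = 1000 * math.tan(angle_rad)             # size subtended at 1,000 mm

print(f"{arcmin_per_line_pair:.1f} arcmin per line pair")   # 1.2
print(f"{line_pair_mm:.2f} mm line pair at one meter")      # ~0.35 mm
```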

TWO EIGHT SEVEN: Thankfully Dr. Raymond Soneira, president of DisplayMate Technologies, did the math converting angular resolution to linear resolution. The result is 477 PPI, which corresponds to the theoretical maximum visual acuity of the human visual system at 12 inches. The 326-PPI Retina Display falls quite short of this limit. But keep in mind this is a theoretical maximum; for folks with normal 20/20 visual acuity, the PPI we can differentiate is somewhat less than the maximum 477 PPI at 12 inches. So, what exactly is that PPI for normal folks like you and me? According to the NDT Resource Center and Wikipedia, 20/20 vision corresponds to a visual acuity of one arcminute per line pair, which means that at 12 inches we can differentiate something as small as 0.00349 inches. You can cram almost 287 of these little objects into an inch, which in turn represents a resolution of 287 pixels per inch.
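Here is a minimal Python sketch of the same conversion at a 12-inch viewing distance, reproducing both figures: roughly 477 PPI if you assume the theoretical maximum of 0.6 arcminute per pixel (half of the 1.2-arcminute line pair), and roughly 287 PPI at the 20/20 figure of one arcminute. Reading Soneira’s 477 PPI as 0.6 arcminute per pixel is my interpretation of how his number falls out of the math.

```python
import math

def ppi_at(arcmin_per_pixel: float, distance_in: float = 12.0) -> float:
    """Pixels per inch when one pixel spans the given visual angle at a distance."""
    pixel_size_in = distance_in * math.tan(math.radians(arcmin_per_pixel / 60))
    return 1 / pixel_size_in

print(f"theoretical maximum (0.6 arcmin/pixel): {ppi_at(0.6):.1f} PPI")  # ~477
print(f"20/20 vision (1.0 arcmin):              {ppi_at(1.0):.1f} PPI")  # ~286.5, i.e. almost 287
```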

So when Jobs said, “It turns out that there’s a magic number right around 300 pixels per inch that when you hold something around 10 or 12 inches away from your eyes is the limit of the human retina to differentiate the pixels,” he was generally right. He was addressing ordinary folks, and for most of us his statement about 300 PPI as a limit of visual acuity was not an exaggeration. My guess is that the Retina Display will go down in history as the very first mobile display designed with the general limits of the human visual system in mind. Of course there is that simple coincidence: the number of pixels is doubled on each axis, so current iPhone apps look on the iPhone 4 exactly as they did on previous models (read iPhone HD: 960×640 Confirmed?).
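A quick sketch of why the doubling works out so cleanly: with exactly 2× the pixels on each axis, every 1×1 “point” of an existing 480×320 app maps onto a whole 2×2 block of physical pixels, so nothing has to be resampled at awkward fractional ratios. The function below is a hypothetical illustration of that mapping, not Apple’s actual scaling code.

```python
# Hypothetical illustration of 2x pixel doubling (not Apple's actual scaler).
# A point at (x, y) on the old 480x320 grid covers a clean 2x2 block of
# physical pixels on the 960x640 Retina Display.

SCALE = 2  # 960/480 == 640/320 == 2, an exact integer ratio

def point_to_pixel_block(x: int, y: int) -> tuple[int, int, int, int]:
    """Return (left, top, width, height) of the physical-pixel block for one point."""
    return (x * SCALE, y * SCALE, SCALE, SCALE)

# The bottom-right point of a 480x320 app lands exactly on the display edge:
print(point_to_pixel_block(479, 319))   # (958, 638, 2, 2) -> ends at pixel 959, 639
```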

Update: I came upon Phil Plait’s article Resolving the iPhone resolution on Discover Magazine, which reached a similar conclusion to mine. Plait is an astronomer, lecturer and author who worked for ten years on the Hubble Space Telescope.

So in a sense, both Jobs and Soneira are correct. At the very worst, you could claim Jobs exaggerated; his claim is not true if you have perfect vision. But for a lot of people, I would even say most people, you’ll never tell the difference. And if you hold the phone a few inches farther away it’ll look better.

I also bumped into Dr. William Beaudot, Founder & Chief Scientist of KYBERVISION Consulting, R&D, who came to a similar conclusion in his article About Apple “Retina Display” in iPhone 4:

I believe that for most people a reasonable viewing distance when reading a book or using a mobile device is between 10″ and 20″. Under this normal range of viewing conditions, Apple “Retina Display” would have the capacity to span the full range of normal visual acuity, from 20/20 at 10″ to 20/12 at 18″, further justifying Apple’s claims. That would be my take-home message in this ongoing controversy.

* The interaction between the eyes and the brain is extremely complex. I don’t assume I know enough, but it is commonly understood that viewing is reactive: we see and then we recognize. From a few books I have read, however, it isn’t that simple; it turns out the way we see things is quite the opposite: almost always we already know or expect what we are going to see. This applies to situations and environments we are familiar with, which is most often the case.

Allow me to point out just one more thing about the complexity of our human visual system. Our vision system, just the part that takes in photons, has a lot of different parts: cornea, pupil, iris, lens, anterior & posterior chambers, ciliary muscle, suspensory ligament, zonular fibres, choroid, sclera, vitreous humour, hyaloid canal, fovea, optic disc, to name a few. The retina is just one of many parts that must all work seamlessly for photon gathering to work. The retina contains a mosaic of two types of photoreceptors: rods and cones.

Rods are sensitive to blue/green light with a peak sensitivity at a wavelength of 498 nm. If you are at all interested in photography, you will know that the main thrust of sensor development has shifted away from simply adding more pixels and toward building sensors with photodiodes that are more sensitive to light. In some cases the photodiodes are getting bigger. What the industry is doing is improving the “rods” of the image sensor: the rods in the human visual system are used for vision in low-light environments. And then there are the cones, which come in three varieties: L, M and S. The L cones are sensitive to light with a wavelength of 564 nm, which our vision system sees as red. The M cones are sensitive to 533 nm (green) and the S cones to 437 nm (blue). Most of these cones are concentrated in the fovea, an area near the center of the retina, where their maximum concentration is about 180,000 per square millimeter. My point in providing all this information about the complexity of our visual system is that each of us is distinct, with slightly different expectations of what we think we should see; although there will be a common experience, there will also be individually unique experiences with displays.