iPhone 4S

The iPhone 4S is many things, but it is not a mere evolutionary iPhone. The iPhone 4S is revolutionary.

DISPLAY

Many were expecting a totally revamped iPhone 5 with a larger display. That was reasonable considering the iPhone 4 had been on the market for 18 months and we’d grown accustomed to incremental enhancements every year and totally new iPhones every two. But let’s take a closer look at some of these expectations and figure out whether they were really reasonable.

Let’s start with the display. Just look at the competition: displays are getting big, really big. These guys are packing more pixels too, more than even the 960×640 you see on the iPhone 4’s Retina Display. The latest and greatest 4.x-inch displays are moving to 1280×720. These specs on a smartphone are close to nirvana for videophiles: pure, unscaled 720p HD video playback. Larger displays and more and more pixels certainly had an impact on our expectations for the next iPhone.

The iPhone is more than just a movie playback device. Actually, if I were to rank the importance of HD video playback, it would at best be number two. The most important aspect of the iPhone is that it is the culmination, the end point, of tens of thousands of amazing developers creating brilliant apps, using smooth-as-butter tools, for a rock-solid hardware platform including a stable display subsystem. 3.5 inches. 960×640. 326 ppi. These three unwavering specs allow tremendous creativity to pour out of Apple’s iOS developer ecosystem.

The 326-ppi Retina Display reigns supreme when it comes to resolution. There is no need to go any higher, and going higher would bring difficult challenges anyway. A bigger display sounds unlikely too. What would a larger iPhone display look like?

A larger iPhone display will need to maintain Retina Display status and be at least 326 ppi. Apple wouldn’t step down in terms of what I believe is the most important specification on the display. And herein lies the challenge. A larger display will need more pixels to maintain the same resolution. But once you consider adding more pixels there is only one option: the pixels will need to double, again. Remember the Retina Display is there for us to enjoy, but we only get to enjoy all the amazing apps when the developer ecosystem is healthy, and health comes from a stable display subsystem. So pixels will need to be doubled. Doubling the pixels in each direction would mean a pixel format of 1920×1280. We’re already up to 1280×720 on 4.5-inch displays, so pushing that a bit further to 1920×1280 might become a reality, but it will take some time. A 4.0-inch 1920×1280 display would mean a resolution of 577 ppi: incredibly high, but not impossible, and it would make it very easy for developers to double up on pixels.
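
To see where that 577 ppi figure comes from, here is a minimal sketch of the pixels-per-inch arithmetic using the resolutions and diagonal sizes discussed above; the helper function is mine, purely for illustration.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel count along the diagonal divided by the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# iPhone 4/4S Retina Display: 960×640 on a 3.5-inch panel
print(round(ppi(960, 640, 3.5)))    # ≈330 (Apple quotes 326, implying a diagonal closer to 3.54 inches)
# Hypothetical pixel-doubled display: 1920×1280 on a 4.0-inch panel
print(round(ppi(1920, 1280, 4.0)))  # ≈577
```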

But I don’t think this is where Apple is going. I think 326 ppi is more than good enough on a smartphone. In my mind 3.5 inches, 960×640, and 326 ppi will be with us for a while. These three specification pillars will be what all other display-related advancements will build upon. Instead of cramming more pixels into the display Apple will likely focus on improving those pixels.

What can be improved upon? First is color. The 3.5-inch IPS LCD panel is 8-bit, meaning each sub-pixel can display 256 levels of gray for a total of 16.7 million colors per pixel. That’s plenty of colors, but the quality of those colors needs to be better. Reds need to be more red, greens more green, and blues more blue. To do that the color gamut needs to be widened, coupled with color calibration capabilities. Apple might work directly with color filter manufacturers. Or pigment suppliers. Or maybe with manufacturers of the backlight unit, optical films, or LEDs. There are a large number of components that make up the 3.5-inch IPS LCD, and all would need to be improved, but I think there is a better way.
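
As a quick aside before getting to that better way, here is where the 16.7 million figure above comes from; a minimal sketch of the per-pixel color arithmetic, using the 8 bits per sub-pixel cited above.

```python
bits_per_subpixel = 8
levels = 2 ** bits_per_subpixel   # 256 gray levels for each red, green, and blue sub-pixel
colors = levels ** 3              # three sub-pixels per pixel
print(levels, colors)             # 256 16777216, roughly 16.7 million colors
```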

There are a fair number of really smart folks who look to OLED displays as the solution for a thinner display with better color and contrast. But there are three big problems that need to be fixed. One is differential aging, or in plain English: the red, green, and blue OLED materials get dimmer at different rates. The biggest challenge is to get the blue OLED to last longer, but overall OLEDs don’t last long enough, especially if they are driven hard to produce bright text, images, videos, etc. One observable result is bluish screens on smartphones in South Korea. Because the blue OLED dies out faster, there is compensation circuitry built into most OLED smartphones: the blue OLED is driven harder to compensate for its faster decline in brightness. So two- to three-year-old smartphones tend to have a bluish tint on the OLED display. That also means the experience of viewing those OLED displays is terrible and close to being unusable.

The second big problem is power consumption. OLEDs on average consume considerably more power than LCDs, especially when you’re doing anything other than viewing video. Video is different because on average the content has lower brightness than, say, a presentation slide or a webpage. There are two ways for OLEDs to last longer and consume less power: redesign the user interface (UI) to be mostly black, similar to RIM’s current BlackBerry operating system (OS), and/or focus on video playback. The latter seems to be what the competition is focusing on with 16:9 aspect ratios.

The third big problem is that OLED sub-pixels at the moment can’t be made as small as sub-pixels on an LCD. Using an RGB sub-pixel structure it is not possible at the moment to cram 960×640 pixels into a 3.5-inch display. This is one reason why Samsung is pushing for bigger displays on its smartphones: the only way to get 1280×720 pixels on an OLED display is to make it about 4.5 inches. Resolution in terms of ppi on an OLED display lags behind that of LCDs, by a good margin. So I don’t see Apple incorporating OLED displays in next generation iPhones until these problems are fixed, which may take a while. But that doesn’t mean OLED is completely out of the picture for future iPhones. OLED technology can be used in a different way.

One thought before we move on. There is the possibility, albeit small, that Apple would develop a larger 4.x-inch iPhone. If Apple took this route it would mean the company had decided something was more important than maintaining the 326 ppi resolution and was worth lowering it for. Even with a bigger display I don’t think Apple would increase the number of pixels unless it could double what we have today, and since that would be difficult I’m assuming the pixel format would stay the same at 960×640. The only thing I can imagine that is worth reducing resolution for is adopting OLED display technology. Not any old OLED display, but transparent OLED panels, two of them, for a real 3D display. This could be possible, but I still think Apple will at least maintain a resolution of 326 ppi on its iPhones going forward. Maybe one day OLED sub-pixels can be made as small as LCD sub-pixels, last as long, and overcome the differential aging problem. When those conditions are met Apple may develop an iPhone with an OLED display, with the possibility of using two transparent ones for a true 3D display. For now though I think Apple may use OLED technology in a different way.

There is RGB OLED display technology. And there is white OLED lighting technology. White OLEDs can be used as a light source for LCDs. We already see white OLED lights on the market. Instead of the LED backlights that are currently being used as the light source for LCDs, white OLEDs can be used. And there are some significant benefits to doing this.

There are two kinds of LED backlights: edge-lit and backlit. Edge-lit LED backlights are found in everything from iPads to MacBooks to iMacs, pretty much everything aside from high-end TVs, which have backlit LED backlights using hundreds or even thousands of LEDs. With a backlit LED backlight you get local dimming that greatly improves contrast, but you also get a slightly thicker, more expensive, and more power-hungry display. Edge-lit LED backlights are thinner, cheaper, and consume less power, but don’t perform as well as backlit ones. Replace the LEDs with OLEDs and you get something thinner with great performance, greater performance actually.

OLEDs can be made really thin, so they can be integrated into a backlit design. And because OLEDs can be made much smaller than LEDs, local dimming can be enhanced to the pixel level. A black pixel on an OLED-backlit LCD will be absolutely black, just like on an OLED display. There are two things holding this solution back: power consumption and cost. It’ll be a while before we see an OLED-backlit LCD, but I believe it’s coming. In the meantime Apple will concentrate on improving LCDs with LED backlights, but the focus will not be on cramming more pixels into the display or making the display bigger. Instead Apple will likely focus on enhancing those pixels. So I’m not expecting a larger display on the iPhone 5, just a much better one.
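
To make the local dimming idea concrete, here is a minimal sketch, not any actual display pipeline, of how backlight zones behave as they shrink all the way down to a single pixel; the function and the sample values are mine, purely for illustration.

```python
def dim_backlight(target: list[float], zone_size: int) -> list[float]:
    """Split a row of target luminances (0.0 to 1.0) into backlight zones and set each
    zone to the brightest pixel it has to serve; the LC layer in front then attenuates
    the rest. zone_size = len(target) models a single always-on backlight, while
    zone_size = 1 models pixel-level dimming, where a black pixel gets no light at all."""
    backlight = []
    for i in range(0, len(target), zone_size):
        zone = target[i:i + zone_size]
        backlight.extend([max(zone)] * len(zone))
    return backlight

row = [0.0, 0.0, 0.9, 0.1]       # a mostly dark row with one bright pixel
print(dim_backlight(row, 4))     # [0.9, 0.9, 0.9, 0.9] -> dark pixels leak light
print(dim_backlight(row, 1))     # [0.0, 0.0, 0.9, 0.1] -> black stays absolutely black
```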

CAMERA

It may not have optical zoom, a really big lens, or tons of megapixels, but the integration of key component advancements makes the camera system in the iPhone 4S the very best on a phone today. Apple made major advancements to three components: the image sensor, the lens, and the software.

I don’t like to see the same-sized image sensor packing more pixels. That to me only means one thing: image quality has gone down. In the case of the iPhone 4S camera system there is ample evidence that image quality has gone way up despite a 60% increase in the number of pixels, from five megapixels to eight, on the same-sized image sensor.
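
To put that trade-off in numbers, here is a minimal sketch, assuming the sensor area really is unchanged, of how much smaller each pixel has to get when the count climbs from five megapixels to eight.

```python
import math

old_mp, new_mp = 5.0, 8.0
increase = (new_mp - old_mp) / old_mp   # 0.60, the 60% jump in pixel count
area_ratio = old_mp / new_mp            # each pixel now covers 5/8 of its former area
pitch_ratio = math.sqrt(area_ratio)     # ≈0.79, so pixel pitch shrinks by roughly 21%
print(f"{increase:.0%} more pixels, {1 - pitch_ratio:.0%} smaller pixel pitch")
# 60% more pixels, 21% smaller pixel pitch
```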

The eight megapixel CMOS image sensor has pixels with higher full-well capacity. This simply means that these pixels can capture more light than those on previous sensors. Like the image sensor on the iPhone 4, the one in the 4S makes use of backside illumination, or BSI. Simply put, BSI moves all of the circuitry from the top of the sensor to the backside, or the bottom. Circuitry on top of the sensor blocks light, and the goal is to get as much light as possible absorbed by each pixel on the sensor. BSI allows more light to get to the pixels by minimizing light blockage. BSI combined with higher full-well capacity pixels should bode well for superb images on the iPhone 4S.

Apple integrated a hybrid infrared (IR) light filter to prevent IR contamination, which can affect the accuracy of colors, resulting in color shifts. Most high-end digital SLRs (DSLRs) incorporate glass-based IR filters over the image sensor, and Apple has incorporated one into the iPhone 4S, a smartphone. Without an IR filter a magenta or purple cast can appear in certain photographs; for example, skin can look sunburned. More light with more accurate colors is absorbed. A good start.

The iPhone 4S uses a custom lens. I’m guessing Apple took a lens subsystem that was readily available and worked together with a manufacturer to customize it, so I don’t think you’ll see this on other phones. The lens component on the iPhone 4S uses five precision elements, up from four in the iPhone 4. One of the features Apple focused on is sharper images. You get that with more, higher performing pixels on the image sensor, but you also need better optics to match that sensor. So my guess is the extra lens element in the iPhone 4S counteracts blurring, resulting in improved sharpness.

The lens also has a larger aperture, and a larger aperture means more light passes through the lens. The iPhone 4 lens system had an aperture of f/2.6. The five-element custom lens system on the iPhone 4S sports a brighter f/2.4. Low-light photography gets better with a faster, brighter lens. Although the iPhone 4’s low-light photo capabilities were significantly better than those of the 3G/3GS camera system, Apple overhauled every element to improve the overall experience of taking pictures and videos in low-light environments on the iPhone 4S. A sharper, brighter lens passes light to an eight megapixel image sensor that absorbs more light per pixel with more accurate colors. All of these improvements point to a vastly superior camera system on the iPhone 4S.
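
For a rough sense of what that aperture change buys, here is a minimal sketch of the standard f-number arithmetic applied to the f-numbers cited above; light gathered scales roughly with the inverse square of the f-number.

```python
f_iphone4, f_iphone4s = 2.6, 2.4   # the f-numbers cited above

# Light through a lens scales roughly with 1 / (f-number squared), so the ratio of
# the squares tells us how much more light the brighter lens lets in.
light_ratio = (f_iphone4 / f_iphone4s) ** 2
print(f"~{light_ratio - 1:.0%} more light reaches the sensor")   # ~17% more light
```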

The dual-core A5 makes everything faster, including taking photos. With zero shutter lag, a photo is captured the instant you press the volume up button. So that’s hardware.

On the software side, iOS 5 brings two features that make the iPhone a better photographic tool. One is the use of the volume up button as the shutter button. This makes taking pictures with both the iPhone 4 and 4S more camera-like: instead of tapping on the screen you press a physical button. For me tapping on the screen has been at best uncomfortable, and I wholeheartedly welcome the use of the volume up button as the shutter.

The second feature of iOS 5 puts the camera icon right next to the unlock bar. Instead of having to unlock the iPhone first you simply tap the camera icon and start shooting. This greatly reduces the time it takes from “I gotta get this shot!” to “Oooh. Take a look at this photo!”

Video capture has been improved from 720/30p to 1080/30p. Although this is welcome, I’d like to see 1080/60p in the very near future. Most TVs, monitors, notebooks, etc. refresh their displays at 60 Hz. Of course, for film buffs the ability to capture 1080/24p is a must.

Real-time video image stabilization has been coupled with 1080/30p video. My guess is that it is a digital IS implementation instead of an optical one, but from the presentation given by Apple the effect was notable: video shake was reduced and watching the video no longer induced headaches. In addition Apple introduced temporal noise reduction during video capture. After searching online and reading some white papers I have little to no understanding of what temporal noise reduction is and how it works. Nonetheless I hope Apple didn’t implement noise reduction too aggressively, as it can degrade image detail and clarity.
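
For what it’s worth, the general idea behind temporal noise reduction, as it is usually described and not necessarily as Apple implemented it, is to average information across consecutive frames so that random noise cancels out while real detail survives. Here is a minimal sketch of that idea; the function and sample values are mine.

```python
def temporal_denoise(frames: list[list[float]], strength: float = 0.5) -> list[list[float]]:
    """Blend each new frame with the running result of the previous frames. Random
    sensor noise changes from frame to frame and averages away, while static detail
    survives. strength = 1 disables the filter; real implementations also detect
    motion so moving objects aren't smeared, which is omitted here."""
    result = [frames[0]]
    for frame in frames[1:]:
        prev = result[-1]
        result.append([strength * new + (1 - strength) * old for new, old in zip(frame, prev)])
    return result

# Three noisy captures of the same flat gray scene (true value 0.5)
noisy = [[0.55, 0.46], [0.48, 0.53], [0.52, 0.49]]
print(temporal_denoise(noisy)[-1])   # values pulled toward 0.5, closer than any single raw frame
```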

As I was writing this article I came across an unblur plugin for Photoshop that was demonstrated during the MAX conference. What this plugin does is analyze the motion that led to the blurring of a photo and then undo it. Quite amazing, but it requires some heavy duty computational power. I can’t imagine Apple not integrating an even more advanced version of this into a future camera app. Imagine for a second an automatic unblur feature that makes every photo and video you take as perfect as it can be.

The camera system in the iPhone 4S has been completely revamped with features that are found in much more expensive cameras. I would be surprised to see folks carrying both a point-and-shoot compact camera and an iPhone 4S. And I wouldn’t be all that surprised if some folks decide to sell their bigger more expensive cameras. The best camera is the one you have at the moment and for a great majority of owners that camera will be the iPhone 4S.

SIRI

The Defense Advanced Research Projects Agency, or DARPA, has a program called Personalized Assistant that Learns (PAL). DARPA awarded the Stanford Research Institute (SRI) the first two phases of a five-year contract to develop “an enduring personalized cognitive assistant”. SRI named this project Cognitive Assistant that Learns and Organizes, or CALO. The goal for CALO was to develop software that learns from interacting with and receiving advice from users to carry out automated, interrelated decision-making tasks. Lots of acronyms and lots of fancy sounding words, but I’m almost to the important part.

Adam Cheyer was Program Director of SRI’s Artificial Intelligence Center and Chief Architect of CALO. Cheyer also co-founded Siri, Inc., which was later purchased by Apple. He is now Director of Engineering in the iPhone group. DARPA. PAL. SRI. CALO. Adam Cheyer. And now Siri on the iPhone 4S. There’s something big going on, and here are my thoughts. But before I get started, a quote from Steve Jobs during his iPad 2 keynote presentation in March 2011:

It is in Apple’s DNA that technology alone is not enough—it’s technology married with liberal arts, married with the humanities, that yields us the results that make our heart sing.

The main point I’d like to make is this: Apple’s understanding of the human heart runs deep, and the efforts of hundreds of brilliant minds bear fruit in the iPhone 4S, and especially in Siri.

Let’s look back and see if we can identify some milestones that led us here, that led Apple here. The term personal digital assistant or PDA became commonplace when the smart folks at Palm came out with the Pilot. I have a working Pilot Professional and it is brilliantly simple to use. Once you learn the idiosyncrasies of Graffiti it’s a breeze to enter notes, calendar items, and contact information. There is a lot more you can do with the Pilot but the three I mentioned were what most people did most of the time. Now the Pilot wasn’t quite a personal digital assistant; it was more of a portable tool that digitized our notes, calendar, and address book.

The next major milestone was the smartphone. The smartphone combined features of the PDA and the mobile phone. By marrying the two we gained more conveniences. For instance, we were able to dial directly from the address book by selecting the person we wanted to call. And vice versa: save the phone number from the caller ID service directly into our address book. We also gained location and direction information thanks to technologies like GPS, maps, gyroscope, proximity sensor, etc. But the smartphone was still merely a portable tool. Better, with more conveniences, but we continued to do all of the work. If we wanted to call a friend we had to launch the address book, find the friend, and select the number we wanted to dial.

I consider voice recognition technology, layered on top of the PDA and the smartphone, to be the third milestone. Let’s use calling a friend as an example. Instead of the three step process I mentioned above, with voice recognition we press the home button until Voice Control engages. We then say, “Call John Doe.” Voice Control will either respond by confirming that it is calling John Doe or ask for clarification if more than one number is associated with that contact. It works most of the time and adds even more convenience, albeit in a limited way. Limited because Voice Control works with only a limited number of apps on the iPhone. Siri is the next milestone and takes voice recognition a giant leap forward.

I’d like to frame Siri in the context of concierge services. A concierge in a broad sense is someone who takes care of you. If you’re at a hotel and want to know the best Italian restaurant nearby, you call the concierge desk. Executives have a different type of concierge: their executive assistants. Need long stem roses ordered, paid for, and delivered to your wife? Just ask the executive assistant. And then there is a different type of concierge service, one that comes with the products you buy. I’ll talk about two.

One is the American Express Business Centurion, better known as the Black Card. With it you gain access to its personal concierge service. Just like with an executive assistant, you call your personal concierge and ask for practically anything: dinner reservations at your favorite restaurant, finding a rare piece of antique jewelry, and more. The other is Nokia’s Vertu, a really expensive mobile phone that comes with Vertu Concierge. Press a button on the Vertu and you’re connected to your concierge, who assists you with almost anything you can imagine.

Siri is a concierge, built in. Instead of a real person Siri is digital, so there are some limitations, but its understanding based on the context of the conversation is almost humanlike. By linking to other services like Wolfram Alpha, Siri can significantly augment its access to the information you are looking for. Siri is in beta, but the demo during the “Let’s talk iPhone” Apple event was absolutely amazing. Adam Cheyer and his team at Apple have accomplished something truly remarkable.

What is a phone? And what will we expect from a phone? What type of experiences will we demand from our phones? To start off, there must be complete integration between software and hardware; aside from Apple, no other company has that. Siri could not work if it didn’t have complete access to a perfectly tuned hardware and software system. I believe David Pogue coined the term “app phone” to distinguish smartphones that have a platform for app development and distribution from those that do not. I think the iPhone 4S goes beyond that definition. The iPhone 4S changes the game and is the birth of something entirely new.

In Scandinavia, Siri is a girl’s name that means beautiful and victorious. Isn’t that fitting? Since returning to Apple in 1997 Steve Jobs beautifully crafted the company into the most valuable company in the world. Number one. And Apple is a company that in turn fuses technology and the humanities together to form beautiful products and services that we love. I firmly believe Siri is and will be one of Steve Jobs’ many brilliant, beautiful victories.

LOVE

As I began writing this article Steve Jobs passed away. We live in a world filled with people who are habituated to the way they do things only because they’ve been doing those things for a long time. Imagine how much spiritual, mental, physical, and organizational friction Steve Jobs must have fought through to get everyone on board, to get everyone working together to the best of their capacity, to complete the first iPhone. He did this over and over again, with industries, with companies, with people. Steve Jobs believed he could change the world, a world inherently and stubbornly resistant to change. He did change the world, many times over. But it cost him his life.

In my mind the iPhone 4S is special. The S stands for speed of course. It may also stand for Siri. But for me it stands for Steve. Apple looks out four to five years into the future and plans a series of updates to their products and services so I’m sure we’ll continue to see some amazing stuff from Apple in the next several years, stuff that came straight out of Steve Jobs’ heart and mind. But the iPhone 4S might very well be the last iPhone that Steve Jobs saw through from beginning to end.

I’ve only seen Steve Jobs in person twice. The first time was at Macworld when he introduced the original iPhone. He introduced the future of smartphones, and I was mesmerized. Six months later the iPhone went on sale on June 29, on my birthday. It was fate. Later that day I made a spirited drive through Stevens Creek to the Valley Fair Apple Store. The applause, the smiles, the excitement around me, they were all a blur. All I wanted to do was sit down, and experience my very own iPhone.

The second time was much later. I had lunch with a friend at the cafeteria at 1 Infinite Loop and as I came out of the building, and after crossing the street to the parking lot, I noticed him. He was thin but healthy looking and in his usual attire. He looked a lot taller than I had imagined. And I noticed a slight bounce in his walk. A joy in his steps. Enthusiasm. Energy. I’m not absolutely certain but I think he was smiling. My first instinct was to cross the street and say hello, but I didn’t want to bother him. So I just stood there, watched him stride into Building 1, and disappear. Those were the two times I saw Steve Jobs in person.

A lot of people have shared their reflections about Steve Jobs’ Stanford commencement speech and for good reason. He opened his heart and showed us a glimpse of himself, an ordinary person, with extraordinary candor. He shared a lot of wisdom and one that stuck with me was his charge to everyone to find what you love, to be who you are. You can’t find what you love unless you have the courage to be who you are; and you can’t be who you are unless you’re doing what you love. Time is limited and we shouldn’t waste what little time we have on doing things we don’t love, trying to be who we are not. He clearly took his own advice. He found love alright. Two in fact: Apple and his family. And because he was who he was we were all better for it.

That little bounce in his steps, that’s probably just how he walked. I’d also like to think the little bounce was that extra dose of joy as he went back and forth between the two loves of his life.