[ Apple ] Here are a few takeaways from today’s iPhone 7 announcement by Apple.
Optical Image Stabilization
This definitely deserves a "finally." Finally, the regular non-Plus iPhone 7 also gets optical image stabilization. It sure took Apple a while. With the iPhone 7 you should be able to get sharper shots even if your hands are a little jittery, and somewhat better photos when there isn't a lot of light.
The lens is getting faster too: according to Apple, it lets 50% more light onto the image sensor than the iPhone 6s did. It's not the fastest on the market, though; the Samsung Galaxy S7 Edge sports a wider aperture of f/1.7. The smaller the f-number, the better the lens is at gathering light. Aperture isn't everything, but it's one of the most important parts of a solid camera. Note: the second camera in the iPhone 7 Plus (the telephoto camera) has an aperture of f/2.8.
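To put those f-numbers in perspective: light gathered scales with the inverse square of the f-number. A quick back-of-the-envelope calculation (my own arithmetic, not Apple's, using the published f/2.2 for the iPhone 6s and f/1.8 for the iPhone 7) roughly reproduces Apple's "50% more light" claim:

```python
def light_ratio(f_slow: float, f_fast: float) -> float:
    """How much more light the faster (smaller f-number) lens gathers,
    assuming everything else about the two cameras is equal."""
    return (f_slow / f_fast) ** 2

# iPhone 6s (f/2.2) vs iPhone 7 (f/1.8): close to Apple's "50% more light"
print(f"iPhone 7 vs 6s:     {light_ratio(2.2, 1.8):.2f}x")  # ≈ 1.49x
# iPhone 7 (f/1.8) vs Galaxy S7 Edge (f/1.7): a modest edge for Samsung
print(f"S7 Edge vs iPhone 7: {light_ratio(1.8, 1.7):.2f}x")  # ≈ 1.12x
```

So the S7 Edge's aperture advantage is real but small; the gap between generations of iPhone is much bigger.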
Quad-LED True Tone Flash
The flash is 50% brighter than the one in the iPhone 6s, and its color temperature adjusts to match the color temperature of the environment. This is something professional camera brands like Canon, Nikon, Sony, and Samsung should take note of: develop a flash that senses the ambient color temperature and matches it exactly, and sell it for under a gazillion dollars.
12 Megapixel Image Sensor
Apple didn’t say whether the image sensor is larger than the one in the iPhone 6s or 6s Plus, but that may well be the case, because the camera bump is noticeably larger. Keep the iPhone thin and live with the camera bump: who made that decision? In my opinion it was the wrong one.
More pixels don’t mean better photos unless the pixels on the image sensor are also better at capturing light. They have to be better, because with the same underlying technology, the smaller the pixels, the worse they perform in low light. My guess is that Apple is using software (algorithms, to be more specific) to make up for the hardware deficit of going with more megapixels.
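The pixel-size tradeoff is easy to see with some rough arithmetic. Apple doesn't publish the sensor dimensions, so the figures below are an assumption (a typical 1/3"-class smartphone sensor of about 4.8 × 3.6 mm); the point is only that cramming 12 megapixels into a sensor this small leaves each pixel barely more than a micrometre wide:

```python
import math

# Assumed sensor dimensions: a common 1/3"-class smartphone sensor.
# Apple did not disclose the actual size; these numbers are illustrative.
SENSOR_W_MM = 4.8
MEGAPIXELS = 12e6
ASPECT = 4 / 3  # iPhone stills are 4:3

# For a 4:3 grid of N pixels: width_px * (width_px / ASPECT) = N
pixels_wide = math.sqrt(MEGAPIXELS * ASPECT)       # ≈ 4000 px across
pitch_um = SENSOR_W_MM * 1000 / pixels_wide        # micrometres per pixel

print(f"~{pixels_wide:.0f} px wide, ~{pitch_um:.2f} µm pixel pitch")
```

A pixel around 1.2 µm wide gathers far less light than the 4–6 µm pixels in a large-sensor camera, which is exactly the deficit the software has to paper over.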
Image Signal Processor (ISP)
The Apple-designed ISP is built into the A10 Fusion chip. It enables the iPhone 7 and 7 Plus to perform 100 billion operations every time you take a shot. What are those operations? Faster focus, improved local tone mapping, white balance adjustment, and a bit of machine learning.
The iPhone 7 Plus has two cameras. Both are 12 megapixels, but one is wide-angle (the same camera as in the iPhone 7) and the other is a telephoto camera. Apple uses the pair to enable 2x optical zoom and up to 10x digital zoom. A future software update will add a digitally created bokeh effect in Portrait mode.
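It's worth remembering what "10x digital zoom" means here. Past the telephoto lens's native 2x, zooming is just cropping the 12 MP sensor, so resolution falls off with the square of the extra zoom. A rough sketch of that arithmetic (my own simplification, not Apple's actual pipeline, which also upscales and sharpens the crop):

```python
def effective_megapixels(total_zoom: float, optical_zoom: float = 2.0,
                         sensor_mp: float = 12.0) -> float:
    """Approximate sensor megapixels left after cropping for digital zoom.

    Zoom up to `optical_zoom` is handled by the telephoto lens at full
    resolution; anything beyond that is a crop of the 12 MP frame.
    """
    digital_factor = max(total_zoom / optical_zoom, 1.0)
    return sensor_mp / digital_factor ** 2

print(effective_megapixels(2.0))   # 12.0  -> pure optical, full resolution
print(effective_megapixels(10.0))  # 0.48  -> a 5x crop of the 12 MP frame
```

By this estimate, at the full 10x you're working from well under a megapixel of real sensor data, which is why digital zoom always looks soft next to optical.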
I think it’s safe to say the iPhone 7 and 7 Plus cameras will be among the best on any smartphone. What I’m not sure about is the machine-learning, ISP-based digital manipulation of photos. It leaves a bad taste in my mouth.
A friend of mine was telling me about another friend who is an amazing photographer, and showed me his Instagram account. There was a beautiful landscape photo with the night sky lit up with stars. Wait, I wouldn’t call it a photo; it was more a rendering, or a composite. My friend explained that the landscape was shot separately from the night sky, and then the two were merged in Photoshop.
This is how I decide whether or not a photograph is a photograph, and it’s pretty simple: if I had been there, could my eyes have seen it? If the answer is yes, it’s a photograph. If not, not. Clearly the iPhone 7 isn’t doing what this friend of a friend is doing, but I’m not sure the photos coming out of the iPhone 7 and 7 Plus will feel as organic. If the photos coming out of a Leica camera are on the organic end of the spectrum, multi-exposure Photoshopped composites are on the other end. I hope the iPhone 7’s photos won’t tend toward the Photoshopped end.