Apple executives discuss camera design and the iPhone 12

The iPhone 12 and iPhone 12 Pro went on sale last month, and the iPhone 12 mini and iPhone 12 Pro Max are available to pre-order now. With that, Apple executives are out there discussing the new devices, fueling the hype train.

This time around it’s Francesca Sweet, iPhone Product Line Manager, and Jon McCormack, VP of Apple’s Camera Software Engineering. The two executives spoke at length with PetaPixel, with the primary focus being the iPhone 12 lineup and Apple’s approach to designing camera systems. For the executives, it’s about “the moment”: being able to quickly take a photo, get back to what you were doing, and not be distracted by the technology itself.

McCormack believes it’s about getting the job done and whittling the process down to capturing a single frame:

“We replicate as much as we can to what the photographer will do in post,” McCormack said. “There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there.”

McCormack also talked about adding Dolby Vision support to the iPhone 12 lineup’s camera system:

“Apple wants to untangle the tangled industry that is HDR, and how they do that is leading with really great content creation tools. It goes from producing HDR video that was niche and complicated because it needed giant expensive cameras and a video suite to do, to now my 15-year-old daughter can create full Dolby Vision HDR video. So, there will be a lot more Dolby Vision content around. It’s in the interest of the industry to now go and create more support.”

Later in the interview, McCormack spoke about Apple increasing the sensor size in the iPhone 12 Pro Max — one of the distinct advantages of choosing the largest iPhone 12 variant.

“It’s not as meaningful to us anymore to talk about one particular speed and feed of an image, or camera system,” he said. “As we create a camera system we think about all those things, and then we think about everything we can do on the software side… You could of course go for a bigger sensor, which has form factor issues, or you can look at it from an entire system to ask if there are other ways to accomplish that. We think about what the goal is, and the goal is not to have a bigger sensor that we can brag about. The goal is to ask how we can take more beautiful photos in more conditions that people are in. It was this thinking that brought about Deep Fusion, Night mode, and temporal image signal processing.”

The full interview is absolutely worth a read, so go check it out at PetaPixel.

Did you pre-order an iPhone 12 Pro Max for the camera?