Deep Fusion

When Apple introduced the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max, along with the improved camera systems, it also showcased a feature called Deep Fusion. The feature makes an already impressive camera even better, but unfortunately it wasn't ready for public use when the new iPhones launched back in September.

That is changing, finally, with the public launch of iOS 13.2. Deep Fusion is now available to the public, which means that photos taken with the cameras in the iPhone 11 and iPhone 11 Pro will get even more impressive — especially if you like to wear sweaters (this is only kind of a joke).

That's the simple explanation for a genuinely complicated process, one that requires machine learning, multiple exposures of a single shot, and everything coming together in the background to produce even better photographs in specific situations. Deep Fusion won't be used for every photo taken with the iPhone 11's dual cameras or the iPhone 11 Pro's triple camera system, but the goal is that when it does kick in, the results speak for themselves.

Here’s Apple’s basic description of the feature:

iOS 13.2 introduces Deep Fusion, an advanced image processing system that uses the A13 Bionic Neural Engine to capture images with dramatically better texture, detail, and reduced noise in lower light, on iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max.

In short, it's all about the details.

So why is Apple adding this? It all comes down to the results for photos captured in medium-to-low-light scenarios. That’s where it stands out. But let’s go ahead and break down how it works first.

Better low-light imagery, less noise

With Deep Fusion, by the time you press the camera shutter button the software has already captured three frames of what you're shooting. It does this with a fast shutter speed in an effort to freeze the frame. Then, after you press the shutter button, the software captures three additional frames and, at the same time, a longer exposure to pull in even more detail.

To create what Apple calls a "synthetic long," the three normal frames are then combined with the long-exposure shot.

Deep Fusion will then select the short exposure image with the most detail available, and it will then automatically combine that with the synthetic long exposure that was created in the background. These two frames are what merge together, nothing more. (That’s what primarily sets Deep Fusion apart from the Smart HDR feature, at least when it comes to process.)

Next comes additional processing. Pixel by pixel, the shot is run through an additional four steps, all in the background and practically instantaneous, all working to eke out the most detail. Low-frequency areas like the sky and walls in a given shot are worked on the lowest band; hair, skin, fabrics, and other fine elements are run through the highest band. Deep Fusion then selects from each of the exposures provided to it to pick out the best detail, color, luminance, and tone for the final shot.
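Apple's actual Deep Fusion implementation is proprietary, but the broad strokes described above (pick the sharpest short frame, build a synthetic long, then merge the two per pixel) can be sketched in toy form. Everything below is invented for illustration: the sharpness metric, the blend weights, and the function names are assumptions, not Apple's code.

```python
# Toy, hypothetical sketch of the Deep Fusion merge described above.
# Frames are 2D lists of floats in [0, 1]; all weights are invented.

def sharpness(frame):
    """Crude detail metric: variance of pixel values."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def synthetic_long(short_frames, long_exposure, long_weight=0.5):
    """Blend the short frames with the long exposure into a 'synthetic long'."""
    n = len(short_frames)
    out = []
    for i, row in enumerate(long_exposure):
        out_row = []
        for j, long_px in enumerate(row):
            short_avg = sum(f[i][j] for f in short_frames) / n
            out_row.append(long_weight * long_px + (1 - long_weight) * short_avg)
        out.append(out_row)
    return out

def deep_fusion(short_frames, long_exposure):
    """Pick the sharpest short frame, then merge it with the synthetic long.

    The per-pixel merge leans on the sharp frame for detail and the
    synthetic long for luminance; the 0.6/0.4 split is illustrative only.
    """
    best_short = max(short_frames, key=sharpness)
    synth = synthetic_long(short_frames, long_exposure)
    return [
        [0.6 * s + 0.4 * l for s, l in zip(srow, lrow)]
        for srow, lrow in zip(best_short, synth)
    ]
```

The real pipeline also runs the per-band, frequency-aware processing described above, which this sketch skips entirely.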

After all that is done (again, this all happens in the background with a ridiculous amount of speed), the final image is generated.

A shot Apple uses to demonstrate Deep Fusion is showcased at the top of this article. So far, Apple has really enjoyed showing off Deep Fusion with a single individual wearing a sweater.

Smart HDR

Now, as mentioned above, the camera systems in the iPhone 11 and iPhone 11 Pro won’t always use Deep Fusion, and some cameras won’t have access to the feature at all. Here’s how that looks:

The telephoto lens will use Deep Fusion for the most part, but Smart HDR (High Dynamic Range) will kick in when the scene is very bright. The standard wide angle lens will use Deep Fusion for medium- to low-light situations, while Smart HDR will activate in bright to medium-bright scenes. Lastly, the iPhone 11 Pro's ultra wide lens will never use Deep Fusion, because it only supports Smart HDR.

And it’s worth noting that the Night Mode feature Apple introduced with the newest iPhones is what will handle primarily low-light photographs, with Deep Fusion sitting out of that specific scenario.
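The lens-and-light rules above amount to a simple decision table. Here is a hypothetical sketch of that selection logic; the light-level labels and function name are invented for illustration, not anything exposed by iOS.

```python
# Hypothetical sketch of the per-lens mode selection described above.
# Light levels and names are invented labels, not Apple's API.

def pick_mode(lens: str, light: str) -> str:
    """Return the processing mode for a given lens and light level.

    lens:  "ultra_wide", "wide", or "telephoto"
    light: "bright", "medium", "low", or "very_low"
    """
    if lens == "ultra_wide":
        return "smart_hdr"  # the ultra wide lens only supports Smart HDR
    if light == "very_low":
        return "night_mode"  # Night Mode, not Deep Fusion, handles very low light
    if lens == "telephoto":
        # Mostly Deep Fusion; Smart HDR takes over in very bright scenes
        return "smart_hdr" if light == "bright" else "deep_fusion"
    if lens == "wide":
        # Deep Fusion for medium-to-low light, Smart HDR when it's bright
        return "deep_fusion" if light in ("medium", "low") else "smart_hdr"
    raise ValueError(f"unknown lens: {lens}")
```

Note that none of this is user-selectable: the camera app makes the choice silently, which is exactly why the table above is worth spelling out.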

Deep Fusion is possible because of the A13 Bionic processor in the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max. As a result, while this would probably be a killer feature on older iPhones, those aging models won't be able to take advantage of it.

Photographers, and even the people who might not call themselves that but take their photos very seriously anyway, will probably get the most out of Deep Fusion. However, any improvement to photos captured with a phone like the iPhone 11 or iPhone 11 Pro (which are not cheap devices) is a welcome addition.

Are you looking forward to the addition of Deep Fusion? Let us know in the comments!