Apple testing Deep Fusion camera feature in iPhone 11 and iPhone 11 Pro in iOS 13 developer beta

Apple has been updating iOS 13 at a record pace, partly to add new features, but mostly to patch bugs.

But that doesn’t mean additional features aren’t still on the horizon. Take the Deep Fusion camera feature, which Apple announced at its “by innovation only” event earlier this year alongside the iPhone 11 and iPhone 11 Pro. Apple said from the start that the feature would launch later, in a subsequent software update to iOS 13, and now it is in beta.

The Verge has the report today, outlining how Apple is working on the forthcoming public launch of Deep Fusion and how the feature will make the already impressive cameras in Apple’s newest flagship smartphones even better.

You can see the result of a photo captured with Deep Fusion in the image at the top of this article.

Deep Fusion is an important feature for “medium-to-low light” images. It also runs entirely in the background, optimizing and improving images while the iPhone 11 or iPhone 11 Pro owner is out taking photos.

The report has a nice breakdown of how it should work in the background:

  1. By the time you press the shutter button, the camera has already grabbed three frames at a fast shutter speed to freeze motion in the shot. When you press the shutter, it takes three additional shots, and then one longer exposure to capture detail.
  2. Those three regular shots and long-exposure shot are merged into what Apple calls a “synthetic long” — this is a major difference from Smart HDR.
  3. Deep Fusion picks the short exposure image with the most detail and merges it with the synthetic long exposure — unlike Smart HDR, Deep Fusion only merges these two frames, not more. These two images are also processed for noise differently than Smart HDR, in a way that’s better for Deep Fusion.
  4. The images are run through four detail processing steps, pixel by pixel, each tailored to increasing amounts of detail — the sky and walls are in the lowest band, while skin, hair, fabrics, and so on are the highest level. This generates a series of weightings for how to blend the two images — taking detail from one and tone, color, and luminance from the other.
  5. The final image is generated.
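
To make that sequence easier to follow, here is a minimal, purely illustrative sketch of the pipeline in Swift. None of this is Apple’s actual implementation, which runs privately inside the camera stack and is not exposed as an API; the `Frame` type, `syntheticLong(shots:longExposure:)`, `deepFusion(preShutter:postShutter:longExposure:)`, and `detailBand(pixelIndex:in:)` helpers are all hypothetical stand-ins meant only to show how the frames flow through the merge and weighting steps.

```swift
// Illustrative sketch only. Frame, syntheticLong(shots:longExposure:),
// deepFusion(preShutter:postShutter:longExposure:), and detailBand(pixelIndex:in:)
// are hypothetical stand-ins; Apple's real Deep Fusion runs privately inside
// the camera pipeline.

/// A captured frame, reduced here to an exposure time, a sharpness score,
/// and a flat array of luminance values standing in for real pixel data.
struct Frame {
    let exposure: Double   // seconds
    let detail: Double     // hypothetical sharpness metric, higher is better
    var pixels: [Double]
}

/// Step 2: merge the three post-shutter shots with the long exposure into
/// what Apple calls a "synthetic long" frame (modeled here as a plain average).
func syntheticLong(shots: [Frame], longExposure: Frame) -> Frame {
    let all = shots + [longExposure]
    let count = Double(all.count)
    let merged = (0..<longExposure.pixels.count).map { i in
        all.reduce(0.0) { $0 + $1.pixels[i] } / count
    }
    return Frame(exposure: longExposure.exposure, detail: longExposure.detail, pixels: merged)
}

/// Placeholder for the per-pixel detail classification in step 4. A real
/// pipeline would use local contrast or a learned model; this just buckets
/// neighboring-pixel variation into four bands (0 = flattest, 3 = most detailed).
func detailBand(pixelIndex i: Int, in frame: Frame) -> Int {
    guard i + 1 < frame.pixels.count else { return 0 }
    let delta = abs(frame.pixels[i + 1] - frame.pixels[i])
    switch delta {
    case ..<0.05: return 0
    case ..<0.15: return 1
    case ..<0.30: return 2
    default:      return 3
    }
}

/// Steps 3-5: pick the sharpest short exposure, blend it per pixel with the
/// synthetic long frame, and return the final image.
func deepFusion(preShutter: [Frame], postShutter: [Frame], longExposure: Frame) -> Frame {
    // Step 3: the short exposure with the most detail is the detail source.
    let shortFrames = preShutter + postShutter
    guard let sharpest = shortFrames.max(by: { $0.detail < $1.detail }) else { return longExposure }
    let synthetic = syntheticLong(shots: postShutter, longExposure: longExposure)

    // Step 4: weight each pixel by its detail band. Detailed pixels (skin, hair,
    // fabric) lean on the sharp short frame; flat pixels (sky, walls) lean on the
    // synthetic long frame for tone, color, and luminance.
    var blended = [Double]()
    for i in 0..<min(sharpest.pixels.count, synthetic.pixels.count) {
        let weight = Double(detailBand(pixelIndex: i, in: sharpest)) / 3.0
        blended.append(weight * sharpest.pixels[i] + (1 - weight) * synthetic.pixels[i])
    }

    // Step 5: the final image.
    return Frame(exposure: sharpest.exposure, detail: sharpest.detail, pixels: blended)
}
```

The weighting step is the heart of it: detail comes from the sharp short exposure, while tone, color, and luminance come from the synthetic long frame.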

According to the report, the telephoto lens will mostly use Deep Fusion, with Smart HDR only taking over in very bright scenes. The standard wide angle lens, meanwhile, will rely mostly on Smart HDR for bright to medium-bright scenes, with Deep Fusion kicking in for medium-to-low light scenes. The ultra wide lens will never use Deep Fusion or Night Mode, because it only supports the Smart HDR feature.

Night Mode will always kick in as the primary feature in low-light scenarios.
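
Put together, those lens and lighting rules read like a simple decision table. The sketch below expresses them in Swift purely for illustration; the `Lens`, `Light`, and `ProcessingMode` enums and the `chooseMode(for:in:)` function are hypothetical, since Apple exposes no API for picking between Smart HDR, Deep Fusion, and Night Mode, and the three-way light split is a rough bucketing of the report’s bright, medium, and low descriptions.

```swift
// Rough decision-table sketch of the rules described above. The Lens, Light,
// and ProcessingMode enums and chooseMode(for:in:) are hypothetical: the camera
// switches between these modes automatically and invisibly.
enum Lens { case telephoto, wide, ultraWide }
enum Light { case bright, medium, low }
enum ProcessingMode { case smartHDR, deepFusion, nightMode }

func chooseMode(for lens: Lens, in light: Light) -> ProcessingMode {
    switch (lens, light) {
    // The ultra wide lens only supports Smart HDR, never Deep Fusion or Night Mode.
    case (.ultraWide, _):
        return .smartHDR
    // On the other lenses, Night Mode is the primary feature in low light.
    case (_, .low):
        return .nightMode
    // The telephoto lens mostly uses Deep Fusion; Smart HDR takes over only
    // in very bright scenes.
    case (.telephoto, .bright):
        return .smartHDR
    case (.telephoto, .medium):
        return .deepFusion
    // The wide lens relies on Smart HDR in bright scenes and hands off to
    // Deep Fusion as the light drops toward medium-to-low.
    case (.wide, .bright):
        return .smartHDR
    case (.wide, .medium):
        return .deepFusion
    }
}
```

Under this rough bucketing, a medium-light shot on the wide lens would land on Deep Fusion, which matches the “medium-to-low light” positioning the report describes.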

As noted in the report, Apple is currently testing the feature in a beta of iOS 13. However, that beta is not yet available to developers or public beta testers. Apple will likely seed that first beta with Deep Fusion sometime soon, possibly even later today as iOS 13.2.

Are you looking forward to the arrival of Deep Fusion?