
The iPhone 11’s Deep Fusion camera arrives with iOS 13.2 developer beta

Apple delayed the release by a day, but it’s out now

Photo by Amelia Holowaty Krales / The Verge

Apple’s Deep Fusion photography system has arrived as part of the company’s newest developer beta of iOS 13, version 13.2 beta 1, hopefully hinting that it will ship for the iPhone 11 and 11 Pro sometime soon.

Update October 2nd, 1:32PM ET: This article originally said the next iOS 13 developer beta would be released on October 1st, before Apple clarified an uncertain ship date for that beta. The software has since been released as part of today’s iOS 13.2 beta 1. Deep Fusion is now available for developers to try, while a public beta is expected soon.

To refresh your memory, Deep Fusion is a new image processing pipeline for medium-light images, which Apple senior VP Phil Schiller called “computational photography mad science” when he introduced it onstage. But like much of iOS 13, Deep Fusion wasn’t ready when the phones arrived two weeks ago. And although the iPhone 11 and 11 Pro have extremely impressive cameras, Deep Fusion is meant to offer a massive step forward in indoor and medium-lighting situations. And since so many photos are taken indoors and in medium light, we’re looking forward to testing it. Here’s a sample shot shared by Apple:

A Deep Fusion photo of a woman in a sweater
Apple loves showing off Deep Fusion with photos of people in sweaters.

With Deep Fusion, the iPhone 11 and 11 Pro cameras will have three modes of operation that automatically kick in based on light levels and the lens you’re using:

  • The standard wide angle lens will use Apple’s enhanced Smart HDR for bright to medium-light scenes, with Deep Fusion kicking in for medium to low light, and Night mode coming on for dark scenes.
  • The tele lens will mostly use Deep Fusion, with Smart HDR only taking over for very bright scenes. (Night mode always uses the standard wide angle lens, even when the camera app shows “2x”.)
  • The ultrawide will always use Smart HDR, as it does not support either Deep Fusion or Night mode.
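
To make that lens-and-light logic concrete, here’s a minimal Swift sketch of how the decision might be modeled. The types and the pipeline(for:in:) function are hypothetical, not Apple’s API, and the light-level cutoffs are approximations of the behavior described above (Apple’s own description overlaps at “medium” light).

```swift
// Hypothetical sketch, not Apple's API: how the camera might pick a pipeline
// based on the active lens and the scene's light level, per the list above.

enum Lens { case wide, telephoto, ultrawide }
enum LightLevel { case bright, medium, low, dark }
enum Pipeline { case smartHDR, deepFusion, nightMode }

func pipeline(for lens: Lens, in light: LightLevel) -> Pipeline {
    switch lens {
    case .ultrawide:
        // The ultrawide supports neither Deep Fusion nor Night mode.
        return .smartHDR
    case .telephoto:
        // The tele lens mostly uses Deep Fusion; Smart HDR only for very bright scenes.
        return light == .bright ? .smartHDR : .deepFusion
    case .wide:
        switch light {
        case .bright:       return .smartHDR   // bright scenes stay on Smart HDR
        case .medium, .low: return .deepFusion // indoor and medium-light scenes
        case .dark:         return .nightMode  // dark scenes get Night mode
        }
    }
}

// Example: an indoor shot on the standard wide lens lands in Deep Fusion.
print(pipeline(for: .wide, in: .medium)) // deepFusion
```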

Unlike Night mode, which has an indicator on-screen and can be turned off, Deep Fusion is totally invisible to the user. There’s no indicator in the camera app or in the photo roll, and it doesn’t show up in the EXIF data. Apple tells me that is very much intentional, as it doesn’t want people to think about how to get the best photo. The idea is that the camera will just sort it out for you.

But in the background, Deep Fusion is doing quite a lot of work and operating much differently than Smart HDR. Here’s the basic breakdown:

  1. By the time you press the shutter button, the camera has already grabbed four frames at a fast shutter speed to freeze motion in the shot and four standard frames. When you press the shutter it grabs one longer-exposure shot to capture detail.
  2. Those three regular shots and the long-exposure shot are merged into what Apple calls a “synthetic long.” This is a major difference from Smart HDR.
  3. Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long exposure. Unlike Smart HDR, Deep Fusion merges just these two frames, although the synthetic long is already made of four previously merged frames. All the component frames are also processed for noise differently than in Smart HDR, in a way that’s better suited to Deep Fusion.
  4. The images are run through four detail processing steps, pixel by pixel, each tailored to increasing amounts of detail: the sky and walls fall in the lowest band, while skin, hair, fabrics, and so on fall in the highest. This generates a series of weightings for how to blend the two images, taking detail from one and tone, color, and luminance from the other.
  5. The final image is generated.
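
As a rough illustration only, and very much not Apple’s implementation, here’s what those steps might look like in heavily simplified Swift. The Frame type, the per-pixel luminance arrays, and the detailWeight input are stand-ins invented for this sketch; the real pipeline works on full sensor data with far more sophisticated processing.

```swift
// Toy sketch of the steps above; assumed structure, not Apple's implementation.
struct Frame {
    var pixels: [Double]   // simplified: one luminance value per pixel
    var sharpness: Double  // stand-in detail score for the whole frame
}

// Step 2: merge the standard frames with the long exposure into a "synthetic long"
// (a plain average here; the real merge is far more sophisticated).
func syntheticLong(standard: [Frame], longExposure: Frame) -> Frame {
    let all = standard + [longExposure]
    let merged = (0..<longExposure.pixels.count).map { i in
        all.reduce(0.0) { $0 + $1.pixels[i] } / Double(all.count)
    }
    return Frame(pixels: merged, sharpness: longExposure.sharpness)
}

// Step 3: pick the short-exposure frame with the most detail.
func sharpestShort(_ shorts: [Frame]) -> Frame {
    // Assumes at least one short frame was captured.
    return shorts.max(by: { $0.sharpness < $1.sharpness })!
}

// Step 4: blend pixel by pixel. detailWeight[i] stands in for the per-pixel detail
// classification (sky and walls low, skin/hair/fabric high): high-detail pixels lean
// on the sharp short frame, low-detail pixels on the synthetic long's tone and luminance.
func fuse(short: Frame, long: Frame, detailWeight: [Double]) -> Frame {
    let fused = (0..<detailWeight.count).map { i in
        detailWeight[i] * short.pixels[i] + (1 - detailWeight[i]) * long.pixels[i]
    }
    return Frame(pixels: fused, sharpness: short.sharpness)
}
```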

That all takes a tick longer than a normal Smart HDR image, somewhere around a second in total. So if you take a bunch of shots and jump immediately into the camera roll, you’ll first see a proxy image while Deep Fusion runs in the background; it’ll then pop to the final version with more detail, a process Apple says shouldn’t take more than a quarter to half a second by the time you switch to the camera roll.

But all of this means that Deep Fusion won’t work in burst mode. You’ll notice burst mode itself has been deemphasized throughout the camera app in iOS 13, since all of these new modes require the camera to take multiple exposures and merge them; Apple’s new hold-to-take video mode is a little more useful anyway.

Here’s another Deep Fusion image of a beautiful person in a sweater from Apple. It’s certainly impressive. But we’ll have to see how Deep Fusion works in practice as people get their hands on it with the developer beta. If it’s as impressive as Apple claims, the iPhone 11 camera will leap even further ahead of the current competition and set a high bar for Google’s upcoming Pixel 4 to clear.

A Deep Fusion shot of another person in a sweater
Deep Fusion: Sweater Mode