What is Deep Fusion? Apple’s image processing tech explained

Image: iPhone 13 Pro Max camera

Deep Fusion is an image processing system that Apple first introduced with the iPhone 11 series. Here’s how it works.

When you’re trying out the camera on your new iPhone, you might notice a few different modes or filters that you can play around with to change how your photos look. However, some of the most important features work in the background without a visible presence in the app — and one of these is called Deep Fusion.

This feature was announced with the launch of the iPhone 11 series, back in 2019, and this is how then-Senior Vice President of Worldwide Product Marketing Phil Schiller described the technology when it was first unveiled:

It shoots nine images. Before you press the shutter button it’s already shot four short images, four secondary images; when you press the shutter button it takes one long exposure and then in just one second, the Neural Engine analyses the fused combination of long and short images, picking the best among them, selecting all the pixels, and pixel-by-pixel, going through 24 million pixels to optimise for detail and low noise… This is the first time a neural engine is responsible for generating the output image. It is computational photography mad science.

You can watch this explanation, complete with illustrations, in the video of Apple’s September 2019 event below (starting from 1:21:54):

More concisely, Apple explained it in its official press release as follows: “Deep Fusion uses advanced machine learning to do pixel-by-pixel processing of photos, optimizing for texture, details and noise in every part of the photo.”
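To make that pixel-by-pixel idea concrete, here is a deliberately simplified sketch, in Swift, of the flow Schiller describes: a rolling buffer of short exposures filled before the shutter press, one long exposure at press time, and a per-pixel selection pass over the result. Everything here, from the ToyDeepFusionCamera type down to the closest-to-the-mean heuristic, is invented for illustration; Apple’s actual fusion is a neural network running on the Neural Engine, and it exposes no public API.

```swift
// Toy model of the capture flow described above. All names and the
// fusion heuristic are invented for illustration; Apple's real pipeline
// is a learned model on the Neural Engine and is not public.

struct Frame {
    let pixels: [Float]   // grayscale intensities in 0...1, one per pixel
}

final class ToyDeepFusionCamera {
    private var preShutterFrames: [Frame] = []
    private let bufferSize = 8   // "four short images, four secondary images"

    /// Called for every preview frame while the camera is open,
    /// so the buffer is already full by the time the shutter is pressed.
    func onPreviewFrame(_ frame: Frame) {
        preShutterFrames.append(frame)
        if preShutterFrames.count > bufferSize {
            preShutterFrames.removeFirst()
        }
    }

    /// Called at shutter press: add the single long exposure, then walk
    /// the image pixel by pixel, choosing one value per pixel.
    func capture(longExposure: Frame) -> [Float] {
        let frames = preShutterFrames + [longExposure]
        var output = [Float](repeating: 0, count: longExposure.pixels.count)

        for i in output.indices {
            // Stand-in heuristic: keep the candidate closest to the mean,
            // which suppresses noisy outliers. The real system instead
            // weighs local texture and detail learned by a neural network.
            let candidates = frames.map { $0.pixels[i] }
            let mean = candidates.reduce(0, +) / Float(candidates.count)
            output[i] = candidates.min { abs($0 - mean) < abs($1 - mean) }!
        }
        return output
    }
}

// Usage: feed preview frames, then capture one long exposure.
let camera = ToyDeepFusionCamera()
for _ in 0..<8 {
    camera.onPreviewFrame(Frame(pixels: [0.50, 0.52, 0.48, 0.51]))
}
let fused = camera.capture(longExposure: Frame(pixels: [0.5, 0.5, 0.5, 0.5]))
print(fused)   // one fused value per pixel
```

The sketch is only meant to show the shape of the pipeline: the buffer is already full when you press the shutter, so all nine frames are in hand the moment the long exposure lands, and the selection work happens once per pixel after that.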

Image Credit: Apple (iPhone 11)

Thanks to this tech, you should find that detail is reproduced to an impressive degree, even in more challenging conditions. Apple’s examples, including the image above, feature difficult, multi-textured subjects such as woolly jumpers and skin under less-than-ideal lighting, all rendered faithfully by a process that takes just a second.

The Deep Fusion feature is most useful (and is intended to kick in automatically) in indoor and medium-light settings. As previously mentioned, it is not visible in the phone’s interface: you will not find a “Deep Fusion” button in the camera app, and there is no switch to turn it on or off. The Verge cites an Apple spokesperson as saying that this “is very much intentional, as it doesn’t want people to think about how to get the best photo. The idea is that the camera will just sort it out for you.”

Which Apple devices have Deep Fusion?

The following smartphones are all capable of using this technology:

  • iPhone 11
  • iPhone 11 Pro
  • iPhone 11 Pro Max
  • iPhone 12
  • iPhone 12 mini
  • iPhone 12 Pro
  • iPhone 12 Pro Max
  • iPhone 13
  • iPhone 13 mini
  • iPhone 13 Pro
  • iPhone 13 Pro Max
  • iPhone SE 3
