iPhone 7 camera: What the rumoured ‘dual-lens’ system actually means for your photos
The camera is likely to be the headline feature of the new handset – so long as people can be distracted from getting angry about the missing headphone jack

Apple’s new iPhone’s headline feature – so long as the company has anything to do with it – is likely to be its camera. But that might be a little confusing.
Apple has been working hard to improve its photography tech in recent years, and the iPhone has gradually become far and away the most popular camera on photo-sharing sites. But the latest update – which might only come to the Plus phone, at least initially – might be the biggest step yet.
The technology is exactly as it sounds: instead of letting light in through a single lens, the camera has two – either identical or using different technology. It can then combine the two images, or use the best of both.
Dual-lens camera setups can be used for a variety of functions. Those include capturing extra detail in tricky situations such as low light, producing the kind of depth of field achieved with an SLR, and enabling 3D imaging or augmented reality.
It’s likely that Apple will focus at least on the second of those: using the two lenses to give photos SLR-like depth of field. It can do that by pairing one lens with a wide aperture, which produces a very shallow depth of field, with another with a much narrower one that keeps more of the scene in focus.
The blur produced by wide-aperture lenses is referred to as “bokeh”, and a bokeh image featured on the invitation to the Apple event. So it appears that the effect will also be a central part of the presentation.
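In software, this kind of bokeh effect is often simulated by blending a sharp image with a blurred copy, keeping the sharp pixels only where the depth map says the subject sits near the focal plane. The sketch below is purely illustrative – the function name, inputs and thresholding are assumptions, not Apple’s actual pipeline:

```python
import numpy as np

def simulate_bokeh(sharp, blurred, depth, focus_depth, tolerance):
    """Blend a sharp and a pre-blurred image per pixel.

    Pixels whose depth is within `tolerance` of `focus_depth` stay
    sharp; everything else takes the blurred pixel, mimicking the
    shallow depth of field of a wide-aperture lens.
    All names here are hypothetical, for illustration only.
    """
    in_focus = np.abs(depth - focus_depth) <= tolerance
    # Add a trailing axis so the mask broadcasts over colour channels.
    return np.where(in_focus[..., None], sharp, blurred)

# Toy 2x2 "image" with a single colour channel.
sharp = np.array([[[10.0], [20.0]], [[30.0], [40.0]]])
blurred = np.array([[[15.0], [15.0]], [[35.0], [35.0]]])
depth = np.array([[1.0, 5.0], [1.0, 5.0]])  # metres, per pixel

out = simulate_bokeh(sharp, blurred, depth, focus_depth=1.0, tolerance=0.5)
# Pixels at depth 1.0 keep their sharp values; those at 5.0 are blurred.
```

Real systems use a gradual blur that grows with distance from the focal plane rather than a hard cut-off, but the principle is the same.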
But Apple could use other examples, too. Dual-lens cameras can include one lens meant for picking up light in dark rooms, for instance.
And more dramatically, they can be used to take two images that can then be stitched together using software.
That can then allow people to change the focus of a picture after it has been taken, for instance, getting rid of the accidental blur that results when a shot is wrongly focused.
Such images can even be used for 3D mapping, since by using two lenses the phone can calculate the difference between objects, as our eyes do. It’s not likely that Apple will do that initially, but it could be on its way with future phones – especially given Apple’s interest in virtual and augmented reality.
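That depth calculation rests on the standard pinhole stereo relation: a feature that appears shifted between the two images (the “disparity”) is closer the larger the shift. A minimal sketch, with illustrative numbers rather than the iPhone’s real optics:

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Classic stereo relation: Z = f * B / d.

    focal_length_px: focal length expressed in pixels
    baseline_m: distance between the two lenses, in metres
    disparity_px: horizontal shift of a feature between the two images
    The numbers used below are made up for illustration.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A feature shifted 20 px between lenses about 1 cm apart,
# seen through a 1000 px focal length, sits roughly half a metre away.
distance = depth_from_disparity(1000.0, 0.01, 20.0)
print(distance)  # 0.5
```

Repeating this per pixel yields a depth map of the whole scene, which is exactly what 3D mapping and augmented reality applications need.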