Don’t expect big camera changes in the iPhone 13, at least when it comes to lenses, one well-respected Apple analyst has warned, with the company tipped to stick with tried-and-true tech for its upcoming smartphones. The current iPhone 12 range offers two or three rear cameras depending on model – plus the LiDAR scanner on the Pro models – with every model getting a wide-angle lens and an ultra-wide lens.
The iPhone 12 Pro and iPhone 12 Pro Max then throw in a telephoto lens. Apple’s flagship also switches up the lens on its telephoto camera, with a longer focal length for greater zoom reach.
With camera technology arguably the biggest single factor in smartphone purchase decisions, exactly what Apple does and doesn’t change with each iPhone iteration is closely watched. For the iPhone 13, however, expected to debut late in the second half of 2021, it sounds like tempered expectations may be appropriate – at least for hardware – even if software leaks have been promising.
In a note to investors, analyst Ming-Chi Kuo suggests the iPhone 13 mini, iPhone 13, and iPhone 13 Pro will all stick with the same wide-angle lens as their iPhone 12 predecessors, MacRumors reports. That means 7P optics with an f/1.6 aperture. Sunny Optical is tipped to be the key supplier for the wide-angle lenses.
The iPhone 13 Pro Max, however, is said to be getting changes from the current model. Instead of the f/1.6 aperture lens on the iPhone 12 Pro Max, the replacement will switch to an f/1.5 aperture lens, Kuo claims. It’s a small increase, but it may still help Apple as it further positions its most expensive iPhone as the most photographer-friendly of the lineup.
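To put that f/1.6 → f/1.5 change in perspective: a lens’s light-gathering area scales with the inverse square of its f-number, so the rumored aperture bump works out to roughly 14 percent more light reaching the sensor. A quick back-of-the-envelope sketch (the function name and figures here are illustrative, not from Kuo’s report):

```python
def light_gain(old_fnumber: float, new_fnumber: float) -> float:
    """Relative light-gathering area when moving between two f-numbers.

    Aperture area is proportional to 1 / f_number**2, so the ratio of
    areas is (old / new) squared. Values above 1.0 mean more light.
    """
    return (old_fnumber / new_fnumber) ** 2

# Rumored iPhone 13 Pro Max change: f/1.6 -> f/1.5
gain = light_gain(1.6, 1.5)
print(f"{(gain - 1) * 100:.0f}% more light")  # prints "14% more light"
```

That 14 percent is modest compared to a sensor-size jump, which is why the sensor rumors below arguably matter more.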
Of course, the lens is only one part of the image equation. Previous analyst reports have suggested that Apple may include larger sensors for at least some of the cameras on the iPhone 13 Pro and iPhone 13 Pro Max. Currently, both use 12-megapixel sensors across all three of their cameras: ultra-wide, wide, and telephoto. Indeed, that sensor parity is partly what Apple says enables smoother transitions between modes, among other things.
As has grown increasingly obvious in recent years, while camera hardware is a considerable part of the overall image experience, photos and videos rely just as much on software and processing. Apple has rolled out features like Deep Fusion and Smart HDR 3 to that end, and computational photography is expected to remain a key area of focus in the iPhone 13 and beyond. Even if Apple keeps the same lenses and some of the same sensors, improvements in post-processing could coax much better stills and footage out of them – and potentially improve existing iPhone models’ performance via iOS updates, too.