It may be an expected update, but the next iPhone will take the smartphone camera beyond Apple's previous efforts with its revolutionary new hardware and software. Now we have more details about what Tim Cook and his team have planned.
Apple is expected to use Sony's latest camera sensor technology. Its stacked design splits the photodiodes and pixel transistors into two distinct layers, allowing each pixel to capture more light. Sony claims the new design can capture up to twice as much light as its current sensors.
Until now, Apple has reserved its latest camera technology for its largest models. If that pattern holds, this new sensor and its increased low-light capability may be limited to the iPhone 16 Pro Max, which at least allows Apple to make a case for the more expensive iPhone beyond the fact that "the screen is a little bigger."
Apple will also increase the optical zoom available on the higher-tier iPhone 16 Pro and iPhone 16 Pro Max. This will be facilitated by a tetraprism lens, which, much like a periscope, uses prisms to fold the path of light, creating a focal length longer than the thin body of a smartphone would otherwise allow.
The iPhone 15 Pro Max offers 5x optical zoom using this system; the expectation is that both the iPhone 16 Pro and 16 Pro Max will get the 5x optical zoom, with 25x digital zoom available as an option when taking photos.
Update: Saturday, April 20: Apple is pushing the limits with Sony's new stacked sensor, but the quest to gather more image data through higher-megapixel, more capable sensors continues.
Adam Juniper reports on Apple's plans for the iPhone 17 camera, with a headline spec of 144 megapixels in total: three lenses, each backed by a 48-megapixel sensor. The iPhone 16 family should upgrade the main camera across the range to 48 megapixels, while the iPhone 17 Pro will build on that by bringing the 48-megapixel tech to the telephoto and ultra-wide cameras as well.
This doesn't necessarily mean that Apple will default to a 48-megapixel image. The additional information makes it possible to use advanced techniques already in use, such as pixel binning (combining a group of adjacent pixels from the sensor into a single, more light-sensitive pixel), as well as upcoming artificial intelligence-based techniques that Apple will no doubt debut at its Worldwide Developers Conference in June 2024.
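The pixel-binning idea described above is simple to illustrate: sum (or average) each 2x2 block of raw sensor values into one output pixel, trading resolution for light sensitivity. This is only a minimal sketch of the general technique using NumPy; the function name and the toy array are illustrative, not Apple's actual pipeline.

```python
import numpy as np

def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    """Combine each 2x2 block of raw sensor values into a single pixel.

    A 48 MP readout binned this way yields a 12 MP image in which each
    output pixel has gathered roughly four times the light.
    """
    h, w = sensor.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Reshape into (h/2, 2, w/2, 2) blocks, then sum each 2x2 block.
    return sensor.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Toy 4x4 "sensor" becomes a 2x2 binned image.
raw = np.arange(16).reshape(4, 4)
print(bin_2x2(raw))
# [[10 18]
#  [42 50]]
```

In practice the binning happens on the sensor itself before readout, and real pipelines bin within the Bayer color pattern rather than across it, but the resolution-for-sensitivity trade is the same.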
Many competing Android smartphones already feature cameras of 100 megapixels or more. Apple may not match that this year, but the groundwork gives it the option for future iPhones.
In terms of software, Apple will finally join the AI revolution with the iPhone 16 family, offering a series of AI-integrated features to improve the way tens of millions of consumers capture, process, and edit images. While AI routines have been part of Apple's camera suite in previous years, the recent push by Android makers (particularly Samsung and Google) to brand their handsets as AI-powered phones has left the iPhone looking behind.
The iPhone 16 family will be Tim Cook's first opportunity to sell hardware with a particular focus on AI, and you can expect the visual difference it makes in photos to be both an easy thrill to sell and a powerful on-stage demonstration when the next iPhone launches in September.
Before that, we’ll get our first look at Apple’s AI efforts and hint at what’s to come at the Worldwide Developers Conference in June.
Now read about Apple's research work detailing how AI will be able to read your screen and help you navigate your iPhone…
Keynote USA News