(Reuters) – When Apple Inc. ( AAPL.O ) unveiled its triple-camera smartphone this week, marketing chief Phil Schiller described the device's ability to create the perfect image by stitching together eight separate exposures captured before the main shot, calling the feat "computational photography mad science."
FILE PHOTO: CEO Tim Cook unveils the new iPhone 11 Pro at Apple's event at its headquarters in Cupertino, California, September 10, 2019. REUTERS/Stephen Lam
"When you press the shutter button, it takes one long exposure, and then, in just one second, the neural engine analyzes the fused combination of long and short images, picking the best among them, selecting all the pixels, and, pixel by pixel, going through 24 million pixels to optimize for detail and low noise," Schiller said, describing a feature called "Deep Fusion" that will ship later this fall.
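Apple has not published how Deep Fusion works, but the per-pixel selection Schiller describes can be loosely illustrated with a toy sketch: given a stack of bracketed exposures, choose each output pixel from the frame whose neighborhood shows the most detail. Everything below (the horizontal-gradient "detail" score, the `fuse_exposures` function) is an illustrative assumption, not Apple's algorithm:

```python
def fuse_exposures(frames):
    """Toy pixel-wise fusion: for each pixel, take the value from the
    frame with the highest local "detail" score, here crudely
    approximated by the absolute difference from the left neighbor.
    Illustration only -- Deep Fusion itself is proprietary and vastly
    more sophisticated.

    frames: list of equally sized 2-D lists (rows of pixel values).
    """
    h, w = len(frames[0]), len(frames[0][0])
    fused = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # crude sharpness proxy: horizontal gradient at this pixel
            def detail(f):
                return abs(f[y][x] - f[y][x - 1]) if x > 0 else 0
            best = max(frames, key=detail)  # sharpest frame wins the tie-free case
            fused[y][x] = best[y][x]
    return fused

# Two tiny 2x3 "exposures": `sharp` has an edge, `flat` does not
flat = [[5, 5, 5], [5, 5, 5]]
sharp = [[0, 9, 0], [0, 9, 0]]
print(fuse_exposures([flat, sharp]))  # → [[5, 9, 0], [5, 9, 0]]
```

In the first column both frames score zero, so the fused image keeps the first frame's pixel; where `sharp` has an edge, its pixels win. A real pipeline would also align the frames and weight for noise, as Schiller's description implies.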
It was the kind of technical detail that in years past might have been reserved for an account of design chief Jony Ive's precision aluminum-milling process that produced the iPhone's clean lines. But this time Schiller, the company's most enthusiastic photographer, saved his highest praise for custom silicon and artificial intelligence software.
The battleground in smartphone camera technology has moved inside the phone, where sophisticated artificial intelligence software and special chips play a major role in how a phone's pictures look.
"Cameras and displays sell phones," said Julie Ask, vice president and principal analyst at Forrester.
Apple added a third lens to its iPhone 11 Pro model, matching the triple-camera setups that competitors such as Samsung Electronics Co Ltd ( 005930.KS ) and Huawei Technologies Co Ltd [HWT.UL] already offer on their flagship models.
But Apple also played catch-up on some features, such as "night mode," a setting designed to make low-light photos look better. Apple will add the mode to its new phones when they ship on September 20, but phones from Huawei and Alphabet Inc's ( GOOGL.O ) Google Pixel line have had similar features since last year.
In making photos look better, Apple is trying to gain an edge with the custom chip that powers its phones. During the iPhone 11 Pro launch, executives spent more time talking about its processor – called the A13 Bionic – than about the specifications of the newly added lens.
A dedicated portion of that chip, called the "neural engine," is reserved for artificial intelligence tasks and is intended to help the iPhone take better, sharper shots in challenging lighting conditions.
Samsung and Huawei also design custom chips for their phones, and even Google has custom "Visual Core" silicon to help its Pixel phones with photography tasks.
Ryan Reith, vice president of IDC's mobile device tracking program, said this has created an expensive game in which only handset makers with enough resources to develop custom chips and software can afford to invest in the camera systems that set their devices apart.
Even very inexpensive phones now have two or three cameras on the back, he said, but the chips and software play a huge role in whether the resulting images look stunning or merely so-so.
"Owning the stack of smartphone chips and software today is more important than ever, because the outside of the phone has become a commodity," Reith said.
The custom chips and software that power the new camera systems take years to develop. But in Apple's case, that research and development may later prove useful in products such as augmented reality glasses, which many industry watchers believe Apple is developing.
"This all plays into the bigger story down the road, which is augmented reality, starting with the phones and eventually with other products," Reith said.
Reporting by Stephen Nellis in San Francisco; Editing by Lisa Shumaker