Take, for example, the new "Deep Fusion" computational photography feature, which Phil Schiller described as "very cool." It's an image processing system that taps the A13 Bionic's Neural Engine and uses machine learning. According to Apple, the system "processes images pixel by pixel, optimizing for texture, detail and noise in every part of the image." Deep Fusion won't arrive until later this fall, so we don't yet know how effective it will be. Apple did show sample photos from its Night Mode tool, which promises to improve low-light photography, and the results looked impressive.
This latter feature is the clearest example of Apple playing catch-up with its competitors. If you recall, Google's Night Sight launched last November and allowed relatively clear shots in near-total darkness. And Google wasn't even the first to try it; it was simply the most effective. Huawei, LG and Samsung have all offered their own night modes on previous flagship phones, with varying degrees of success. Apple's Night Mode promises to do much the same thing, though how well it works remains to be seen.
It wasn't always this way. In the midst of the megapixel race, when smartphone makers focused on cramming ever-sharper sensors into their phones, Apple did something genuinely different and thoughtful: it stopped at 12 megapixels.
When it adopted dual cameras with the iPhone 7 Plus, Apple also chose a more compelling setup than the competition had at the time: it went with a telephoto lens as the secondary camera, rather than a monochrome sensor for extra detail like the Huawei P9 or the wide-angle option of the LG G5. Apple's approach soon became the most popular pairing in the industry. Today, however, Apple is seen as following trends set by Samsung, Huawei and even LG, rather than setting them.
On the hardware front, Apple was late to the ultra-wide trend this year, adding cameras with a 120-degree field of view to its three new iPhones. LG was among the earliest to try the concept when it put a super-wide lens on the G5 in 2016.
At first it seemed gimmicky, but as people (myself included) began to see the versatility it brought to smartphone photography, LG's rivals followed suit. The Galaxy S10, S10+ and Note 10, as well as the Huawei P30 Pro, now offer ultra-wide options too. Apple is just the latest to get on board. (It's worth noting that for all the praise heaped on Google for its image processing smarts, Pixel phones don't yet have ultra-wide-angle lenses.)
Now, Apple's updated camera interface, which lets you see the ultra-wide view while framing a shot with the main camera, is unique. But in simply adding a third, ultra-wide sensor, Apple isn't doing anything new.
It's not only in smartphones that Apple borrows ideas, either. With the Apple Watch Series 5, the company introduced a new Always-On Display, meaning the wearable will show the time, well, all the time. Yes, nearly every other smartwatch with a color touchscreen has had this for a while. Apple's new health-tracking features likewise follow in the footsteps of Fitbit and Garmin. To be fair, Samsung and Google haven't yet integrated all of this, so Apple isn't the slowest in the race, but it certainly isn't breaking new ground.
Innovation comes with a measure of risk, and it's understandable that Apple wants to play it safe. The company's wait-and-see attitude is nothing new; many have called out how far behind its competitors it sits. And honestly, it's been a while since Apple surprised the industry with a fresh idea that made us all go, "Huh, why didn't anyone think of this before?" Sometimes you almost forget that the iPhone was once the leader of the pack, not just a member of it.