
Here’s why Apple believes it’s an AI leader – and why it says critics have it all wrong



Machine learning (ML) and artificial intelligence (AI) now permeate nearly every feature of the iPhone, but Apple hasn’t been touting these technologies like some of its competitors have. I wanted to understand more about Apple’s approach, so I spent an hour talking with two Apple executives about the company’s strategy – and the privacy implications of all the new features based on AI and ML.

Historically, Apple has not had a public reputation for leading in this area. That’s partly because people associate AI with digital assistants, and reviewers frequently call Siri less useful than Google Assistant or Amazon Alexa. And with ML, many tech enthusiasts say that more data means better models, but Apple is not known for data collection the way, say, Google is.

Despite this, Apple has included dedicated hardware for machine learning tasks in most of the devices it ships. Machine intelligence-driven functionality increasingly dominates the keynotes where Apple executives take the stage to introduce new features for the iPhone, iPad, or Apple Watch. And the introduction of Apple Silicon Macs later this year will bring many of the same machine intelligence developments to the company’s laptops and desktops.

In the wake of the Apple Silicon announcement, I spoke at length with John Giannandrea, Apple’s Senior Vice President for Machine Learning and AI Strategy, and Bob Borchers, VP of Product Marketing. They described Apple’s AI philosophy, explained how machine learning drives certain features, and argued passionately for Apple’s on-device AI/ML strategy.


What is Apple’s AI strategy?

Both Giannandrea and Borchers joined Apple within the last few years; each previously worked at Google. Borchers is actually returning for a second stint at Apple; he was senior director of marketing for the iPhone until 2009. And Giannandrea’s 2018 move from Google to Apple was widely reported; he had been Google’s head of AI and search.

Google and Apple are quite different companies. Google has a reputation for participating in, and in some cases leading, the AI research community, whereas Apple has historically done most of its work behind closed doors. That has changed in recent years, as machine learning now powers numerous features in Apple’s devices and the company has increased its engagement with the AI community.

“When I joined Apple, I was already an iPad user, and I loved the Pencil,” Giannandrea (who goes by “J.G.” with colleagues) told me. “So, I would track down the software teams and I would say, ‘Okay, where’s the machine learning team that’s working on handwriting?’ And I couldn’t find it.” It turned out the team he was looking for didn’t exist – a surprise, he said, given that machine learning is one of the best tools available for the feature today.

“There was so much machine learning that Apple should be doing that it was surprising that not everything was actually being done. And that has changed dramatically in the last two to three years,” he said. “I really honestly think there’s not a corner of iOS or an Apple experience that won’t be transformed by machine learning over the coming few years.”

I asked Giannandrea why he felt Apple was the right place for him. His answer doubled as a concise summary of the company’s AI strategy:

I think that Apple has always stood for that intersection of creativity and technology. And I think that when you’re thinking about building smart experiences, having vertical integration, all the way down from the applications, to the frameworks, to the silicon, is really essential … I think it’s a journey, and I think that this is the future of the computing devices that we have: that they be smart, and that that smartness sort of disappears.

Borchers chimed in as well, adding, “This is clearly our approach to everything we do, which is, ‘Let’s focus on what the benefit is, not how you got there.’ And in the best cases, it becomes automatic. It disappears … and you just focus on what happened, as opposed to how it happened.”

Returning to the handwriting example, Giannandrea argued that Apple is in the best position to “lead the industry” in building features and products powered by machine intelligence:

We made the Pencil, we made the iPad, we made the software for both. It’s just a unique opportunity to do a really, really good job. What are we doing a really good job of? Letting somebody take notes and be productive with their creative thoughts on digital paper. What interests me is to see these experiences be used at scale in the world.

He contrasted this with Google. “Google is an amazing company, and it has some really great technologists,” he said. “But their business model is fundamentally different, and they’re not known for shipping consumer experiences used by hundreds of millions of people.”

How does Apple use machine learning today?

In recent marketing presentations, Apple has tended to credit machine learning with improving some features in the iPhone, Apple Watch, or iPad, but it rarely goes into much detail – and most people who buy iPhones never watch those presentations anyway. Contrast this with Google, for example, which places AI at the center of much of its consumer messaging.

There are numerous examples of machine learning at work in Apple’s software and devices, most of them new within just the past couple of years.

Machine learning is used to help the iPad’s software distinguish between a user accidentally pressing their palm against the screen while drawing with the Apple Pencil and an intentional press meant to provide input. It’s used to monitor users’ usage habits to optimize device battery life and charging, both to improve the time users can spend between charges and to protect the battery’s long-term health. It’s used to make app recommendations.

Then there’s Siri, perhaps the one thing any iPhone user would immediately recognize as artificial intelligence. Machine learning drives several aspects of Siri, from speech recognition to attempts by Siri to offer useful answers.

iPhone owners might also notice that machine learning is behind the Photos app’s ability to automatically sort pictures into pre-made galleries, or to accurately surface photos of a friend named Jane when her name is entered into the app’s search field.
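
Apple’s Photos pipeline itself is private, but the public Vision framework exposes the same kind of on-device face analysis to any developer. Here is a minimal sketch, assuming a recent iOS or macOS SDK:

```swift
import Vision

// A minimal sketch of on-device face detection with Apple's public Vision
// framework. Photos' actual pipeline is private; this only illustrates the
// general approach of analyzing an image locally, with no server round trip.
func detectFaces(in image: CGImage) throws -> [VNFaceObservation] {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results ?? []  // normalized bounding boxes, one per face
}
```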

In other cases, few users may realize that machine learning is at work at all. For example, your iPhone may take multiple pictures in rapid succession each time you tap the shutter button. An ML-trained algorithm then analyzes each image and can composite what it deems the best parts of each image into one result.
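
Apple hasn’t published how that selection works, but the general idea can be sketched as scoring each frame with a learned quality model and keeping the winner. In this hypothetical sketch, `qualityScore` stands in for whatever a trained model would output:

```swift
import CoreGraphics

// Hypothetical sketch of burst-frame selection. Apple's real pipeline is
// private and composites regions from multiple frames; this simplified
// version just ranks whole frames by a model-assigned quality score.
struct BurstFrame {
    let image: CGImage
    let qualityScore: Double  // assumed output of an ML quality model
}

func bestFrame(in burst: [BurstFrame]) -> BurstFrame? {
    burst.max { $0.qualityScore < $1.qualityScore }
}
```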

AI is behind Apple’s handwashing assistance feature on the Apple Watch. (Image: Sam Machkovech)

Phones have long included image signal processors (ISPs) for improving the quality of photos digitally and in real time, but Apple accelerated the process in 2018 by making the ISP in the iPhone work closely with the Neural Engine, the company’s recently added machine learning-focused processor.

I asked Giannandrea to name some of the ways Apple uses machine learning in its recent software and products. He rattled off a laundry list of examples:

There are a whole bunch of new experiences that are powered by machine learning. And these are things like language translation, or on-device dictation, or our new features around health, like sleep and hand washing, and stuff we’ve released in the past around heart health and things like this. I think there are increasingly fewer and fewer places in iOS where we’re not using machine learning.

It’s hard to find a part of the experience where you’re not doing some predictive [work]. Like, app predictions, or keyboard predictions, or modern smartphone cameras do a ton of machine learning behind the scenes to figure out what they call “saliency,” which is like, what’s the most important part of the picture? Or, if you imagine doing blurring of the background, you’re doing portrait mode.

All of these things benefit from the core machine learning features that are built into the core Apple platform. So, it’s almost like, “Find me something where we’re not using machine learning.”
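
The saliency analysis Giannandrea mentions is also exposed to third-party developers through the Vision framework. A minimal sketch, assuming a recent SDK (the camera team’s own pipeline is, again, private):

```swift
import Vision

// A minimal sketch of on-device saliency analysis via Apple's public Vision
// API. Returns normalized bounding boxes for the regions a trained model
// predicts a viewer's attention would be drawn to.
func salientRegions(in image: CGImage) throws -> [CGRect] {
    let request = VNGenerateAttentionBasedSaliencyImageRequest()
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    guard let observation = request.results?.first else { return [] }
    return observation.salientObjects?.map { $0.boundingBox } ?? []
}
```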

Borchers also pointed to accessibility features as important examples. “They are fundamentally made available and possible because of this,” he said. “Things like the sound detection capability, which is game-changing for that particular community, are possible because of the investments over time and the capabilities that are built in.”

In addition, you may have noticed Apple’s software and hardware updates over the past couple of years emphasizing augmented reality features. Most of those features are made possible by machine learning. Per Giannandrea:

Machine learning is used a lot in augmented reality. The hard problem there is what’s called SLAM, or simultaneous localization and mapping. So, trying to understand, if you have an iPad with a lidar scanner on it and you’re moving around, what does it see? And building up a 3D model of what it’s actually seeing.

That uses deep learning today, and you need to be able to do it on-device because you want to be able to do it in real time. It wouldn’t make sense if you were waving your iPad around and then had to send that work to a data center. So, generally, I would say deep learning in particular is giving us the ability to go from raw data to semantics about that data.
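
Developers tap into that on-device SLAM through ARKit; on lidar-equipped devices, a session can be asked to build exactly the kind of 3D mesh Giannandrea describes. A minimal sketch:

```swift
import ARKit

// A minimal sketch of enabling lidar-backed scene reconstruction in ARKit.
// ARKit performs the SLAM work on-device in real time; the app simply opts
// in to receiving the resulting 3D mesh of the environment.
func startLiDARSession() -> ARSession? {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return nil  // requires a lidar-equipped iPhone or iPad
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh
    let session = ARSession()
    session.run(configuration)
    return session
}
```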

Increasingly, Apple performs machine learning tasks locally on the device, on hardware like the Apple Neural Engine (ANE) or on the company’s custom-designed GPUs. Giannandrea and Borchers argue that this approach is what sets Apple’s strategy apart from its competitors’.
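
On the developer side, that on-device approach surfaces in Core ML, where a model can be told which local hardware it may use. A minimal sketch, with `modelURL` standing in for any compiled Core ML model:

```swift
import CoreML

// A minimal sketch of loading a Core ML model for on-device inference.
// With .all, Core ML schedules work across the CPU, GPU, and Apple Neural
// Engine; the data being analyzed never leaves the device.
func loadModel(at modelURL: URL) throws -> MLModel {
    let configuration = MLModelConfiguration()
    configuration.computeUnits = .all
    return try MLModel(contentsOf: modelURL, configuration: configuration)
}
```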

Listing image by Apple

