Looks like the 2017 idea keeps on burning
“…teaching AI software to learn more quickly from ‘few examples and few experiences.’”
…Apple has included dedicated hardware for machine learning tasks in most of the devices it ships. Machine intelligence-driven functionality increasingly dominates the keynotes where Apple executives take the stage to introduce new features for iPhones, iPads, or the Apple Watch. The introduction of Macs with Apple silicon later this year will bring many of the same machine intelligence developments to the company’s laptops and desktops, too.
Increasingly, Apple performs machine learning tasks locally on the device, on hardware like the Apple Neural Engine (ANE) or on the company’s custom-designed GPUs (graphics processing units). Giannandrea and Borchers argued that this approach is what makes Apple’s strategy distinct amongst competitors.
I asked Giannandrea why he felt Apple was the right place for him. His answer doubled as a succinct summary of the company’s AI strategy:
I think that Apple has always stood for that intersection of creativity and technology. And I think that when you’re thinking about building smart experiences, having vertical integration, all the way down from the applications, to the frameworks, to the silicon, is really essential… I think it’s a journey, and I think that this is the future of the computing devices that we have, is that they be smart, and that, that smart sort of disappear.
Machine learning is used to help the iPad’s software distinguish between a user accidentally pressing their palm against the screen while drawing with the Apple Pencil, and an intentional press meant to provide an input. It’s used to monitor users’ usage habits to optimize device battery life and charging, both to improve the time users can spend between charges and to protect the battery’s long-term viability. It’s used to make app recommendations.
Then there’s Siri, which is perhaps the one thing any iPhone user would immediately perceive as artificial intelligence. Machine learning drives several aspects of Siri, from speech recognition to attempts by Siri to offer useful answers.
Savvy iPhone owners might also notice that machine learning is behind the Photos app’s ability to automatically sort pictures into pre-made galleries, or to accurately give you photos of a friend named Jane when her name is entered into the app’s search field.
In other cases, few users may realize that machine learning is at work. For example, your iPhone may take multiple pictures in rapid succession each time you tap the shutter button. An ML-trained algorithm then analyzes each image and can composite what it deems the best parts of each image into one result.
This really is the approach Apple has always taken: keep moving forward toward a better Siri.
Before J.G.’s arrival: Apple has also used machine learning to provide many other features and services. Examples include providing a list of apps a user is likely to open, identifying a caller who isn’t in a user’s contact book by analyzing e-mail data, selecting news stories, finding faces and locations in photos and figuring out whether an Apple Watch user is exercising.
Since J.G.’s arrival: What I object to is this assumption that we will leap to some kind of superintelligent system that will then make humans obsolete…