Every time Apple said 'machine learning', we had a drink andsgd oh*][

Yasss Steve i mean Tim... sorry... tell ush moar ab, ab, aboat aaye eye

WWDC While touting forthcoming operating system features at its annual developer conference on Monday, Apple made sure to mention machine learning and related AI-oriented terminology over and over.

Kevin Lynch, technology veep, talked about Siri, Apple's personal assistant software, becoming more proactive and more aware of watchOS activity through machine learning.

Craig Federighi, senior veep of software engineering, highlighted Safari's use of machine learning for intelligent blocking of browser tracking. He also talked about advanced convolutional neural networks improving facial recognition in Photos and making Siri smarter.

Federighi cited the utility of Apple's new Metal 2 graphics API for machine learning. And he said deep learning had been used to make Siri's voice sound more natural.

John Ternus, veep of hardware engineering, called out the compute power of Apple's forthcoming iMac Pro by noting how its Radeon Vega GPU would be helpful for machine learning.

Like alumni of prestigious schools who work their alma mater into every conversation, Apple execs made certain everyone was aware that their machines have learned.

A year ago, developer Marco Arment fretted about Apple's inaction on AI in a blog post. If the bets being made on AI by Amazon, Facebook, and Google pay off, Arment mused, "Apple will find itself in a similar position as BlackBerry did almost a decade ago: What they’re able to do, despite being very good at it, won’t be enough anymore, and they won’t be able to catch up."

AI isn't new to Apple, which has been nurturing Siri since 2011, but it didn't become a priority for the company until last October, when it hired Carnegie Mellon University professor Ruslan Salakhutdinov as its first director of AI research.

Apple in the past had trouble attracting accomplished AI researchers because its culture of secrecy rubbed academics the wrong way. It's not coincidental that Apple published its first AI research paper last December. And the following month, Apple joined the Partnership on AI, an industry consortium established last year to promote best practices.

Now, not only is Apple underscoring its use of AI-ish tech, but the company is bringing the technology it has been using for its Siri and Camera applications to developers. Its forthcoming platform updates – watchOS 4, tvOS 11, macOS 10.13, and iOS 11 – provide access to Core ML, a machine learning framework that includes a Vision API and a Natural Language API.
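For the curious, the Vision layer sits on top of Core ML: you wrap a compiled model, hand Vision an image, and get back ranked classification results. Here's a minimal Swift sketch of that flow, assuming a hypothetical bundled model class called MyClassifier (Xcode generates such a class from any .mlmodel file you drop into a project) – a rough illustration, not Apple's sample code:

```swift
import CoreML
import Vision

// Classify an image using a Core ML model via the Vision framework.
// "MyClassifier" is a placeholder for whatever .mlmodel class Xcode generates.
func classify(_ image: CGImage) {
    // Wrap the Core ML model so Vision can drive it.
    guard let visionModel = try? VNCoreMLModel(for: MyClassifier().model) else {
        print("Failed to load model")
        return
    }

    // Vision handles scaling and cropping the image to the model's expected input.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("\(top.identifier) – confidence \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

The Natural Language side works along similar lines: feed text to a tagger and get back tokens, language identification, or named entities, all running on-device.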

It's as if Apple is trying to tell us something. ®
