Apple Vision Pro, iOS 17 are teeming with AI, but the word was never mentioned at WWDC — here’s why

It’s the year of AI. Yet if you watched the annual developer keynote from Apple, the world’s most influential and valuable tech giant, you could hardly tell.

At WWDC 2023, Apple raced through a string of what Tim Cook called some of the company’s “biggest announcements ever” in a presentation that lasted over two hours. It bid adieu to Intel chips in its computers once and for all, walked through the yearly OS updates for its half-dozen platforms, including iOS 17 and macOS 14, and offered a peek at a potential future beyond the iPhone with a pair of smart goggles.

iOS 17 (Image credit: Future)

The one thing it didn’t talk about, however, was AI — the buzzword that has gripped the tech industry like no other in the last year. Just a month prior, it was the centerpiece of Google I/O, the search-engine giant’s annual developer conference where its execs mentioned “AI” nearly 150 times in two hours. At WWDC? Apple didn’t even say it once. 

Apple is very intentional about the words it uses 

This doesn’t mean Apple’s out of the AI race, though. Apple has never been one to lean on hyped buzzwords, and this year was no different. In fact, it barely said “augmented reality” and outright skipped terms like “virtual reality” and “headset” in the hour it spent going over every intricate detail of its new Vision Pro mixed-reality headset.

Apple Vision Pro (Image credit: Apple)

Apple did what it does best: it conveyed how it’s deploying the technology behind AI in a familiar, practical way, one that will resonate with the average iPhone owner, who, above all else, wants to understand how any new update will trickle down to their everyday experience.

iOS 17, for example, adds smarter autocorrect and inline predictions, which not only fix typos and suggest the next word but can also complete whole sentences. Instead of calling it AI, Apple said the update is powered by a transformer language model and on-device machine learning, the same underlying architecture behind recent viral generative AI tools like ChatGPT.
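Apple didn’t expose the model or name an API, but the underlying idea, a transformer ranking likely next words based on the text you’ve already typed, can be sketched with an open stand-in. Here’s a minimal Python illustration using Hugging Face’s GPT-2 (not Apple’s model; the sample sentence is arbitrary):

```python
# Minimal sketch of transformer-based next-word suggestion.
# GPT-2 is an open-source stand-in here, NOT Apple's on-device model.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "Running late, see you in"          # text typed so far
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits        # scores for every vocabulary token

# Look at the final position and surface the three most likely next tokens,
# roughly the way a keyboard surfaces suggestions above the keys.
top = torch.topk(logits[0, -1, :], k=3)
suggestions = [tokenizer.decode([int(i)]).strip() for i in top.indices]
print(suggestions)
```

Apple’s version presumably does something similar with a much smaller model tuned for typing, running entirely on the device so your keystrokes never leave the phone.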

Throughout the rest of the keynote, Apple similarly eschewed the buzzwords. iOS 17 can transcribe a voicemail live while the caller is still recording it, and Apple said that’s “thanks to the power of the Neural Engine,” the dedicated machine-learning hardware it has built into iPhone chips since 2017. The new Vision Pro headset offers a feature called Persona, which builds a digital lookalike of you to use on FaceTime calls. Yet “avatar,” the term Meta favors, was nowhere to be found. Apple instead chose to call it a “digital representation” of the wearer, developed with the “most advanced ML techniques.”

Persona with Apple Vision Pro (Image credit: Apple)
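Apple hasn’t described the model behind the live voicemail transcription either, but the task itself, on-device speech-to-text, is easy to sketch. A rough Python illustration using the open Whisper “tiny” model as a stand-in (the file name is a placeholder for any short audio clip):

```python
# Rough sketch of on-device speech-to-text, the kind of workload Apple
# hands to the Neural Engine. Whisper is an open stand-in here, not
# Apple's model, and "voicemail.wav" is a placeholder file.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-tiny.en")

result = asr("voicemail.wav")   # transcription runs on the local machine
print(result["text"])           # the transcript, analogous to what Live Voicemail shows
```

Running the model locally is what lets Apple frame the feature around the Neural Engine rather than a cloud service: the audio stays on the phone.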

Plus, when Apple previewed its new Journal app, it waxed lyrical about how the app can suggest things to write about by sifting through your digital footprint: photos, music history, places you’ve visited, and more. It seemed like a job tailor-made for a personalized AI algorithm. But all Apple revealed was that it relies on “on-device machine learning” and “intelligently curates” suggestions from your information.
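Apple hasn’t explained how that curation works. As a purely hypothetical sketch of the “intelligently curates” step, assuming nothing about Apple’s actual implementation, suggestions could come from ranking recent on-device moments by some significance score; every type, field, and number below is invented for illustration:

```python
# Hypothetical sketch of curating journaling prompts from on-device data.
# Apple hasn't described its algorithm; the data model and scoring here
# are invented purely to illustrate the idea.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Moment:
    kind: str            # e.g. "photo", "workout", "place", "music" (invented categories)
    title: str
    timestamp: datetime
    significance: float  # 0..1, imagined to come from on-device signals

def suggest_prompts(moments, now=None, max_prompts=3):
    """Rank recent moments and turn the strongest into journaling prompts."""
    now = now or datetime.now()
    recent = [m for m in moments if now - m.timestamp < timedelta(days=2)]
    ranked = sorted(recent, key=lambda m: m.significance, reverse=True)
    return [f"Write about: {m.title}" for m in ranked[:max_prompts]]

moments = [
    Moment("place", "Afternoon hike at Point Reyes", datetime.now() - timedelta(hours=20), 0.9),
    Moment("music", "First listen to a new album", datetime.now() - timedelta(hours=5), 0.4),
]
print(suggest_prompts(moments))
```

The real system presumably draws on far richer signals (photo analysis, workout data, location history), but the shape of the problem is similar: rank personal moments and turn the best ones into prompts.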

Why does Apple avoid mentioning buzzwords?

There are a couple of possible reasons why Apple deliberately avoided mentioning AI even in passing. For one, AI as a marketing term is overused, and its meaning has been diluted. Most new products labeled “AI” aren’t artificially intelligent in any meaningful sense. Machine learning, a more academic phrase, better describes how companies are actually using this tech to make their products smarter and more personalized.

Two, AI comes with risks, and Apple’s caution likely stems from the fact that it doesn’t have answers to some of the technology’s most pressing concerns, such as what it means for human jobs and misinformation. When Google demoed a new auto-write feature for Gmail, for example, I wondered whether it would soon replace me as a writer. Apple’s presentation never raised that worry.

Speaking at an event in May, Tim Cook admitted the potential of AI is “huge” and “interesting,” but added that Apple must be “deliberate and thoughtful” about how it deploys it.

At the same time, Apple is also far behind the advancements Google and Microsoft have made with AI. It has been on a hiring spree for generative AI talent and is trying to fix its reportedly dysfunctional AI team.

Plus, the AI race also matters less to Apple than to others. It’s a hardware company, unlike Google, and that’s especially apparent when you compare their developer keynotes. While Apple nerded out over technical hardware terms like “internal PCI expansion” and the number of cores in its new computer chips at its software event, Google dove deep into PaLM 2, its latest large language model for building AI apps.

However, that might not be the case forever. Apple will need conversational AI, like the technology that powers Google’s Bard chatbot, to offer a reliable voice interface for its mixed-reality headsets rather than depending on pre-programmed Siri responses. For now, though, it’s focused on equipping you with an autocorrect that finally learns your swear words, AI or not.

