Lego's director of innovation, Martin Sanders, shows off new ARKit 2.0 features during Apple's WWDC event yesterday.
IN HIS TIME on stage at Apple’s annual developer conference yesterday, Craig Federighi did many of the things that people have come to expect of the company’s senior vice president of software engineering. He wore a blue shirt. He said "isn’t that amazing?" a lot. He undertook a demo with an oddly personal touch—in this case, using augmented reality to measure a photograph of himself as an infant. (No, Infant Craig did not yet have the glorious mane his adult self would one day be parodied for.)
But Federighi also did something that Apple isn’t in the business of doing: He all but confirmed reports that the company is working on a wearable AR device.
"T288," as the device is allegedly known at the company, may never come out. Or it may come out in 2020. And no, Federighi never mentioned it. Nor did anyone else. The words "wearable," "headset," or "smart glasses" were nowhere to be heard throughout the keynote. But in demonstrating Apple’s updates to its ARKit suite, which helps developers create augmented-reality apps and experiences, Federighi gave the WWDC audience—and the world—a clear sign that the future Apple is working toward isn’t one we’ll be looking at through our phone screens.
First, some background. The original incarnation of ARKit, which Apple unveiled at last year’s WWDC, was little more than a testbed. "Each app was an island," says Matt Miesnieks, CEO of 6d.ai, a company building software infrastructure for the so-called AR cloud. "You had a plane, you could play the game in that plane; that was it. The two things that were missing were the ability for multiplayer, so you and I could see that Pokémon in exactly the same place, and the other was persistence, so I could drop a sculpture in my living room and you could come in the next day and see it."
ARKit 2.0, which Federighi outlined yesterday, adds those qualities—persistence and "shared experiences," as he called them—and Apple’s onstage demos showed them off perfectly. There was an app through which people could use AR slingshots to shoot AR projectiles at each other’s AR wooden structures, like Angry Birds mixed with Pokémon Go. There was a Lego experience that turned a real-world Lego building into the cornerstone of a block-sized cooperative AR adventure in which all the other buildings, characters, and even vehicles were virtual.
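Under the hood, the persistence Federighi demoed rests on ARKit 2.0's new ARWorldMap object, which an app can serialize to disk and reload in a later session. A minimal Swift sketch of that save-and-restore loop—with the file name and error handling as illustrative assumptions, not Apple's demo code:

```swift
import ARKit

// Illustrative save location; a real app would choose its own path.
let mapURL = FileManager.default.urls(for: .documentDirectory,
                                      in: .userDomainMask)[0]
    .appendingPathComponent("room.worldmap")

// Capture the session's current world map and archive it to disk.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap else { return } // mapping not ready yet
        if let data = try? NSKeyedArchiver.archivedData(
                withRootObject: map, requiringSecureCoding: true) {
            try? data.write(to: mapURL)
        }
    }
}

// Later (or on another device): restore the map so anchors—say, that
// sculpture dropped in the living room—reappear in the same spots.
func restoreWorldMap(into session: ARSession) {
    guard let data = try? Data(contentsOf: mapURL),
          let map = try? NSKeyedUnarchiver.unarchivedObject(
              ofClass: ARWorldMap.self, from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Shared experiences use the same object: instead of writing the archived map to disk, one device sends it to nearby peers, and each peer starts a session from that map so everyone resolves anchors against the same real-world coordinate space.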
And thanks to a host of other ARKit updates like 3D object detection and more realistic lighting, the experiences looked more fluid, more lifelike, and more advanced than just about anything that’s been shown on a consumer AR device. (There was also Measure, a more utilitarian tape-measure app that led to the aforementioned baby picture.)
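The 3D object detection behind those demos works by scanning a real object ahead of time into an ARReferenceObject, then telling the session to watch for it. A hedged sketch of how a developer wires that up—the asset-group name "Gallery" is an assumption for illustration:

```swift
import ARKit

// Load pre-scanned reference objects from the app's asset catalog.
// The group name "Gallery" is hypothetical for this example.
guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "Gallery", bundle: nil) else {
    fatalError("Missing expected reference objects")
}

let configuration = ARWorldTrackingConfiguration()
configuration.detectionObjects = referenceObjects

// When ARKit recognizes one of the scanned objects in the camera feed,
// it adds an ARObjectAnchor the session delegate can react to:
class ObjectDetectionDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let objectAnchor as ARObjectAnchor in anchors {
            print("Detected \(objectAnchor.referenceObject.name ?? "object")")
        }
    }
}
```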
Yet both experiences happened through an iPad. An iPad held at arm’s length. And that may be great news for chiropractors, but it’s also where Apple tipped its hand. "I think you can see an endpoint for mobile AR in terms of how long someone can use an AR feature holding their hand up," says Alexis Macklin, an analyst with Greenlight Insights, a market intelligence firm focused on virtual and augmented reality.
While Measure is a great indicator of how mobile AR can add value to people's lives today—you hold up your phone, you get the length of that couch or the height of those drapes, you're done—AR's more pervasive future likely hinges on being there when we need it, no pocket pullout required.
"There are some good use cases for mobile AR, but it’s not the end game," says Miesnieks.
Really, then, Apple’s emphasis on persistence and shared experiences reflects the AR we’ll get to in the next few years, not the AR that worms its way into our lives today. "The people working on sharing and persistence are doing God’s work and it’s really important to solve those problems in the long term, but it’s important for consumers to understand that those are not necessary conditions for AR to take root and be adopted now," says Tony Parisi, global head of VR/AR brand solutions at Unity Technologies. "They will become significant problems over time, as usage becomes more sophisticated." (The Lego AR experience was built using Unity.)
Developer conferences are no longer just developer conferences; they’re live-streamed public-facing events. That gives companies like Apple a prime opportunity to seed two ecosystems at once. With ARKit 2.0's ability to support persistence and shared experiences, Apple is giving its developers the tools to build applications and experiences that may not be perfectly suited to today’s devices.
But it’s also showing consumers what’s possible. Nothing mind-blowing, but nothing hypothetical either. This isn’t the grandeur of a Magic Leap promise, but an introduction to a technology that Apple clearly believes will redefine the way humans use computers.
"We already have evidence that Google is working on an AR headset," says JC Kuang, another analyst at Greenlight. "And the striking thing to me was how similar ARKit 2.0's feature set is to Google's ARCore refresh. The keynote gives us further confirmation that both companies have a similar roadmap."
That roadmap, of course, is just beginning. Which is where the developers—and those arm’s-length iPads—come in. "They’re pushing AR onto phones to make sure they’re a winner when the headsets come around," Miesnieks says of Apple. "You can’t wait for headsets and then quickly do 10 years' worth of R&D on the software."