Apple’s entry into augmented reality is gathering pace, says one of its vice-presidents visiting Australia.
In an interview with The Australian yesterday, Apple vice-president of product marketing Greg “Joz” Joswiak said the enthusiasm of Apple’s development community building augmented reality (AR) applications had been “unbelievable”.
“They’ve built everything from virtual tape measures (to) ballerinas made out of wood dancing on floors. It’s absolutely incredible what people are doing in so little time.” Mr Joswiak spoke with The Australian yesterday, the 10th anniversary of the release of the first iPhone.
He said users could roll out a virtual tape measure in AR to work out the distance between two points as seen through the phone’s camera.
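Once an AR framework has anchored two points in world coordinates, the measurement itself is simple geometry. As an illustrative sketch (the coordinates here are made up, and this is not code from any Apple API):

```python
import math

def distance(p1, p2):
    """Euclidean distance between two 3-D points, in metres."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# Two points the user picked in the camera view, expressed in
# world coordinates (metres) by the AR session.
start = (0.0, 0.0, 0.0)
end = (0.3, 0.4, 0.0)
print(distance(start, end))  # → 0.5
```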
Ikea has been developing an app that lets customers see how a couch from its showroom would look and fit in their own lounge room, with the couch superimposed virtually over the real room.
Apple was not talking about any plan to build AR glasses or a headset, but would instead promote AR on the iPhone and iPad, he said.
“I think there is a gigantic runway that we have here with the iPhone and the iPad. The fact we have a billion of these devices out there is quite an opportunity for developers. Who knows the kind of things coming down the road, but whatever those things are, we’re going to start at zero.”
He said that in the commercial space, AR applications would evolve for shopping, furniture placement, education, training and services.
With Apple’s HomePod smart speaker coming in December, Mr Joswiak is confident there will be plenty of compatible HomeKit devices in Australian homes that users can control by voice.
He said major home automation vendors had signed up to support HomeKit. “We have a worldwide developer group that serves different markets and different regions for just that reason.”
He said Apple was developing new male and female voices for Siri, including new Australian voices. Some 350 million active devices used Apple’s Siri voice assistant globally, he said.
Apple was also applying machine learning across all its devices: “We’ve been implementing machine learning in our products long before it was fashionable to call it machine learning.” He said the keyboard in the original 2007 iPhone would modify the hit zones of each letter by observing how a user typed. “We used machine learning to learn what you typed back in 2007.”
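One way to picture that kind of hit-zone adaptation is a key whose effective centre drifts toward where the user actually taps. The sketch below is purely illustrative (the class, coordinates and learning rate are assumptions, not Apple’s implementation):

```python
class AdaptiveKey:
    """Sketch of an adaptive keyboard hit zone: nudge a key's centre
    toward the user's observed tap positions with an exponential
    moving average. Names and the rate are illustrative assumptions."""

    def __init__(self, centre_x, centre_y, rate=0.2):
        self.centre = [centre_x, centre_y]
        self.rate = rate  # how quickly the centre adapts per tap

    def register_tap(self, x, y):
        # Blend the observed tap position into the stored centre.
        self.centre[0] += self.rate * (x - self.centre[0])
        self.centre[1] += self.rate * (y - self.centre[1])

key = AdaptiveKey(10.0, 20.0)
key.register_tap(12.0, 20.0)  # user taps to the right of centre
print(key.centre)             # → [10.4, 20.0] — centre drifts toward the tap
```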
He added that Apple was using machine learning to improve the autocorrect feature of typing. For example, if you read an article and then send a message or email, your phone could show autocorrect choices based on the vocabulary you had just encountered in that article, Mr Joswiak said.
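The effect he describes amounts to biasing correction candidates toward recently seen vocabulary. A toy sketch of the idea, not Apple’s algorithm (the function names and scoring are assumptions):

```python
def autocorrect(typo, dictionary, recent_words):
    """Pick the dictionary word closest to the typo by edit distance,
    breaking ties in favour of vocabulary the user read recently."""
    def edit_distance(a, b):
        # Classic dynamic-programming Levenshtein distance.
        dp = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            prev, dp[0] = dp[0], i
            for j, cb in enumerate(b, 1):
                prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                         prev + (ca != cb))
        return dp[-1]

    # Lower tuple sorts first: distance, then "was it read recently?"
    return min(dictionary,
               key=lambda w: (edit_distance(typo, w),
                              0 if w in recent_words else 1))

# "helo" is one edit from both candidates; recent reading breaks the tie.
print(autocorrect("helo", {"hello", "help"}, {"help"}))  # → help
```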
Machine learning was also being used to interpret even “atrocious handwriting” written with the Apple Pencil.