At CES 2017, augmented reality (AR) and virtual reality (VR) were everywhere. There were demos in many of the booths, and CES had devoted a segment of the South Hall to AR, VR, and gaming. Going by CES alone, you would expect that these technologies were already pervasive, but in the market, both AR and VR are still very much in an experimental phase. That said, there were a lot of very interesting AR and VR applications and products being shown at CES.
Vendors were doubling up on VR demos. (Photo: TIRIAS Research)
While Microsoft’s HoloLens is a mature development system at this point, probably the most exciting AR products were the Osterhout Design Group (ODG) R-9 and R-8 smartglasses running on Qualcomm’s Snapdragon 835 processor. The glasses showed an excellent combination of style, weight, and optical performance. While they will start at $1,000 and up, we are still in the early phases of smartglasses, and these prices will come down over time as volumes increase. The unfortunate thing for the ODG glasses was that there was little content unique to them that could be demoed. One fellow researcher noted that they just didn’t “do anything” that impressed him. In contrast, Microsoft’s HoloLens already has several pre-canned industrial applications and demos. The ODG glasses run the Android OS and need custom AR applications. Still, I found the glasses compelling, with a bright display and a good field of view.
ODG smartglasses demo (Photo: TIRIAS Research)
As a side note, I think it would be extremely compelling if Microsoft ported the HoloLens’ “Windows Holographic” operating system to the Qualcomm Snapdragon 835, as the company already plans to bring Windows 10 to the chip. This would create a really competitive mobile platform, where the heterogeneous compute functions of the Snapdragon 835 could be used to build a truly spectacular mobile AR platform. The Snapdragon 835 is well designed for AR and VR applications. With its internal DSP and on-chip sensor hub, the 835 can support six-degrees-of-freedom tracking using a monocular camera and the sensors. The 10nm process gives the 835 a smaller die and a 35% smaller package, with higher integration and lower power. For example, the 835 has integrated 802.11ac and Bluetooth 5 connectivity. The chip’s CPU, GPU, and DSP can all be used for client-based machine learning. These factors allowed the 835 to fit into the ODG smartglasses without a lot of bulk.
The Snapdragon 835 (upper left) compared with the Snapdragon 820 and a U.S. penny. (Photo: TIRIAS Research)
In addition to the ODG glasses, I also noted a number of prototype glasses from other vendors at CES. One prototype uses a Sony light-guide technology, but the design still needed some work, as the prototypes were unbalanced and the display was only monochrome. These types of AR glasses are better suited to industrial applications than to consumer applications.
Prototype AR Glasses (Photo: TIRIAS Research)
There were a number of companies showing technologies that enhance AR/VR headsets. One company, Occipital, was showing its Structure Sensor depth-sensing technology attached to a headset running a demo program on iOS. Depth sensing is a critical technology for mixed reality applications, where AR smartglasses or a VR headset can scan a room and map its contents into the virtual world. Microsoft’s HoloLens uses a custom chip derived from the Xbox Kinect sensor controller. Intel has its own RealSense technology, which has been used in standalone devices and is also integrated into its Project Alloy “blended reality” VR headset prototype. And Google’s Project Tango has similar technology, which has been integrated into a couple of smartphones. The goal of each of these technologies is to scan the living area around the user, map it into a 3D world, and then blend or overlay computer-rendered images or items into that virtual world. The sensors can also use the information to keep a user from bumping into nearby objects while wearing a VR headset. Augmented reality can be just an overlay image, like Pokémon Go or a heads-up display, but depth sensing is a critical technology for any app that needs to understand the physical world around the user.
Occipital Structure Sensor VR demo attached to an iPhone. (Photo: TIRIAS Research)
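The room-mapping step these depth sensors enable can be sketched in a few lines: each depth pixel is back-projected into a 3D point using the standard pinhole camera model. This is a minimal illustration, not any vendor’s actual pipeline, and the focal length and principal point below are made-up example values rather than the intrinsics of a real sensor.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into camera-space 3D points
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (H*W, 3) array of [X, Y, Z] points; drop invalid (zero) depths.
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Toy 2x2 depth map with every pixel 1 m away, and illustrative intrinsics.
depth = np.ones((2, 2), dtype=np.float32)
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=0.5, cy=0.5)
```

A real mixed reality system would then fuse these per-frame point clouds over time into a mesh of the room, which is the representation virtual objects are anchored to.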
On the high-performance VR headset front, the two leading candidates for PC VR are Oculus and HTC. At CES, HTC introduced a new accessory that allows third-party devices to be mapped into the Vive VR world. This Vive Tracker looks a lot like an ashtray but has sensors that allow it to track any object it’s attached to using the Lighthouse base stations. The demos included the base of a baseball bat, gloves, a prop gun, and a virtual camera. With the Tracker, any prop can be turned into a virtual asset. In addition, HTC showed off a wireless headset module from TPCast and announced a partnership with Intel to use WiGig chips to build HTC’s own wireless HMD. Unfortunately, the Intel WiGig demo was not available right after the press conference, but the TPCast wireless module was. The $249 TPCast module has a 6,000mAh battery, which gives you two hours of use.
HTC is also expanding its interests beyond consumers, to education, health and wellness, manufacturing, arts and culture, training, and automotive. These markets may be able to support VR in the short term, much like industrial uses have supported AR glasses for years. HTC has Vive Studios to fund and publish content.
Another way to bring VR to the people is through internet cafes, malls, arcades, and amusement parks. HTC’s Viveport is creating a way for arcades to get affordable, legal content, with 300 titles available for arcade use. The first such venue is Viveland in Taiwan.
HTC Vive Tracker attached to a baseball bat for VR batting training. (Photo: TIRIAS Research)
VR was so important at CES that Intel devoted most of its press conference to it. In fact, Intel went to the expense of setting up approximately 260 VR workstations and gave individual VR experiences to the lucky 260 press attendees admitted into the main hall. The rest of the press had to go to an overflow room and watch a video feed. Intel ran through several demos and certainly made an impression on the audience with interesting VR applications. Although the demos were mostly static, they did get people standing up and moving around with headsets on. That could have caused some trouble, but Intel had invested in plenty of support staff - approximately one support person for every six attendees. The result was a great experience for those of us who made it into the press conference, but probably not such a great experience for those left in the overflow room.
Intel press conference with multiple VR setups. (Photo: TIRIAS Research)
The demos included a 360° video of a wingsuit jump in the Grand Canyon, a live 360° view of a basketball game using Intel’s Replay technology, a live feed of a solar panel inspection from a drone, and a volumetric rendering of a rural Vietnam locale with a river, waterfall, and a water buffalo. The volumetric rendering gave the Vietnam scene a greater sense of presence and depth.
And when all was said and done, I’m not altogether sure how it helps Intel. Each of these VR-configured workstations needed a discrete graphics chip to support VR, meaning that the real beneficiaries of VR are still the GPU manufacturers - AMD and Nvidia.
Intel CEO Brian Krzanich did pitch the company’s own blended reality headset, called Project Alloy. Alloy is nearing the stage where Intel will release designs for third parties to manufacture (expected in Q4). It uses a 7th Gen Core processor, RealSense depth sensors, an unnamed vision processor, stereoscopic fisheye cameras, and other motion and tracking sensors. The Alloy demo included a multiplayer living room game where all the furniture is mapped and recast as objects in the game.
And while there were demos at Intel’s CES booth, due to scheduling, I did not get a chance to try Alloy out. The challenge for Intel is that even with a Core i7 processor, Intel’s graphics are not optimized for high-performance VR, and as a standalone headset, Alloy cannot fall back on a discrete graphics chip like other PC-based VR systems can. This may be why Alloy is pitched as a blended reality solution and not a pure VR solution. Blended reality means the headset captures stereoscopic video of the outside world and blends or overlays rendered objects into that real-world video stream. This technique requires less raw GPU performance because the GPU does not have to render the entire virtual world inside the headset.
Intel Project Alloy merged reality headset. (Photo: TIRIAS Research)
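The GPU savings of blended reality come down to compositing: the headset only renders the overlaid objects, and every other pixel is raw camera video. A minimal sketch of that idea is standard alpha blending of a rendered RGBA layer onto a camera frame; this is an illustration of the general technique, not Intel’s actual implementation, and the toy frame sizes are made up.

```python
import numpy as np

def composite(camera_frame, overlay_rgb, overlay_alpha):
    """Blend a rendered overlay onto a pass-through camera frame.
    Only pixels where overlay_alpha > 0 had to be rendered by the GPU;
    everything else passes through from the camera, which is why blended
    reality needs less rendering horsepower than fully synthetic VR."""
    a = overlay_alpha[..., None]  # broadcast alpha across the RGB channels
    return (a * overlay_rgb + (1.0 - a) * camera_frame).astype(camera_frame.dtype)

# Toy example: a 2x2 black camera frame with one fully opaque overlay pixel.
frame = np.zeros((2, 2, 3), dtype=np.float32)   # pass-through camera video
overlay = np.ones((2, 2, 3), dtype=np.float32)  # white rendered object
alpha = np.zeros((2, 2), dtype=np.float32)
alpha[0, 0] = 1.0                               # only one pixel was "rendered"
out = composite(frame, overlay, alpha)
```

In a full VR title, by contrast, every pixel of every frame is synthetic, so the whole scene must be rendered at headset frame rates.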
The biggest challenge for VR at this stage is the classic chicken-and-egg problem. Not counting simple mobile solutions like Gear VR, Google Cardboard, Google Daydream View, and reams of cheap knockoffs, high-performance VR shipment volumes are currently very low. High-performance VR units such as the HTC Vive and the Oculus Rift are still shipping in low quantities, and that limits the total market for third-party content. But without killer third-party content, it’s hard to sell a VR headset and get people to go through the VR setup process. HTC and Oculus are focused on supporting new content, and HTC did make a point of saying that roughly 1,000 titles run on the Vive.
VR, though, unlike standard PC gaming, requires more work from the user to set up and use. It’s unreasonable to expect anyone but the most devoted fan to dedicate an area to walk-around VR. Users need to set up the Lighthouse towers for the Vive or the sensors for the Rift, put on a headset, and grab the trackers - so it’s more involved than just turning on a PC and grabbing the keyboard and mouse. Even mobile VR kits like Google’s Daydream require setup time.
I’m not saying that there isn’t interesting and compelling content in VR. My experience with the HTC Vive and Oculus Rift has shown me that there’s a lot of experimenting right now, and some very unique environments and creative content can only be done effectively in VR. One of the applications Intel talked about is virtual travel: being able to go to places that are remote or difficult to get to and put yourself in those situations. By mapping photographs onto virtual 3D objects, these experiences support parallax - as you move around, the view changes, giving you a real sense of a 3D environment. That is very different from the typical 360° cameras you see at sporting events, where you are stuck in one spot with no ability to change your view of the monocular 360° video. Of course, there are also some interesting games, and there’s been a lot of experimenting with unique game styles. With VR, your movements can be incorporated into the game, creating some unique situations. One example is Superhot VR, where you can control time and experience “bullet time” like in The Matrix.
Another up-and-coming AR technology is Google’s Project Tango. The latest smartphone to support it is the ASUS ZenFone AR, an Android smartphone with the Snapdragon 821 processor. Tango requires five sensors: a high-quality rear camera, an accelerometer, a gyroscope, a depth sensor, and a motion tracking sensor. The depth and motion tracking sensors are specific to Tango. Using all of these sensors, Tango can scan and map indoor environments for navigation and for AR. The ASUS phone has a slim design with a 5.7in display.
ASUS ZenFone AR phone’s Project Tango sensors. (Photo: TIRIAS Research)
While Tango is off to a slow start, according to those close to the developments, Google has an ecosystem plan in which Tango on mobile is just a starting point. Indoor navigation and mapping enabled by Tango could be the killer app that puts it on every smartphone.
Augmented reality and virtual reality have staying power and continue to generate excitement in the industry. Both continue to attract investment, despite the slow development. For those of us who have used VR, there’s little doubt the technology is compelling, but the challenge remains to make it more mainstream. AR, while well established in certain industrial applications, looks to have a long way to go for mainstream consumer uses. But AR is potentially the more disruptive and powerful technology because it can more easily be integrated into our day-to-day lives. Expect to see even more AR and VR at Mobile World Congress in a few weeks and then at CES 2018 - these technologies are not a fad.