Unreal Engine, Epic Games, Vicon. She's not real but she's rendered in real-time.
How close are we to being able to participate in movies, rather than just watch them? When Netflix and Steam aren't something you turn on but something you step into, shows and games will be immersive; environments and characters will be interactive. An amazing confluence of AI, graphics, and hardware is on the brink of transforming entertainment as we know it. A great deal of that technology was on display at the 2018 Game Developers Conference in San Francisco. Here are some takeaway concepts that might just make you super stoked about the future of immersive visualization.
The science fiction sensation Ready Player One will soon become a Steven Spielberg picture. That means the idea of the Oasis, a virtual reality universe central to the future and more vast than the physical Earth, is about to become a household concept. Just imagine the possibilities of a digital cosmos with infinite clubhouses and vistas, its own economy, and untold expansions in the possibilities of interaction. What would be necessary for such a place to take form IRL?
First, who even has a VR headset? Not enough people. The current models are amazing to experience but less than convenient to set up. The industry is moving toward wireless headsets with built-in batteries, as evidenced by Oculus's only public demonstration: the unreleased Go headset. It's a $200 unit with a battery that lasts half a day, enough to watch several episodes of Rick and Morty on a virtual space cruiser or play a game with friends.
Amy Sterling. Settlers of Catan in Virtual Reality
VR with friends is the VR that will stay. Suddenly, geography doesn't dictate who hangs out. Watch as interactions blossom beyond a 2D news feed. For example, take Catan in VR. A popular board game becomes a completely natural social experience that unfolds in a stunningly beautiful lodge over a 3D animated Catan gameboard. Look closely and you'll see tiny sheep wandering and sulking thieves prowling.
It's amazingly intuitive to grab and move pieces; to give a thumbs up and roll dice; to shift your UI around and lock it in place. And it's really fun. Still, the social VR experience of Oculus, the headset owned by Facebook, a company worth $450 billion, leaves much to be desired. You exist as a floating head with disconnected hands. No microexpressions are translated into VR, which means movement and voice are your only means of communication.
It's surprising how much you can infer just by head movements and finger-level hand tracking. And haptics is a whole other story. A vibration across a vest or gloves becomes an explosion, a portal, an enhancement of the experience. Haptics are coming and they'll let you feel the virtual world.
Thankfully, humanity's drive to ever expand beyond the present yields impressive advances in everything, including simulated characters in VR. A landmark collaboration between Unreal, Qualcomm, Nvidia, Tencent, Vicon, and several researchers resulted in a photorealistic digital human that can be "driven," i.e. rendered in real time from a real human's performance. I'll have to write a separate blog post on this. Until then, please enjoy the demo video below and watch the State of Unreal '18 keynote to learn more about how this was accomplished.
For now, renders like this require a supercomputer to run, plus an actress wearing a wild motion-capture suit and a front-facing camera. But the good news is this is possible now! Someday it'll be available on the reg, which means your avatar could look just like you or be literally out of this world, and either way it'll eventually be detailed enough to pass for real.
Why can't we have this now? Real-time rendering is basically a physics simulation, and as it gets more realistic, it needs more powerful machines than most people have today. The real-time render in the video above runs at 60 Hz on five PCs, each running the new Snapdragon chips and top-of-the-line Nvidia GPUs.
Amy Sterling, Tobii. Tobii eye tracking in Vive VR
While we're still several generations of chips away from photorealistic avatars for all, one company is on the brink of bringing one of the primary unspoken conversational cues into VR: eye movement. Tobii gave a demo of eye tracking that detects your gaze. It may seem minor, but beyond communicating things like what you're looking at and gut emotional reactions, gaze in a virtual world can do things like trigger an object's active state. It can also enable foveated rendering: concentrating your compute power by rendering high-resolution images only where you are actively looking. This can dramatically improve visual quality.
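To make the foveated-rendering idea concrete, here's a minimal sketch (my own illustrative numbers, not Tobii's actual algorithm): the renderer spends full resolution only near the tracked gaze point and progressively less toward the periphery, so most of each frame is cheap to draw.

```python
import math

def shading_rate(angle_deg: float) -> float:
    """Fraction of full resolution to render at, given a pixel's angular
    distance from the gaze point. Thresholds are illustrative guesses."""
    if angle_deg <= 5.0:    # fovea: full detail
        return 1.0
    if angle_deg <= 20.0:   # near periphery: half detail
        return 0.5
    return 0.25             # far periphery: quarter detail

def frame_cost(width: int, height: int, fov_deg: float,
               gaze_x: float, gaze_y: float) -> float:
    """Estimate shading work for one eye as a fraction of the cost of
    rendering the whole view at full resolution."""
    total = 0.0
    for y in range(height):
        for x in range(width):
            # Rough angular distance from gaze, assuming pixels map
            # linearly onto the field of view (a simplification).
            dx = (x - gaze_x) / width * fov_deg
            dy = (y - gaze_y) / height * fov_deg
            total += shading_rate(math.hypot(dx, dy))
    return total / (width * height)

# Gazing at the center of a 100x100 view with a 90-degree field of view:
cost = frame_cost(100, 100, 90.0, 50.0, 50.0)
print(f"foveated frame costs {cost:.0%} of a full-resolution frame")
```

Even in this crude model, most pixels fall in the cheap periphery, which is why eye tracking frees up so much GPU budget to spend where you're actually looking.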
Beyond a VR headset and believable avatars, the Oasis also requires an oasis to land in! Basically, we'd need some clubhouses for people to gather in. Or perhaps vast ocean vistas or alien planets or... infinite spaces, really. It turns out there is already a place where you can build a virtual clubhouse in VR. Rec Room, one of the largest social VR hubs (if not the largest), introduced room-building just three months ago, and already people have created 35,000 rooms. I hear that soon there will be virtual portals to anywhere.
The Oasis is coming. When it arrives, you won't just be able to see it. You'll be able to walk through it, to feel it, and to explore it with others. I for one can hardly wait for it to begin.