There are all sorts of concessions that your mind accepts when you’re playing a game. You know you’re not really that third-person avatar; you know you’re not the one interacting in first person. Everything you do is translated through the controller, and yet we are still immersed, still present.
In VR, all this is different. Your mind can be tricked into believing that those hands are your hands, that when you reach out and grab something, even though you’re pulling a trigger on a controller, you’re really holding it. This presents interesting challenges for designing a VR game, chief among them how to work with the mind, rather than against it.
“Essentially, we break the problem into two groups: data that you shouldn’t fake, and data that it’s ok to fake,” says Max Weisel, the founder of experimental VR studio NormalVR. “Data that you shouldn’t fake are things like facial expressions and eye positions.”
Weisel explains that these are things our brain is very good at reading and, given the limits of current technology, things we’re not very good at reproducing virtually. So the brain gets confused, misinterprets the data, and false information is created. This incongruity is what leads to things feeling ‘off’ or ‘wrong’.
“Data that it’s ok to fake ends up being stuff like your body and arms,” he continues. “The only requirement is that you’re not trying to mimic the user perfectly. One of the first full-body avatars we tried out was the body from Tyler Hurd’s Old Friend. Up until that point, I was convinced that you shouldn’t show a body.”
Trying out the body from Old Friend, Weisel says that “the experience was 100x better than any of the head/hands avatars I had tried previously.”
With Playroom VR, the free set of minigames that comes with Sony’s PSVR, creative director and producer Nicolas Doucet found a similar freedom when it came to creating player bodies. “We made up a rule that looking at your own body as a VR player should always be fun, or we shouldn’t show it," he says. "We thought about details that would look interesting from a VR player position. The best example is the monster’s tail, in Monster Escape. The VR player becomes a huge godzilla-like creature and by looking behind themselves, they can see a tail wagging. That usually causes some nice reactions.”
These ideas aren’t necessarily new, just perceived in a new way. The Uncanny Valley is a well-known concept, albeit one usually applied to the representation of non-player characters rather than the players themselves. But in the pursuit of realistic VR avatars, the level of fidelity has to be that much higher to be successful.
Weisel tells me it’s why “our avatar has a sort of Baymax-like look. The eyes are designed not to move and the face is designed not to require a facial expression. Based on the rest of your body language/pitch of your voice, your brain has a pretty easy time filling in this information.”
Which ties into the major revelation Weisel experienced while messing around with that noodly-armed avatar from Old Friend: “The more I played it, the more I realized when I looked down at myself: I didn’t think this was my body, but rather a body I was puppeteering," he says. "Almost like I was wearing a noodly-armed costume. Not only did it work well, but I found myself dancing and enjoying myself far more. This body was more fun than the boring one I’ve been inhabiting for the last 25 years.”
Reframing the VR avatar concept as one of puppetry seems to render a lot of the issues much more manageable. If, as a designer, you can convince the brain that it’s merely controlling this incongruous form, it has far less difficulty dealing with the discrepancies.
Timothy Johnson is developing Sluggy’s Fruit Emporium, a VR game where you manage an alien fruit stand in the guise of Sluggy, a... well, slug-like creature. “By default, you see a faint circle that represents the pupil of the eye and towards the outside of the field of view is the edge of the eyelid," he says. "Yes, the eyelid blinks and you can see it. A lot of players don’t even notice the blink, or they’ll notice the first time it happens and then completely forget the rest.”
The idea behind the conceit is to give players the subconscious markers that they’re in a ‘Sluggy suit’ rather than actually being a giant purple slug monster.
“This all ties back into how good our brains are at filling in gaps in information without us noticing," he adds. "When your eyeball moves, your brain just ignores the blurry image. And think about how many times you blink during a day; 99% of them go totally unnoticed. When I showed the game at EGX Rezzed, we had maybe 500 or so people play; of those 500, only one said they didn’t like the blinking and iris, but so many people said it helped them feel like they actually were Sluggy, or were puppeteering Sluggy.”
With Playroom VR, Doucet and his team incorporated that body incongruity as a fun aspect of the play. “One of the challenges with being non-human is that we wanted to tell the players: ‘Hey, this is what YOU look like, right now!’, but that’s very hard to do from a first person view. So we incorporated many mirrors in the game, which were in fact small screens showing what the TV players were seeing. For example, in Monster Escape, as soon as you come out of the water the first thing you see is a screen attached to a skyscraper with you on it. As you move, it reacts like a mirror. Not only does it explain what kind of character you are in VR, but it’s also very fun to watch.”
These various difficulties with creating believable, high-fidelity human avatars lead to a fairly simple solution: don’t.
“For me, there’s two issues I have with human avatars,” Weisel tells me. “Firstly, we’ve spent our whole lives learning how to analyze faces and body language. Our brain perceives false information as fact. You might be talking to someone in VR who is just casually looking at you, but if their face isn’t relaxed to match, you get the feeling they’re staring at you.”
He tells me about Facebook’s recent attempts to tackle this problem with their VR chat-app. “At first, the demo looks incredible," he says. "When they laugh, their avatars laugh. It even picks up expressions that don’t have any vocal feedback like smiling. However, if you dig deeper, these facial expressions aren’t automatic. You press a button on the controller to trigger it. Learning that kind of killed me, to be honest. Facial expressions for us are automatic and something we don’t have to think about, so having to remember to push a button to make your face match isn’t something I think is a reasonable solution.”
So naturally, sidestepping human avatars entirely removes the problem of our excellent ability to analyze facial expressions. “For the same reason you can empathize with a fish from a Pixar movie, you can achieve the same sort of empathy with a thousand times less effort using non-human avatars. And that extra effort can be put into making these avatars more interesting and fun.”
Which leads Weisel to his second point: “Human avatars are boring! I’ve been a human for my whole life… I see myself in the mirror every day and not much has changed. When you puppet a fictional character, you get to inhabit a body that’s far more interesting, because it’s something you haven’t tried before.”
That’s not to mention the benefits a non-standard body brings in terms of design and flexibility. With Sluggy’s Fruit Emporium, Johnson has been able to create a highly adaptable avatar in a number of ways. Sluggy itself is extremely stretchy, which allows Johnson to adapt to different player heights, and also provides more esoteric functionality.
“Sluggy’s arms connect a spline between the hand position and the inferred shoulder position so they’re able to stretch and change shape fluidly for any arm length or hand position," he says. "The spline allows nice deformation well beyond a person’s normal reach. This is great for roomscale VR and it’s really the core that makes Sluggy’s Fruit Emporium possible. All the customers are beyond the limits of my 3x3m room, but because Sluggy’s arms can extend beyond my own, I can reach them just fine. A human avatar would never be able to do this.”
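For the technically curious, the kind of stretchy arm Johnson describes can be sketched as a curve sampled between the hand and an inferred shoulder, here a quadratic Bézier with a midpoint control to give it a noodle-like bend. This is a minimal, engine-agnostic Python illustration of the idea, not code from the game; the point values and the bend offset are my own assumptions:

```python
# Minimal sketch of a stretchy "noodle" arm: sample points along a
# quadratic Bezier from an inferred shoulder position to the tracked
# hand position. Works for any reach, well beyond a normal arm length.
# (Names, coordinates, and the bend offset are illustrative only.)

def lerp(a, b, t):
    """Linearly interpolate between two 3D points."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def quadratic_bezier(p0, p1, p2, t):
    """Evaluate a quadratic Bezier curve at parameter t in [0, 1]."""
    return lerp(lerp(p0, p1, t), lerp(p1, p2, t), t)

def arm_points(shoulder, hand, bend=(0.0, -0.3, 0.0), segments=8):
    """Sample points along the arm spline for rendering/deformation."""
    # Control point: the midpoint pushed along a bend offset,
    # which gives the arm its soft curve instead of a straight rod.
    mid = lerp(shoulder, hand, 0.5)
    control = tuple(m + b for m, b in zip(mid, bend))
    return [quadratic_bezier(shoulder, control, hand, i / segments)
            for i in range(segments + 1)]

# The first and last samples sit at the shoulder and hand, so the arm
# always connects the body to the controller no matter how far you reach.
pts = arm_points(shoulder=(0.2, 1.4, 0.0), hand=(0.9, 1.1, 0.5))
```

Because the curve is re-sampled every frame from live tracking data, the arm deforms fluidly however far the hand moves, which is what makes the beyond-roomscale reach possible.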
Aside from the technical freedom a non-human avatar allows, it’s also inclusive by nature. “When designing and talking about Sluggy I’ve been very careful not to add any traditional gender signifiers,” Johnson tells me. “Many games only have one gender of playable characters, and that’s usually male. Even controlling an avatar on screen that doesn’t match your own gender puts a lot of people off. Now imagine that in VR, where this differently gendered body isn’t just pixels on a screen but actually lines up with your own body when you look down at it.”
As for the possibilities of what we could see in VR? Johnson is more than happy to speculate.
“There’s as much scope as you want; for any body part you don’t know the exact position of, you can guess, or fill in the blanks with regular animation or an IK rig blended in.” (He tells me Rootmotion have just released a fantastic IK rig for VR.)
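That gap-filling is exactly what an IK (inverse kinematics) rig does: given joints you do know, it solves for the ones you don't. As a toy illustration of the principle (not the RootMotion solution Johnson mentions), here is the classic two-bone case in 2D, placing an elbow between a known shoulder and a tracked hand using the law of cosines. All values are made up for the example:

```python
import math

def two_bone_ik(root, target, upper_len, lower_len):
    """Place a middle joint (e.g. an elbow) between a known root and a
    target in 2D using the law of cosines. A toy sketch of the kind of
    gap-filling an IK rig performs; real VR rigs solve this in 3D with
    pole vectors and animation blending."""
    dx, dy = target[0] - root[0], target[1] - root[1]
    dist = math.hypot(dx, dy)
    # Clamp: the chain straightens when the target is out of reach,
    # and never collapses closer than the bones physically allow.
    dist = max(min(dist, upper_len + lower_len),
               abs(upper_len - lower_len) + 1e-9)
    # Law of cosines: angle at the root between the root->target line
    # and the upper bone.
    cos_a = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
    a = math.acos(max(-1.0, min(1.0, cos_a)))
    base = math.atan2(dy, dx)
    return (root[0] + upper_len * math.cos(base + a),
            root[1] + upper_len * math.sin(base + a))

# The solved elbow preserves both bone lengths exactly.
elbow = two_bone_ik(root=(0.0, 0.0), target=(0.3, -0.4),
                    upper_len=0.3, lower_len=0.3)
```

With only a headset and two controllers to go on, this kind of solver is how a rig can invent plausible elbows, shoulders, or whatever stranger joints a non-human body calls for.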
He throws out some ideas off the top of his head:
- Flying creatures, like bats.
- Many-legged beasts; think a spider-human centaur.
- Some games have a prop hunt mode where players become any old inanimate object and try to hide. That’d work great in VR multiplayer!
- If you use a button to disconnect one limb and reconnect another, you could control far more than two arms. (Octodad VR, anyone?)
- A whole swarm of small creatures.
- Giant robots or mechs.
With the freedom on display in both Sluggy's Fruit Emporium and in the experiments NormalVR is trying out, it's not at all unrealistic to think any or all of these ideas could be realized in VR. It's a problem of implementation rather than conception; the further we move away from realism, counter-intuitive as it seems, the easier it becomes for the mind to latch onto whatever we present it with. So to any budding VR developers: go nuts.