Virtual Reality is on the rise! From gaming to construction, this immersive technology has found a wide range of applications and attracted many tech enthusiasts within the span of a decade. By creating simulated environments that users can immerse themselves in, VR is set to gain even more traction in the coming years. One of the key parts of this hyper-immersive world is NPCs (non-player characters) and animated surroundings, which are also found in RPGs that are not VR-based. Ever wondered how they are created? It is all possible thanks to artificial intelligence (AI).
Whether, as a gamer, you prefer shooting games like Counter-Strike, action-adventure games like GTA V, or racing games like Need for Speed, every element in these games is designed and controlled by AI. The most common application of AI in video games is controlling NPCs. NPCs often appear intelligent and can initiate or participate in a conversation; this happens because developers design such characters with algorithms that make them behave intelligently. One such algorithm is the Finite State Machine (FSM), which was introduced to video game design in the 1990s. Using an FSM, a game developer enumerates all possible situations an AI could encounter and then programs a specific reaction for each one. If a VR game console is equipped with a neuromorphic chip, it can collect data that can be used to make NPCs adapt to the actions of the gamer.
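To make the idea concrete, here is a minimal FSM sketch in Python for a hypothetical guard NPC. The states, distance thresholds, and class name are illustrative assumptions, not taken from any specific game engine: the point is that every anticipated situation (player far away, player spotted, player in striking range) maps to exactly one programmed reaction.

```python
from enum import Enum, auto

class State(Enum):
    PATROL = auto()
    CHASE = auto()
    ATTACK = auto()

class GuardNPC:
    """A guard NPC driven by a finite state machine.

    The designer enumerates the situations in advance:
    PATROL when the player is out of sight, CHASE when the
    player is spotted, ATTACK when within striking range.
    """
    SIGHT_RANGE = 10.0   # illustrative threshold (game units)
    ATTACK_RANGE = 2.0

    def __init__(self):
        self.state = State.PATROL

    def update(self, distance_to_player):
        # Each anticipated situation maps to one specific reaction.
        if distance_to_player <= self.ATTACK_RANGE:
            self.state = State.ATTACK
        elif distance_to_player <= self.SIGHT_RANGE:
            self.state = State.CHASE
        else:
            self.state = State.PATROL
        return self.state

guard = GuardNPC()
print(guard.update(15.0))  # State.PATROL
print(guard.update(8.0))   # State.CHASE
print(guard.update(1.5))   # State.ATTACK
```

The strength of this design is predictability; its weakness, as the article notes, is that the NPC can only react to situations the developer anticipated, which is where adaptive data-driven approaches come in.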
Earlier generations of gaming technology relied on keyboards, joysticks, mice, and remote controllers, which often restricted game characters or avatars to limited, repetitive movement. In VR, the immersive world demands that the simulated environment feel as realistic as possible, and it promises an almost infinite range of body movements. Researchers from the University of Bath (UK) are leveraging AI to enhance the movements of VR game characters and make them look lifelike. They teamed up with the British game studio Ninja Theory (Hellblade VR, DmC: Devil May Cry, Enslaved: Odyssey to the West) on a research project called AI Touché, in which AI was used to program sword fighting. Actors were hired to help with the project as well.
The actors donned motion-capture suits and fought with swords, and the recorded movement patterns served as training material for AI Touché. Game developers could then customize the behavior of swordsmen and other AI-controlled characters using several parameters, such as combat experience and aggressiveness. In the trial phase, twelve volunteers took part in two three-minute battles in VR: first they experienced sword fighting built with traditional technology, and then with AI Touché. The volunteers preferred AI Touché, as it offered more complex and less repetitive movement.
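How might parameters like combat experience and aggressiveness shape an AI fighter's behavior? The sketch below is purely hypothetical (the class, move names, and probabilities are invented for illustration and are not how AI Touché actually works): it treats aggressiveness as the chance of attacking rather than defending, and experience as how reliably the fighter counters the player's last move instead of acting randomly.

```python
import random

class SwordFighterAI:
    """Hypothetical parameter-driven opponent, loosely inspired by
    the article's description; not the actual AI Touché system.

    aggressiveness: probability of attacking rather than defending.
    experience: probability of picking the correct counter to the
                player's last move instead of acting on instinct.
    """
    ATTACKS = ["thrust", "slash"]
    DEFENSES = ["parry", "dodge"]
    COUNTERS = {"thrust": "parry", "slash": "dodge",
                "parry": "slash", "dodge": "thrust"}

    def __init__(self, aggressiveness=0.5, experience=0.5):
        self.aggressiveness = aggressiveness
        self.experience = experience

    def choose_move(self, player_last_move=None):
        # An experienced fighter reads the opponent and counters.
        if player_last_move and random.random() < self.experience:
            return self.COUNTERS[player_last_move]
        # Otherwise, temperament decides: attack or defend.
        if random.random() < self.aggressiveness:
            return random.choice(self.ATTACKS)
        return random.choice(self.DEFENSES)

veteran = SwordFighterAI(aggressiveness=0.8, experience=0.9)
print(veteran.choose_move("thrust"))  # usually "parry"
```

Tuning two numbers yields distinct fighting personalities, which matches the article's point that designers could dial in behavior rather than hand-script every exchange.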
VR is already used in the film industry, but have you ever wondered what a VR film would be like?
Recently, the Toronto-based studio Transitional Forms and the National Film Board of Canada made a VR movie called Agence. It used reinforcement learning (RL) to control its animated characters, hinting at the next stage of filmmaking. Reinforcement learning is a subset of machine learning concerned with how software agents ought to take actions in an environment so as to maximize a cumulative reward.
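The reward-seeking loop behind RL can be shown with a tiny tabular Q-learning example. This is a generic textbook sketch, not the system used in Agence: an agent in a five-cell world earns a reward only at the goal cell, and repeated trial and error fills in a table of action values until "step right" dominates.

```python
import random

# A toy 1-D world with cells 0..4: the agent starts at cell 0 and
# receives a reward of +1 only on reaching the goal cell.
GOAL = 4
ACTIONS = [+1, -1]                     # step right / step left
Q = {(s, a): 0.0 for s in range(GOAL + 1) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

random.seed(0)
for _ in range(200):                   # training episodes
    state = 0
    while state != GOAL:
        # epsilon-greedy: mostly exploit the table, sometimes explore
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        next_state = min(max(state + action, 0), GOAL)
        reward = 1.0 if next_state == GOAL else 0.0
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        # Q-learning update: nudge the estimate toward
        # the reward plus the discounted best future value.
        Q[(state, action)] += alpha * (reward + gamma * best_next
                                       - Q[(state, action)])
        state = next_state

# The learned policy steps right from every cell toward the goal.
policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(GOAL)]
print(policy)
```

The creatures in Agence operate on the same principle, only with far richer states and rewards: no one scripts their behavior frame by frame; they learn what to do from the rewards their environment provides.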
Director Pietro Gagliano and producer David Oppenheim revealed to MIT Technology Review that the basic plot of the film revolves around a group of creatures and their appetite for a mysterious plant that appears on their planet. “Can they control their desire, or will they destabilize the planet and get tipped to their doom? Survivors ascend to another world. After several ascensions, there is a secret ending,” says Oppenheim. The movie is highly interactive: using VR controls or a gamepad, viewers can grab characters and move them around, plant more giant flowers, and help balance the planet, while the reinforcement-learning agents pursue their own rewards.
Though the movie wasn’t a box-office hit, it plants the seed of a new cinematic technique in a new filmmaking medium. There are a few more notable VR films, such as Gabo Arora and Chris Milk’s virtual reality documentary Waves of Grace and Henry from Oculus Story Studio. When powered by AI, filmmaking in VR will surely take a momentous leap. Even now, neural networks are contributing to the creation of CGI animated characters in movies, while IBM’s Watson is employing interactive voice recognition to help with voice commands in the VR world.