Virtual reality (VR) is the latest "cool" technology and you can now choose from a plethora of virtual reality headsets and systems to play with. However, it is still in its infancy and the technology is not yet truly immersive.
You may have heard mention of 6DoF, which stands for "six degrees of freedom". This refers to how much freedom of movement a rigid body has in three-dimensional (3D) space: if you imagine the X, Y and Z axes, the body needs to be able to move forward or backward (known as "surge"), up or down ("heave") and left or right ("sway"), and also to rotate about each of those three axes (pitch, yaw and roll) – three translational and three rotational degrees, six in all.
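The six degrees described above can be sketched as a simple data structure – a minimal illustration, with field names taken from the terms in the text rather than from any particular VR SDK:

```python
from dataclasses import dataclass


@dataclass
class Pose6DoF:
    """A rigid body's pose: three translational and three rotational degrees."""
    # Translational axes (metres)
    surge: float = 0.0  # forward / backward
    heave: float = 0.0  # up / down
    sway: float = 0.0   # left / right
    # Rotational axes (radians)
    pitch: float = 0.0  # tilt up / down
    yaw: float = 0.0    # turn left / right
    roll: float = 0.0   # lean side to side


# Half a metre forward, a quarter-radian turn to the side
pose = Pose6DoF(surge=0.5, yaw=0.25)
```

A headset that tracks all six values can place the viewer anywhere in the scene; one that tracks only the three rotational values (3DoF) can only look around from a fixed point.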
6DoF is of particular interest to everyone in the burgeoning VR industry, because at the moment there is still a limit to what you can achieve with monoscopic 360-degree videos – the images still appear flat.
Computer-animated VR experiences have the 6DoF problem sorted: because everything is rendered on the fly, the VR headset can use room-scale tracking to work out exactly where the player is at any moment.
If you know exactly where the player is, then motions made by the user, such as moving backwards or forwards, can be translated into movements within the game.
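That translation step is, at its core, very simple: take the change in the tracked head position between frames and apply it to the in-game camera. A minimal sketch (the function name and the metres-to-game-units scale factor are illustrative, not from any real engine):

```python
def head_delta_to_game_motion(prev, curr, scale=1.0):
    """Map the change in tracked head position (x, y, z in metres)
    between two frames to a camera translation in game-world units."""
    return tuple(scale * (c - p) for p, c in zip(prev, curr))


# Player's head moves 0.1 m right and 0.3 m forward between frames
move = head_delta_to_game_motion((0.0, 1.7, 0.0), (0.1, 1.7, 0.3))
# move == (0.1, 0.0, 0.3)
```

Real engines add prediction and smoothing on top of this to hide tracking latency, but the principle is the same.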
Light field cameras and holograms
But if you want to do this with a human in a recorded video – such as Björk's 360-degree music video for Stonemilker – then at the moment it's only possible with extremely expensive image-capture technology.
The startup Lytro makes cameras that capture the geometry of light in a scene to make live-action 360-degree scenes possible. However, its light field cameras cost hundreds of thousands of dollars and can only be afforded by high-end Hollywood blockbuster productions.
There's also New Zealand VR technology firm 8i, which is creating super-realistic 3D hologram videos using computer algorithms and a green-screen studio containing 41 cameras – a process it calls "volumetric capture".
8i wants its technology to be used in films, as well as in a consumer smartphone app it is developing that lets you place pre-made holograms in a room at home, examine them through the phone's camera lens and even take a selfie with a hologram – a sort of blend between virtual reality and augmented reality.
Facebook's Surround 360 cameras
Then there's Facebook, which announced two new Surround 360 cameras at its F8 conference on 19 April; both attempt to replicate the 6DoF effect, with six or 24 cameras arranged in an orb formation.
The cameras, built by imaging firm FLIR, capture what is going on, and Facebook's proprietary software then computes where every single pixel is located in the scene to estimate depth. Once each pixel has an exact location, users can view the video from any angle and perspective, almost as if it were a real-life scene, and the video will look 3D.
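Facebook's exact per-pixel method is proprietary, but the underlying idea comes from classic multi-camera stereo: a point seen by two cameras shifts between their images (the "disparity"), and that shift reveals its depth. A sketch of the textbook relation, depth = focal length × baseline ÷ disparity (illustrative only, not Facebook's algorithm):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Estimate a pixel's depth from its disparity between two cameras.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- distance between the two cameras, in metres
    disparity_px -- horizontal shift of the point between the images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# A 10-pixel disparity, 700 px focal length, 6 cm camera spacing
d = depth_from_disparity(700, 0.06, 10)  # about 4.2 metres
```

Nearby points produce large disparities and distant points small ones, which is why densely packed camera orbs help: many overlapping pairs give a disparity estimate for almost every pixel.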
Facebook does not intend to sell its cameras as products, but rather to licence the camera designs so that content producers and developers can create more engaging videos that feature a mix of live action and computer graphics.
Adobe's 6DoF breakthrough using just one 360-degree camera
Multimedia creativity software firm Adobe has also been working on solving the 6DoF problem, and its scientists believe they have managed it without needing a lot of expensive technology – just one single 360-degree camera.
In an open-access paper entitled 6-DOF VR Videos with a Single 360-Camera, the researchers describe how they developed a warping algorithm that can work out how monoscopic 360-degree footage would look if it were stereoscopic, by inferring the camera path and the 3D geometry of the scene and removing any distortions that would otherwise appear.
The result is a monoscopic 360-degree video that, when played in a VR headset, becomes a stereoscopic video allowing 6DoF at more than 120fps.
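The core move in this kind of warping is view synthesis: once a pixel's depth is known, it can be lifted into 3D, the virtual camera shifted, and the point projected back into the new view. A minimal sketch, shown for an ordinary pinhole camera for simplicity (360-degree footage uses a spherical projection, and this is not Adobe's published algorithm):

```python
def warp_pixel(u, v, depth, f, cx, cy, t):
    """Re-project one pixel into a view from a camera translated by t.

    (u, v)   -- pixel coordinates in the source image
    depth    -- the pixel's estimated depth, in metres
    f        -- focal length in pixels; (cx, cy) -- principal point
    t        -- camera translation (tx, ty, tz), in metres
    """
    # Back-project the pixel to a 3D point in camera space
    X = (u - cx) / f * depth
    Y = (v - cy) / f * depth
    Z = depth
    # Express the point relative to the translated camera
    Xn, Yn, Zn = X - t[0], Y - t[1], Z - t[2]
    # Project back into the image with the pinhole model
    return (f * Xn / Zn + cx, f * Yn / Zn + cy)


# Shifting the camera 10 cm to the right moves a centred point
# at 2 m depth 25 pixels to the left
u2, v2 = warp_pixel(320, 240, 2.0, 500, 320, 240, (0.1, 0.0, 0.0))
```

Note that nearby points shift more than distant ones, which is exactly the parallax that makes the result read as 3D; the hard part, and the subject of the paper, is filling the regions the original camera never saw.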
The algorithm can also be used to stabilise 360-degree video, or to create different versions of a video to suit viewers with varying levels of motion comfort, but it only works if the camera that shot the original footage was moving at the time of filming.
"We assume the camera moves to infer the depth," Adobe's head of research Gavin Miller toldVariety. "If the camera rotates but does not move side to side we cannot compute depth but can stabilise the rotation."