Stereoscopic virtual reality technology lets fans experience live sports like never before.
It used to be enough to listen to the baseball game on the radio; then television brought the audience into the game. Color, graphics and digital overlays like the superimposed first-down yard line or the moving swim-lane leader marks followed, making it easier to see the action unfold.
Once these technologies arrived, it became hard to imagine sports without them.
“The digitization of sports is completely changing how fans experience sports,” said Jeff Hopper, head of strategy & marketing for the Intel Sport Group (ISG). “Now you have a completely personalized way to engage with your sport in any way you want, how you want it, when you want it – from your own perspective.”
Historically, the footage spectators could watch from a live event was limited to whatever broadcasters chose to air, and further constrained by the camera operators’ angles. New VR technology by Voke VR, recently acquired by Intel, is immersing fans in basketball and football games, allowing them to see the game from nearly any angle on the field.
“The linear broadcast model is deteriorating,” said Hopper, adding that spectators are ready to cut the cord on their cable packages. “The sports world and broadcast networks are starting to look at these new solutions, to figure out how to give fans control over how and when they watch the game.”
Hopper said live sports is the last medium where fans can’t control the experience, but that’s changing. He points to the way people consume music and movies: Pandora, Spotify and iTunes, for example, let us personalize our music experience, while Netflix and Amazon Video let us watch what we want when we want it.
“Now when you’re at home and you want to engage in the game as if you were there, you can,” said Hopper. “What you’re going to have on your tablet or your PC or your phone is going to give you the ability to dynamically go into the game and see viewpoints from any angle you want.”
How Does Voke VR Work?
Dr. Sankar (Jay) Jayaram, a co-founder of Voke and diehard Seattle Seahawks fan, was annoyed at missing a game when the idea struck him – what if you could use VR to put people in the stadium to watch the game? What if you could bring a live VR experience to spectators anywhere in the world? That was in 2007 when the headsets of today didn’t even exist. He had been working with a team of others to study using VR for museums and exercise bikes.
Dr. Uma Jayaram, co-founder of Voke VR.
Jayaram had long been thinking about VR. He and his wife Dr. Uma Jayaram, Voke’s co-founder and COO, ran the Virtual Reality and Computer Integrated Manufacturing Lab at Washington State University, where they were both professors.
One of the earliest VR experiences they created required a person to lean into a toy Viewmaster glued to a computer screen. It earned them funding to build more VR technologies, starting with a prototype of a headset that weighed about six pounds and had a giant beak to house the electronics.
Dr. Sankar (Jay) Jayaram holds his first VR headset prototype. It weighed six pounds and was, he said, “enormous.”
These foreshadowed what would become Voke VR, a company the Jayarams co-founded with a team in 2004. With just 15 people, they grew it into a company poised to change live experiences.
“People told us live VR could never be done,” Jayaram said. “People couldn’t imagine how we could process all the data in real time.”
“But my team knows better. We all know that once we set our minds to something we can make it work.”
For its first attempt, the team set up a huge rack of computers and cameras at an NBA game in 2010.
“It took us a month and a half to process the data to produce a five-minute playback,” said Jayaram.
Space was another problem – the impracticality of having racks of computers courtside quickly became clear – so they shrank the hardware required to livestream in VR. Within two years and with a lot of tinkering, they were able to process events in real time with a computer system small enough to fit under a table.
“It took a lot of hard work from our engineers and a lot of thinking through the process and figuring out how we can maximize the use of computing and graphics and CPU, and GPU, and algorithms,” he said.
The Voke team looks at the live feed during Tampa Bay Buccaneers vs New Orleans Saints at Raymond James Stadium in Tampa, Florida on Dec. 11, 2016. ©2016 Scott A. Miller
Each Voke camera rig typically holds 12 or more cameras that capture a stereoscopic view – 180 to 360 degrees of action, plus depth – so viewers feel like they’re at the event.
By wearing a VR headset and connecting via the Voke app, spectators can watch a touchdown from several angles, or turn their heads and see who’s throwing peanuts or spilling beer in the stands. Those without a VR headset can see panoramic 360-degree views from their tablet, phone or PC.
A Voke camera rig captures the action as Tampa Bay Buccaneers play New Orleans Saints at Raymond James Stadium in Tampa, Florida on Dec. 11, 2016. ©2016 Scott A. Miller
To make the experience possible, the live footage captured by the cameras – about 48 for an NFL game, for example – is color calibrated, stitched together and rendered into a digital version in real time. This requires massive processing power: Intel Xeon processors move the data from the field to the viewer as the action happens.
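The per-frame stages described above can be sketched in miniature. This is purely illustrative pseudocode in Python, not Voke’s actual software: the function names are invented, and a naive brightness adjustment and frame concatenation stand in for real color calibration and panoramic stitching.

```python
# Illustrative sketch of a calibrate -> stitch -> render pipeline.
# All names are hypothetical; real stitching aligns overlapping images,
# which is far more involved than the concatenation shown here.

def color_calibrate(frame, gain=1.0):
    """Adjust brightness so footage from different cameras matches."""
    return [min(255, int(px * gain)) for px in frame]

def stitch(frames):
    """Combine per-camera frames into one panoramic strip
    (naive concatenation stands in for real image stitching)."""
    panorama = []
    for f in frames:
        panorama.extend(f)
    return panorama

def render(panorama):
    """Package the stitched frame for delivery to viewers."""
    return {"pixels": panorama, "width": len(panorama)}

# Example: three tiny 4-pixel "frames" from three cameras.
cameras = [[10, 20, 30, 40], [50, 60, 70, 80], [90, 100, 110, 120]]
calibrated = [color_calibrate(f, gain=1.1) for f in cameras]
frame_out = render(stitch(calibrated))
print(frame_out["width"])  # 12 pixels in the stitched strip
```

The real system runs stages like these on dozens of high-resolution streams at once, which is why the article emphasizes the processing power required.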