Director Art Haynie On Shooting Concerts In 360°

December 5, 2016

Director-producer Art Haynie jumped into immersive video technology with a concert film that preserves for 360-degree posterity the last U.S. date played this year by Eagles of Death Metal. The gig was part of the Lost Highway motorcycle show and concert at the San Manuel Amphitheater in San Bernardino, CA — essentially home turf for the band, which is based out of Palm Desert — and it's expected to be available soon for streaming online and via apps. The project posed new challenges that took him back to the basics in both production and post. "It's made me better at all of my work," he told StudioDaily. "I don't skip questions anymore. I go back to asking those and thinking through problems more thoroughly. It's a really good mental exercise." We asked him about the principles he embraced on the shoot and what he learned from the experience.

 

Keep the Camera Stationary. Haynie knew this wouldn't be a typical concert film, with roving cameras aiming to capture the band from just the right angle. "You need to plant the camera [in VR], because when it moves it makes people seasick," he says. "If you're driving, going up a twisty road doesn't bother you. As you turn the wheel, your brain knows what it's going to feel like, so it talks your stomach off the ledge. But if you're in the passenger seat, you don't have that communication between your body and your brain. In VR, it's a similar experience — if the visuals say you're moving, but your body's not, it's freaky."

Pictured above: Director Art Haynie

 

Take People Where They've Never Been Before. When considering the perspective for the 360-degree camera, Haynie knew that it wouldn't be particularly interesting to try to put viewers in the audience. "What makes a good director is taking people places and telling stories," he notes. "In the bits of concerts I had seen, it was like 2D concepts had been jammed into VR. I thought, 'I don't want to be in the crowd. I can be in the crowd at any show, in person. I'm taking people somewhere I can't go.' So I put the camera right in front of the singer's mic. You can practically take guitar lessons from him. You can see what pedals he's pushing and read his setlist."

 

Know Your Rigs and Where to Put Them. "The best [360-degree] cameras are still GoPros, but I don't like the factory rigs," Haynie says. "There can be heating issues and I don't like where their stitch lines fall, so a friend of mine built a custom rig with multiple configurations, as four- or six-camera set-ups. So the front camera was a custom-built rig and the back set-up was a pair of Kodak cameras — which wasn't the best solution, but the shoot came together rather fast. We practiced putting everything together, setting up, and getting running. I set up two rigs, both with multiple cameras, and assembled what I could before the show behind the stage. When the first band went off, here comes the fire drill. I had measured Jesse, the singer, by height to his eyeline. And we put the second rig at approximately the drummer's eyeline at the back of the stage."

 

Don't Be Afraid to Cut. "People say they want immersive video to be interactive, but you can't stand there and spin around for longer than a minute or so," Haynie says. "So I edit just like you would in a normal production and feed the audience the best takes. If they want to look around, that's great. But they don't have to. Once you settle in and realize you're being fed the best angles, then it's a much better experience. But we also realized that you can't make really quick cuts. Some of our cuts were only made because there were massive stitch errors that would have cost a ton to fix. That's another reason to do multiple rigs."

 

VR Concert Audio Is More Than Music. Haynie remembers discussions with Margarita Mix Senior Technical Engineer Pat Stoltz and re-recording mixer Nathan Dubin in Santa Monica, where he came to the conclusion — along with the experienced audio pros — that the concert would be primarily an aural experience, and that using head-tracking (to shift the sound field as viewers turned their heads) would create a distraction from the music. As it turns out, that was the wrong approach. "We thought, 'This is a music experience, so we don't want the whole world to turn when I turn my head because it will sound terrible,'" he says. "But in the studio, cutting with the goggles on, it wasn't a music experience. It was an immersive experience, and music was one component of it. And we did a complete 180 on what we had planned. I wanted more interactivity. I wanted it to feel like I was there."

 

Haynie said the optimal approach isn't always apparent, especially when the most effective ways of working in the medium are still being discovered. From this project, he learned that it makes sense to reconsider conventional wisdom about how a program should look and sound when you're making a more immersive experience.

 

"Nathan had done multiple mixes with varying levels of interactivity built into them, and he said he had shown them around the office to pro audio mixers, and they didn't like the really interactive ones. They said, 'It really wrecks the music.' " Haynie recalls. "But they're guys who want the utmost musical experience, and that's not what this is. There's so much more to it, and it hasn't been done before. We're figuring it out and setting precedents. That's really fun to me."

 

Read on for more perspective from Margarita Mix's Pat Stoltz on mixing audio for 360-degree video and VR:

 

Capturing Audio Invisibly, Using Sound to Guide the Visual Experience, and Dealing with Multiple Delivery Formats.

 

FotoKem company Margarita Mix has added 360-degree sound-mixing capabilities to its slate of audio post services in Santa Monica and Hollywood. What does that mean? Right now it involves bringing a lot of know-how to bear on an emerging post-production challenge — mixing sound for a spherical environment that can only be properly experienced while wearing a head-mounted device (HMD). Different audio formats and varying creative approaches to the VR experience multiply the possibilities. We asked Senior Technical Engineer Pat Stoltz, who recently worked with director Art Haynie of Big Monkey Films on an Eagles of Death Metal concert film that used the Dolby Atmos VR audio format, about emerging best practices in VR audio workflow. 

 

Think Carefully About Audio Capture. It's easy to forget that shooting 360-degree experiences means you can no longer hide your crew and your equipment just outside the frame. In a 360-degree video, there is no frame. "You can't have a boom operator there in the room because he's going to be seen," Stoltz says. "Everything is seen in VR. So you have to hide the cables, and you have to hide the microphones, and you have to hide the people who are on set directing and recording. If the audio is left to be recorded by the on-board microphones on the cameras, 90 percent of the time that will be completely unusable. So there needs to be greater attention to putting wireless microphones on the actors and acquiring a good quality recording on set. You can't have extra cameras everywhere, cable runs have to be hidden or wireless, and any people supporting the production, from the scriptwriter to the sound mixer, have to be hidden from view."

 

Audio Cues Can Be as Important as Visual Cues. "Audio has to be thought of during the scripting of the piece that you're working on," Stoltz advises. "You have to use it as part of the whole scene. You're going to want to draw the gaze of the viewer to particular areas of the 360-degree environment. How do you do that? You can do it through visual stepping stones that take you over to a specific area, or you can do it via audio cues. You hear something behind you, so you turn to see what you just heard, and then the visual cue kicks in. So you have to think about audio as a very creative aspect of your production."

 

Audio Workflow for VR Is Evolving. Stoltz says Margarita Mix works primarily with an equirectangular picture file, which is a flattened representation of a 360-degree video. That is projected on-screen with a superimposed grid with reference markers indicating where 0-, 90-, and 180-degree points would fall in the spherical space. "You'll place your objects in the mix with that in mind," Stoltz says. "Unfortunately, you can't mix while wearing an HMD because you can't see what you're doing. So you place things in software based on quadrants [of the sphere], or if you have an actor there, you can just place their dialog directly over the performer. And once you get that mixed in and the levels all set, you go to a different computer that's locked to Pro Tools or whatever DAW you're using, put on the goggles, and view it and dial it in there. It's a cumbersome, back-and-forth workflow, but it's coming along." Stoltz imagines a future where mixers wear augmented-reality goggles, like Microsoft's still-in-development HoloLens, that would allow them to see the spherical content and their mixing console simultaneously.
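
To make the quadrant idea concrete, here is a minimal sketch (not Margarita Mix's actual tooling) of how a direction in the spherical mix lines up with a spot on the equirectangular reference frame Stoltz describes. The frame size and angle conventions are assumptions; individual DAWs and panning plug-ins differ.

```python
def equirect_xy(azimuth_deg, elevation_deg, width=3840, height=1920):
    """Map a direction on the sphere to pixel coordinates on an
    equirectangular frame, so a panned object can be lined up with
    the performer it belongs to.

    Conventions assumed here (tools differ): azimuth 0 = straight
    ahead, +90 = hard left, +/-180 = directly behind; elevation
    0 = horizon, +90 = straight up; image x increases to the right,
    y increases downward.
    """
    x = (0.5 - azimuth_deg / 360.0) * width     # longitude -> horizontal pixel
    y = (0.5 - elevation_deg / 180.0) * height  # latitude  -> vertical pixel
    return x % width, y

# Example: a player 30 degrees to the listener's right, just above the horizon.
print(equirect_xy(-30, 10))   # roughly (2240.0, 853.3) on a 3840x1920 frame
```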

 

VR Deliverables Are … Complicated. The Ambisonic B-format — a representation of a 360-degree sound field — can be delivered in one of two streaming standards: FuMa or AmbiX. (The difference between the two is the order of the four channels of the audio mix: AmbiX uses WYZX while FuMa uses WXYZ.) Earlier this year, Facebook acquired Two Big Ears, which uses a proprietary .tbe file that requires an embedded decoder for audio metadata on the playback computer or device, as does Dolby Atmos VR. And a new format from the German government-backed Fraunhofer Institute, Cingo, is on the horizon. The multiplicity of standards is not ideal, but Stoltz hopes it will get simpler. "For Jaunt, you're going to need Dolby Atmos. For Facebook, you're going to need a .tbe file. And for YouTube or Vimeo and other streaming services, you're going to need a FuMa or AmbiX file," he explains. "Currently, you have to wrap those individually [in separate MP4 containers]. For the future, I'm hoping you can wrap all four of those into that original MP4 and deliver that, and whatever software you have will automatically read the correct file. But right now we have to specifically wrap for whatever format we're mixing for."
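
For reference, the channel-order difference between the two conventions is small enough to express in a few lines. The sketch below reorders a first-order B-format stream both ways; it also applies the W-channel gain offset that conventionally goes with FuMa (whose W channel sits about 3 dB lower than AmbiX's SN3D normalization), an assumption worth checking against whatever encoder or DAW is actually in the chain.

```python
import numpy as np

# FuMa first-order channel order: W, X, Y, Z
# AmbiX (ACN) first-order order:  W, Y, Z, X

def fuma_to_ambix(fuma):
    """fuma: float array shaped (4, num_samples) in W, X, Y, Z order."""
    ambix = fuma[[0, 2, 3, 1], :].copy()   # reorder to W, Y, Z, X
    ambix[0, :] *= np.sqrt(2.0)            # lift W from FuMa level to SN3D
    return ambix

def ambix_to_fuma(ambix):
    """ambix: float array shaped (4, num_samples) in W, Y, Z, X order."""
    fuma = ambix[[0, 3, 1, 2], :].copy()   # reorder to W, X, Y, Z
    fuma[0, :] /= np.sqrt(2.0)             # drop W back to FuMa level
    return fuma
```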

 

Head-Tracking or Non-Head-Tracking? One of the benefits of Dolby Atmos VR, as well as the new Fraunhofer format, is support for what's known as higher-order Ambisonics, which includes the ability to specify, via metadata, that certain sounds should track along with the viewer's head movements while others should remain stationary in the mix. "Let's say you're snowboarding down a mountain, wearing headphones with light music playing," Stoltz suggests. "When you rotate your head, you don't want the music to rotate and change orientation. You want it to stay in your ears. But other skiers going by you, or people talking? When you turn your head, you want that orientation to change. That's the difference between head-tracking and non-head-tracking. But if you're delivering a FuMa or AmbiX file, you don't get the non-head-tracking option. Everything is head-tracking, all the time."
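
The split Stoltz describes comes down to which sounds get counter-rotated against the listener's head before decoding. The sketch below only illustrates that distinction and is not any particular renderer's API: `decode` stands in for whatever binaural or speaker decoder is in use, and the rotation sign convention is an assumption that varies between tools.

```python
import numpy as np

def rotate_foa_yaw(wyzx, yaw_rad):
    """Rotate a first-order AmbiX stream (channels W, Y, Z, X) about the
    vertical axis. W (omnidirectional) and Z (up/down) are unchanged by a
    yaw rotation; only the horizontal components X and Y mix together."""
    w, y, z, x = wyzx
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_rot = c * x + s * y
    y_rot = -s * x + c * y
    return np.stack([w, y_rot, z, x_rot])

def render_block(ambisonic_bed, head_locked_stereo, head_yaw_rad, decode):
    """Head-tracked material is counter-rotated against the head and then
    decoded; the head-locked bed (the snowboarder's music) bypasses the
    rotation entirely, so it stays 'in your ears' as the head turns."""
    tracked = decode(rotate_foa_yaw(ambisonic_bed, -head_yaw_rad))
    return tracked + head_locked_stereo
```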

 

If audio for VR remains complex, it's the job of experts (like pro sound mixers) to simplify — to cut through the white noise of all the various formats and options and clearly define the creative choices that are open to the filmmakers they collaborate with. Stoltz says it's worth the effort. "It really takes the creative embellishing of a visual piece to the next level," he says. "When you, as a viewer, put those goggles on, you are the director. No two people are going to have the same experience. You're always going to be looking somewhere else, and hearing different things. Creatively, the director and mixer have to focus on what they want their viewer to experience, and how to guide them to what they want them to focus on at any given time in the piece.

 

"Before, you had editors that cut the visual piece, and the audio just drew you in. You sit back and watch it as a third-person. But when you put goggles on for VR, that is your reality. And audio plays an important role in representing the reality you are in."
