The Neotopy team includes storytellers, filmmakers, and sound engineers who specialize in immersive VR/360 cinematic experiences. Those skills and talents are on full display in one of their most recent projects, "Expedition Antarctica", a breathtaking VR/360 experience of the nature and wildlife of Antarctica. Alexandre Regeffe recently discussed the development of the project and how Neotopy managed the VR/360 post-production process.
Q: Tell us about "Expedition Antarctica" - what is the basic premise of the project?
A: "Expedition Antarctica" is the first return to Antarctica by the famous French director Luc Jacquet, who directed the 2005 documentary "La Marche de l'Empereur" ("March of the Penguins"), which won the Academy Award for Best Documentary Feature in 2006 (see https://en.wikipedia.org/wiki/March_of_the_Penguins and https://www.youtube.com/watch?v=L7tWNwhSocE). Luc wanted to return to Antarctica to see what had changed in the years since his first documentary, so he and his filmmaking team arranged this expedition to shoot a second documentary about the wildlife there. Just before leaving, they decided to bring along a 360° camera rig to do some shots.
Q: Who is the client?
A: The main producer is Wild Touch (http://www.wild-touch.org/), an ecological organization that organizes expeditions around the world to document wildlife on Earth and the impact humans have on nature. They asked the European broadcaster Arte (http://www.arte.tv/fr), which has a 360/VR platform called Arte 360, to produce VR content for the project, and the 360 adventure began.
Q: How did you get involved?
A: There are several co-producers on this project. One of them, Paprika Films (http://www.paprikafilms.fr/), works with me on several TV projects. One day the producer said to me, "I've just returned from Antarctica with VR footage. Now, how can I post-produce it?" Well, he was speaking to the right person! After a few discussions and a demonstration of Neotopy's skills, we came on board as a co-producer. The project is a co-production between Arte France, Paprika Films, Wild Touch Productions, Andromede Oceanology, Kolor, and Neotopy.
Q: What post tasks did you do, e.g. conform, color grading, finishing, compositing, VFX?
A: We handled the entire post-production process - conform, color grading, finishing, and the creation and compositing of VFX. We also served as advisors on the storytelling, because we are creators, not just technicians.
Q: Who was the DP? And how did the shoot occur?
A: There wasn't a DP on set because the project came together very quickly. When the team decided to shoot in 360, they had no specific knowledge of or training in cinematic 360 or its cameras. Kolor provided them with the GoPro rigs and they thought, "let's shoot!" You can imagine how epic this shoot was. Despite the lack of expertise, the team captured very good scenes in this wild natural environment.
Q: Who was the director?
A: Two directors worked on the cinematic 360: Luc Jacquet himself and Jeanne Guillot.
Q: Which cameras were used in the filming?
A: The team used Freedom360 rigs, plus Abyss rigs for the underwater scenes - both GoPro-based rigs. Unfortunately, the hardware sometimes failed because of the extremely cold weather conditions.
Q: How was the stitching done? In what format did you receive the stitched material?
A: We received folders of MP4 files from the individual cameras in each rig and used Kolor Autopano Video to do the stitching. Some shots were difficult to stitch well because the rigs weren't properly set up on location, so seam lines were very visible, and we had to handle them with rotoscoping and tracking in SCRATCH VR. There are a few scenes where the stitch is not perfect, but we had budget and time limitations.
Q: What was your workflow and in general, what tools did you use?
A: The expedition team came back with a lot of shots. First, we provided content expertise and decided with the directors which shots were usable and which were not. The team experienced extreme weather conditions during the shoot, so the camera rigs sometimes failed to record properly.
Then we did a pre-stitch in low resolution so we could edit in Adobe Premiere with the Mettle plug-ins. Once the edit was locked, we did the final stitch and rendered the stitched shots as 4K ProRes (4096x2048) at 60fps, the shooting frame rate, which we wanted to preserve. Then we moved into ASSIMILATE's SCRATCH VR to do the color grading, shot by shot.
The final step was to de-noise, add some sharpness, and create visual effects to hide the rigs (but not the nadir tripods, which the directors wanted to keep). For example, in the helicopter shot in the first chapter, the steel barrel didn't exist in the shot; it was created in post to hide the rig attachment. We also created the opening scene and end credits, and we handled all of the sound, including sound editing and the spatial mix.
Q: What hardware/ software is the bare necessity for finishing in VR?
A: To post-produce VR (cinematic 360) you need huge amounts of GPU and CPU power. You're working with high-resolution frames and high frame rates, and you want to work in real time! For example, we built several workstations with multiple Titan X Pascal GPUs in order to render the final stitches as fast as possible - some optical-flow stitches, like those from the OZO camera, are very time-consuming to render.
On the software side, we chose Adobe Premiere and After Effects for the editing and FX work, combined with the Mettle Skybox plug-ins. With these tools we could rotate the whole sphere in real time, with real-time views in a headset (Oculus Rift or HTC Vive). For the color grading and finishing we rely on SCRATCH VR because it offers advanced VR tools that are streamlined into a single workflow. For example, masks automatically wrap around the left and right edges of the equirectangular image. SCRATCH VR also makes it easy to manage the highlights and lowlights within the material, drawing out the depth of shadows and softening the glares.
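The mask wrap-around behavior Regeffe describes follows directly from the geometry of the equirectangular format: the leftmost and rightmost pixel columns represent the same meridian on the sphere, so a mask that crosses one edge must continue on the other. A minimal sketch of that idea (hypothetical NumPy illustration, not SCRATCH VR's actual implementation):

```python
import numpy as np

# Equirectangular frame: the width spans 360 degrees of longitude,
# so column 0 and column W-1 are adjacent points on the sphere.
W, H = 4096, 2048

def mask_columns(center_x, half_width, width=W):
    """Column indices covered by a horizontal mask, wrapping at the seam."""
    cols = np.arange(center_x - half_width, center_x + half_width)
    return np.unique(cols % width)  # modulo stitches the left/right edges

# A mask centered near the right edge spills over onto the left edge:
cols = mask_columns(center_x=4090, half_width=20)
print(cols.min(), cols.max())  # the mask touches both column 0 and 4095
```

Any VR-aware grading tool has to apply this wrapping to every spatial operation (masks, blurs, trackers), which is exactly why flat-image tools produce visible seams.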
Q: For the deliverables, the differences in the VR headsets and browsers must be a nightmare - how do you compensate grading and finishing for each?
A: It's a big problem. With so many different devices and screens, you have to constantly check each one to see whether you get the same image as on your Grade 1 monitor -- and often you don't. We chose to grade and master in the DCI-P3 color space, delivering DPX and sometimes ProRes in BT.2020. In my opinion, headset screens need to become compliant with the BT.2020 color space -- HDR, UHD TV -- to provide the life-like colors and contrast that a good immersive experience demands.
Q: What advantages did you get in using SCRATCH VR for your post of VR/360 content?
A: The advantage of SCRATCH VR is its advanced, VR-specific feature set. We can take XML files from Adobe Premiere and conform that XML to the final stitched material in just one click. That way each shot comes in at the right length, with the transitions we chose in Premiere. We can also add titles directly on the shots, in 360 mode. The playout to the Oculus Rift is another big advantage for grading: we see the result immediately and can modify it in real time. It's very important to our business model to have SCRATCH VR as our primary VR tool.
Q: What challenges did you face in doing the color grade/finishing for "Expedition Antarctica"?
A: The primary difficulty was the color disparity between cameras on the same rig. Also, when you grade an icy or snowy environment, tint differences are immediately visible, like color spots, so we had to grade by zones and then unify the zones to get a consistent image of the landscape. De-noising was also a challenge on the H.264-captured images (the GoPro sources), because you have to deal with banding when you push the grade.
Q: In comparison to "normal" footage, how much extra-time was necessary for this project?
A: I think you can safely say that we doubled the time compared with "flat" footage. And it's far more if you work in stereoscopic 360!
Q: Where and how does the 360/VR workflow deviate from the normal workflow?
A: Cinematic 360 uses a very specific image projection. You work in a sphere, so you need tools like SCRATCH VR that can deal with equirectangular (latlong) images - by distorting 2D titles and graphics, for example. Today SCRATCH VR handles all of this pretty well!
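The distortion Regeffe mentions comes from the latlong mapping itself: each pixel column corresponds to a longitude and each row to a latitude, so horizontal pixel distance covers less real arc the further you move from the equator, and a flat 2D title pasted off-center bends when viewed on the sphere. A rough sketch of the mapping, assuming the project's 4096x2048 frame size:

```python
import math

W, H = 4096, 2048  # equirectangular: x maps to longitude, y to latitude

def sphere_to_pixel(lon_deg, lat_deg):
    """Map longitude [-180, 180) / latitude [-90, 90] to pixel coordinates."""
    x = (lon_deg + 180.0) / 360.0 * W
    y = (90.0 - lat_deg) / 180.0 * H
    return x, y

def pixel_to_sphere(x, y):
    """Inverse mapping: pixel coordinates back to longitude/latitude."""
    lon = x / W * 360.0 - 180.0
    lat = 90.0 - y / H * 180.0
    return lon, lat

# Horizontal scale stretches toward the poles by 1/cos(latitude), which
# is why a flat title placed away from the equator appears warped:
stretch_at_60deg = 1.0 / math.cos(math.radians(60.0))
print(sphere_to_pixel(0, 0))       # frame center: (2048.0, 1024.0)
print(round(stretch_at_60deg, 2))  # 2.0 - twice the equatorial stretch
```

A VR-aware titler pre-distorts the overlay through this mapping so that, once the frame is wrapped back onto the sphere in the headset, the text reads as flat and undistorted.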
Q: Did you use special tools and plug-ins / add-ons for this project?
A: We used plug-ins like Skybox from Mettle; they are designed to work in a sphere by creating a virtual rig in After Effects, and they also provide contextual blur and sharpen effects, among many other things. We also used Neat Video as an OpenFX plug-in in SCRATCH VR to deal with noise.
Q: If someone wants to get into 360 finishing, what challenges are waiting for them?
A: Underestimating render times! You'll have to test your decisions all the time and go back and forth between steps, so you need a flexible pipeline.
Q: If you could start the project all over again, would you do anything differently?
A: Being involved as early as possible is essential, in this and every cinematic 360 project. You can then advise the DP and director on key shooting decisions, such as camera positions that minimize parallax and avoid seams, and share your experience with different camera rigs to help the producer choose the best solution for each shot.
Q: What are the best VR examples you have seen to date?
A: Of course Neotopy's projects come to mind, but today there are thousands of 360 videos available. The key is not technical achievement or being the "world's first"; for me, the key is storytelling. It's all about the stories and the ideas. People should watch "Invisible", a VR drama series. I love the National Geographic videos because they show the beauty of nature in 360. VR journalism is also well worth exploring, such as the New York Times VR experiences. You need to discover and experience VR yourself - it's so easy to get a headset and begin exploring.
See "Expedition Antarctica" at http://future.arte.tv/fr/antarctica/expedition-antarctica-360deg