How VR Was Used In A Game Of Thrones' Scene

13 July, 2019

In the Game of Thrones series finale, Jon Snow took it upon himself to take Daenerys Targaryen’s life, thus ending her short-lived reign as Queen of Westeros. He did it because she had destroyed the city of King’s Landing, and he felt she would eventually harm those he loved if she wasn’t stopped. Love it or hate it, the scene was powerful, and that was before Drogon climbed into the room.

Sensing his mother was dead, Drogon nudged Daenerys’ lifeless body. Instead of taking his anger out on Jon, the dragon chose to destroy the Iron Throne, the source of his family’s problems since they’d arrived in Westeros. Under the relentless heat of his breath, Aegon the Conqueror’s throne was turned into slag.

The scene was put together using a digital dragon, real-life actors, and fire that split the difference. Speaking with Befores & Afters, members of the Game of Thrones production department explained how it all came to be.

“This was the first time we had used a VR setup for the show,” said VFX producer Steve Kullback. “The Third Floor created a set piece based on the model provided by the art department, and had it functioning in a real-time engine so that you could put the goggles on and then, almost with a laser pointer device through the VR, you could point to any part in the set, and click the button, and be there, and then turn around, and set your camera position, and then those could be saved.” Very fancy.

VFX supervisor Joe Bauer explained how showrunners David Benioff and Dan Weiss, who directed “The Iron Throne,” were able to create a “mini-library of camera views” for the scene. “So by the time we took it over and started working with Pixomondo, who was the lead dragon animator, and Scanline, which did the throne melting, there’d already been a lot of R&D for the look and the angles and the visual approach.”

Adam Kiriloff, a senior real-time technical artist at The Third Floor (the pre-visualization, post-visualization, and virtual reality company that helped make this scene happen), spent four months in Belfast, Northern Ireland, where he created the 3D model for the scene. “Assisted by an asset builder to speed up the 3D modeling process, I would receive SketchUp models from concept artists, hi-res models from the visual effects team, elevations from the architects and even hand-drawn sketches from set decorators. First we would ingest the digital assets, matching real-world scale and doing a first optimization pass.”

Once the digital 3D assets were homogenized, I would ingest them into Unreal Engine. From there, I would create environment materials in Substance Designer, paint bespoke props in Substance Painter, develop particle effects and set up lighting and atmospheric effects like fog and ash in Unreal Engine. In a few instances, I used photogrammetry to capture props like the Iron Throne and the mural in the throne room.
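The ingest step Kiriloff describes, matching incoming assets to real-world scale before any other work, can be sketched as a uniform rescale against one known reference dimension (say, a height taken from the art department's elevations). This is a hypothetical illustration, not The Third Floor's actual pipeline code; the function names and the reference-height approach are assumptions.

```python
# Hypothetical sketch of the "match real-world scale" ingest pass:
# assets arrive in arbitrary units (SketchUp exports, scans, hi-res
# VFX models), so each mesh is uniformly rescaled so that a known
# reference dimension matches its measured real-world size in metres.

def bounding_height(vertices):
    """Height (z extent) of a mesh's axis-aligned bounding box."""
    zs = [v[2] for v in vertices]
    return max(zs) - min(zs)

def rescale_to_real_world(vertices, real_height_m):
    """Uniformly scale vertices so the bounding-box height equals
    the real-world height taken from set elevations."""
    factor = real_height_m / bounding_height(vertices)
    return [(x * factor, y * factor, z * factor) for x, y, z in vertices]

# Example: a prop exported at 1 unit = 1 inch, known to be 2.5 m tall.
raw = [(0, 0, 0), (40, 0, 0), (0, 30, 0), (20, 15, 98.4)]
scaled = rescale_to_real_world(raw, 2.5)
```

In a real pipeline the same pass would also handle unit metadata and pivot placement, but the core idea is just this one scale factor applied consistently to every ingested asset.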

I’m not going to pretend to understand everything he said there, but I respect the education and work experience it surely took to say it.

Because Kiriloff and his team needed to “turn around complete sets in a matter of days,” they kept the “models, textures and lighting to a basic quality level.” They also created and used a virtual camera called Pathfinder to plan their shots inside the virtual environment. “The virtual lens mimics real-life camera settings. Lens configurations can be set in advance based on the director of photography’s request and lens swapping in VR is as easy as scrolling through your lens selection.”
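The claim that "the virtual lens mimics real-life camera settings" rests on the standard pinhole relation between focal length, sensor width, and field of view, which is what lets a virtual camera match a named real lens. A minimal sketch of that relation, assuming a full-frame 36 mm sensor (Pathfinder's actual lens presets are not public):

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view produced by a given focal length,
    via the pinhole relation: fov = 2 * atan(sensor_width / (2 * f)).
    Default sensor is full-frame (36 mm wide); a virtual camera can
    swap 'lenses' by just changing focal_length_mm."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A wider lens (shorter focal length) yields a wider field of view.
wide = horizontal_fov_deg(18)   # 18 mm lens
tele = horizontal_fov_deg(85)   # 85 mm lens
```

This is why lens swapping in VR can be "as easy as scrolling through your lens selection": each preset is just a focal length (plus distortion parameters in a full system) fed into the virtual camera's projection.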

Game of Thrones cinematographer Jonathan Freeman then used the Pathfinder to create photoboards, which are basically virtual reality storyboards. After that, the directors and the art department got together for a VR walkthrough.

“Group sessions often took place where one or more of the directors individually use the VR system, and the art department would gather around a large spectator TV we had set up and discuss what the director was planning visually,” Kiriloff explained. “Our scout tool has 3D annotation tools, a laser pointer and several different measuring devices, which complement and help facilitate group discussions.”

When the Pathfinder VR tool couldn’t capture the complexity of some shots, animated virtual cameras were used. “I was able to work closely with Jonathan [Freeman] throughout as the virtual camera shots were planned,” said Kiriloff. “The throne room scene ultimately had over 100 virtual cameras from various camera angles Jonathan was testing. Jonathan is a true perfectionist and left no potential shot positions unexplored.”

Michelle Blok, pre-visualization supervisor at The Third Floor, described the difficulty of fitting an enormous dragon in the same shot as Jon and the Iron Throne. “One big challenge of this sequence was the inclusion of Drogon into the very confined space of the throne room. The set, or more accurately, the set destruction, needed to be designed based on the area the dragon would take up and how much space he needed to enter and exit the room.”

Once all the shots were completed in previs, we stitched all the dragon animation together, creating a continuous animation that lasted for the entire length of the scene. This ensured not only that the action was consistent from shot to shot but that the dragon’s placement in the scene was also consistent.
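The stitching step Blok describes, joining per-shot dragon animation into one continuous take so placement stays consistent across cuts, amounts to concatenating each shot's keyframes onto a shared timeline with a running frame offset. A minimal sketch under that assumption (the data layout and function name are hypothetical, not The Third Floor's tooling):

```python
# Hypothetical sketch: each shot's animation is a list of
# (local_frame, pose) keyframes; stitching offsets every clip's
# local frames so the clips play back to back on one timeline.

def stitch_clips(clips):
    """clips: list of keyframe lists, each sorted by local frame.
    Returns one continuous list of (global_frame, pose) keyframes."""
    timeline, offset = [], 0
    for clip in clips:
        for frame, pose in clip:
            timeline.append((offset + frame, pose))
        offset += clip[-1][0] + 1  # next clip starts after this one ends
    return timeline

shot_a = [(0, "enter"), (1, "nudge")]
shot_b = [(0, "recoil"), (2, "breathe fire")]
full_scene = stitch_clips([shot_a, shot_b])
```

With the whole scene on one timeline, any individual shot is just a camera pointed at a window of that continuous animation, which is what keeps the dragon's position consistent from cut to cut.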

It’s baffling to think how much work went into all of this, but it paid off in the end:

Kullback and Bauer used a simulcam (a real and virtual camera hybrid that can superimpose actors onto a virtual simulation at the same time) to have Harington and Clarke interact with the CG-rendered Drogon. “We pre-animated the dragons as we often do, but this time we brought it into simulcam so that we could use Ncam – a real-time 3D motion-tracking device – so that the camera operators could have the CG character, Drogon, visible to them,” Kullback explained. “And we could all see how the camera angles would move in concert with the movement of the dragon, and they could follow.”

“The simulcam was great for this because it was such an intimate scene,” added Bauer. “We had previs, and had worked out, over a number of iterations, the angles. But with simulcam, the camera operators were able to add movements, or Jonathan was able to adjust movements or reframe, having all characters at the scene represented for blocking.”

Traditionally, if you don’t have anything representing the dragon, then the operator, left with nothing else, will frame up the person that they do have. And then you may discover that you don’t have the space for the dragon or you’ve made a bad composition. So this really allowed everyone to react in real-time with the subtleties of performance, but in a way that framed both characters correctly.

Casey Schatz, virtual production/motion control supervisor at The Third Floor, was responsible for making sure Kit Harington’s interaction with the orange ball on a stick that represented Drogon looked convincing. “Given what was meant to happen in the scene, we knew that regardless of how good the animation and rendering was of the dragon, none of it would matter if the photographed footage didn’t show a true connection between the creature, which was of course CG, and the real actor, Kit Harington, playing Jon Snow,” Schatz said. “It was a moment of Jon and Drogon really looking each other in the eye, so looking in the right direction across the length of the action was vital. We needed a way to have the two characters ‘be’ in the same room together, in the same physical space.”

Schatz’s team used an Ncam (a tracking system that simultaneously records a camera’s position and orientation in real-time with VFX inserted into the viewfinder to help visualize scenes) to get just the right blend of real (Harington and Clarke) and VFX (Drogon). “I would give the operator a start, middle and end mark and the timing in between came from the dragon animation,” Schatz recalled. “When he saw the dragon move, that was his cue to move. It was a really flexible approach that gave Dan and David who were directing this episode, the freedom to pause and say, ‘Ok, Kit, you’re scared. Step back a little or lean more this way.’”

Bauer talked about Drogon’s arrival into the throne room and how it was important to set the mood just right. “Drogon doesn’t charge into the scene like a mad bull,” he said. “The idea was, because it was going to be snowing, and there would be a depth of snow buildup into the distance, that he would sort of appear like almost like a mirage, and then come forward and resolved, both in focus and visibility to the snow, but not overtake the scene, you know, when he shows up because the big act has just happened. So we had to add him without stomping on the moment.”

Interesting that he mentions snow and snowing rather than ash. I guess we’ll chalk that up to misspeaking.

And how does Bauer explain Drogon’s emotions toward his dead mother? He’s got feelings, you know. “And then, it was a lot of direct guidance from Dan and David about, well, how surprised is he meant to be? How anthropomorphic is he? What are his emotional beats? Generally, we will be pretty subtle with the emotions of the dragons, and lean more towards the lizardness. So, you know, it might be 80% lizard, and 20% readable emotion. We got a bit more emotion, obviously, into this moment. But we kind of went back and forth a few times, as far as anger versus him just trying to understand what has happened.”

What’s so nice about the sequence is when he shows up, he doesn’t know what’s happened, but he knows it’s not good, which is why he nudges Daenerys, comes to a conclusion, and you think that he’s just going to chomp Jon. But it’s more of just a mournful moment than striking out in anger.

Bauer also notes that Drogon’s fire in the shot where the throne is melted was added after the fact, if you had any doubt. “We shot some fire as a reference, but there wasn’t anything that was used on-camera fire-wise. There were some pipes of gas in around the destroyed room, but that again was really just for reference.”

Mohsen Mousavi, visual effects supervisor at Scanline VFX, discussed actually filming the melting throne bit. “We started on the burning throne shots in March 2018, when we did some tests. Originally they wanted it to be one continuous event, but editorially it was sort of impossible, so we ended up doing a simulation for every shot from scratch to get that look and the mood and the flow of the shot going properly.”

After burning the Iron Throne, Drogon gently lifted Daenerys in his massive claws and flew away to parts unknown. Bauer described how the production team was able to make the scene look real and give it the proper emotional heft it needed. “The trick there was figuring out what part of Drogon’s giant claw would grab her,” he explained. “First of all, how would he get anything underneath her without digging into the floor? And second, what is her body position as she’s being lifted up, so that it makes sense with the toe? So we worked with Pixomondo, and then made the configurations by using Emilia’s body scan.”

Kullback chimed in with praise for how Benioff and Weiss chose to film Dany and Drogon’s final scene. “Ultimately, the considerations for the performance that were important to Dan and David were that they wanted it to be very, very gentle when she was picked up,” he noted. “She’s almost beautiful in the way that she’s lifted up and cared for. So these were considerations that were important to them. And it wasn’t about a gimmick, it wasn’t about the performance of the dragon taking centre stage, but just to help lift her out in that way.”
