SUMMARY_20

By Kate Bulkley

The fourth edition of PICTURE THIS_20 was delivered as a completely online event. Initiated and funded by Nordisk Film Fonden, PICTURE THIS_20 focused on how quickly virtual production tools are evolving, with the Covid-19 pandemic also helping to accelerate their implementation. Pre-production and production are coming ever closer together with virtual production (VP) technology, a development that impacts the production process as well as traditional production roles. The move to ever more real-time VP filmmaking is offering new collaborative opportunities for filmmakers as well. And while it’s a transition fueled by technologies like game engines and LED screens, the incorporation of virtual production tools is not necessarily only about cost savings and production speed; the new tech is also offering new narrative opportunities for storytellers.

As keynote speaker Sam Nicholson, founder and CEO of Stargate Studios, warned the audience, these new tools must be mastered, but they also need to be embraced by everyone on set in order to achieve their full potential. “You can master the technology, but to be effective, the mindset of the producers and DPs (Directors of Photography) has to go along with it,” said Nicholson.

The conference opened with an ‘Introductory Guide’ to virtual production tools from Drew Diamond, Head of Virtual Production at Oslo, Norway-based Pixotope, which works on virtual production for broadcasts, like adding CG crowds for live televised baseball. Diamond began his career in the US and has worked on a variety of productions, where he has learned that virtual production spans a spectrum including computer-aided production and visualization methods. VP, he says, is where the digital and physical worlds meet.

Modern graphics and game-engine technologies can mix real-time interactions on set with computer graphics and environments, an approach that has helped productions like the award-winning film Parasite and HBO’s blockbuster series Game of Thrones. Virtual set scouting speeds decision-making, and virtual models can be created to “see” the set before anything is built, which can save money. Testing different lenses and simulating different angles and lighting can help the director and cinematographer plan angles and shots before actual shooting begins.

Using Unity and Unreal Engine game engines in this way changes the production pipeline; VFX artists can see in real-time on set what might have been relegated to post-production. The phrase “We’ll fix it in post” can go away entirely.

Diamond’s ‘best practice’ for these VP tools is to “test, test and test” because there is still a lot to learn. VP is not just within the grasp of well-financed Hollywood films and VFX companies, and it is not simply about fancy new visuals or giving producers a cheaper or more productive workflow, says Diamond. Above all, VP gives the director more agency over his or her vision of the film. And because of Covid, there is an advantage to leveraging remote tools, because every person on set is a liability right now, says Diamond.

Watch SESSION_1:

In Session 2, the three creators of Love and 50 Megatons gave a case study of how this student graduation film, completed on a minimal budget of EUR 45,000 for the Film Academy Baden-Württemberg, was able to share the nominee spotlight with Disney’s The Mandalorian at the Visual Effects Society Awards.

Using the Unity game engine, 272 shots were completed in less than three months. Filmmakers Josephine Ross (VFX and live-action producer), Denis Krez (VFX supervisor and compositor) and Paulo Scatena (technical director) told us that VFX is no longer “a parasite on the set but is an integral part of the project.” However, they reminded us: “don’t think that VP means less worry on set, it just shifts the stress!”

The film was made using miniature paper models, scanned using photogrammetry and then brought into Unity. The trio’s key learnings were that previsualization needs to be used early in the project so things can be nailed down early, especially when there is a tight timeline. They also agreed that you need to have a game engine expert – in their case for Unity – involved from the start, because it is an entirely different process. And they had to be very persuasive with suppliers to get access to certain technology, like high-performance 4K projectors.

The main learning was that on-set VP means a more real-time, collaborative process, which has its own stresses but also great advantages. “Every production has to think first about your story and then figure out how to make it work with the technology, which is as it should be, because no one is going to go to the cinema and say look at all the great technology,” says Krez. In Berlin, a new film fund for VFX is going to help bring VP more into the mainstream, but greenscreens are still more flexible, said Krez: “even The Mandalorian fixed a lot in post.”

Watch SESSION_2:

The third session featured Chris Musselwhite and Paul Hamblin, founders of UK-based Treehouse Digital and creators of the award-winning Litterbugs, who spoke with great enthusiasm about how their small group of independent filmmakers used VP and a game engine to their creative advantage. “We are really about old tricks using new tools,” said Musselwhite, adding that Epic’s in-camera VFX tool “blew our minds when we first saw it.”

VP and game engines are “the next step in VFX and cinema trickery,” and now, with LED screens getting more sophisticated and less expensive, there is the opportunity to “match the real world” and deliver that live on set. It was an “exciting opportunity to bring to life things that we would have had to wait until post-production to see.”

These new tools help “to expand our story-telling toolkit,” said Musselwhite. While the tech was a bit more expensive, the savings on time, and allowing the director to have exactly what he wanted “down the lens,” were definitely worth it. Hamblin said that even without any game experience he was able to “pick it up” through online tutorials. “Don’t be daunted by the terms,” he said. Drawing on their experience with The Biker, they described how they pulled together a low-cost process to create and shoot a car crash scene in the Unreal game engine, with some help from a few local suppliers for cameras and lights. Their main takeaway was not to treat VP as an afterthought. “Bring in VP from the beginning and elongate your pre-production schedule,” said Musselwhite. “This new tech doesn’t make a bad artist good, but it does allow good storytellers to do more creative things and enhance the story.” They advised filmmakers to take risks and not let this technology remain only in the hands of the VFX companies: “everyone needs to understand it,” said Hamblin.

Watch SESSION_3:

The Session 4 keynote with Sam Nicholson, founder and CEO of Stargate Studios, featured a close look at the VP behind the HBO series Run, a story that takes place on a train but was shot completely in a Toronto studio. Instead of traditional green screen shooting, the Stargate Studios team created the landscape passing by the train by feeding the real-time output of a game engine to a live LED wall, in combination with camera tracking, to produce the final imagery completely in-camera.
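
To make the idea concrete: the imagery on the LED wall has to be re-rendered from the tracked camera’s point of view every frame, so the parallax of the virtual scenery matches what a real window view would show. Here is a minimal pinhole-camera sketch of that relationship (our own illustration with hypothetical names, not Stargate’s actual pipeline):

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    x_m: float       # tracked lateral camera position, in metres
    focal_mm: float  # lens focal length, in millimetres

def background_shift_mm(pose: CameraPose, scenery_distance_m: float) -> float:
    """Image-plane shift of virtual scenery at a given depth, for a pinhole
    camera at lateral offset pose.x_m: shift = focal * offset / depth.
    The LED wall content must move by this amount to keep parallax correct."""
    return pose.focal_mm * pose.x_m / scenery_distance_m

def render_wall_frame(pose: CameraPose, layers: dict) -> dict:
    """Per-frame update: for each background layer (name -> depth in metres),
    compute how far it shifts on the image plane for the current camera pose.
    Nearby scenery moves more than distant scenery, as out a train window."""
    return {name: background_shift_mm(pose, depth)
            for name, depth in layers.items()}

# Example: camera dollied 0.5 m sideways with a 35 mm lens
frame = render_wall_frame(CameraPose(x_m=0.5, focal_mm=35.0),
                          {"trees": 20.0, "mountains": 2000.0})
```

Real systems render a full perspective-correct frustum in the game engine rather than shifting flat layers, but this depth-dependent parallax is exactly the property that camera tracking exists to preserve.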

“The last five years have seen a big transition to VP, but we have been able to predict that for the last 15 years,” said Nicholson. “But the big difference in the last five years has been because of the fusion of game technology and software into cinema technology to move it much more towards real-time, which is virtual production.”

Nicholson was sanguine about the affordability of VP, even when you hear about the $5 million LED wall used by The Mandalorian. “Where’s the on-ramp for me to the virtual production highway, because all I have is a cellphone or a still camera?” said Nicholson. But the reality is that still cameras are shooting 4K, and you can do VP with a 4K still camera and a television that you buy at a local electronics store, plus a very affordable HTC Vive controller. “You can learn the basics of VP for a few thousand dollars and then you can scale it up; it’s infinitely scalable from a few thousand dollars to millions of dollars, and that’s one of the exciting things about it,” said Nicholson.

Tech aside, one of the biggest issues is the mindset of the filmmakers (the producers, directors and DPs), who traditionally have not had to think about or pay for effects until post-production. “In VP it’s a live event, so the entire process of creative decision making has to be done in pre-production,” said Nicholson. “There is a big leap of faith: producers who hang onto their budgets have to be convinced to spend the money upfront. It’s a big challenge because it’s a complete flip.”

“The heat in the kitchen of the VP from a visual effects standpoint is much hotter than post produced traditional visual effects,” said Nicholson.

One of Nicholson’s takeaways for the audience from working on the HBO series Run was to always have a Plan A as well as a Plan B and a Plan C. This way you build in the ability to scale up or down as needed.

To capture the shots that would be seen out the train windows, they booked a couple of Amtrak train cars to cross the country, shooting 8 x 8K and generating some 400 terabytes of data in a week! He also talked about lens choice for the LED walls, how to color time between multiple cameras, and how to replace backgrounds (as with a green screen), which is one of the things not yet possible with live VP. “Make sure that you have synchronicity between your screens and your camera or you’re going to be in trouble.”
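
Those numbers pass a rough sanity check. Assuming “8 x 8K” means eight 8K camera streams, shot roughly eight hours a day for seven days (our assumptions, not figures from the talk), 400 TB implies a sustained per-camera data rate of about 2 Gbit/s, which is in the right range for compressed 8K raw:

```python
def implied_rate_gbit_per_s(total_tb: float, cameras: int,
                            hours_per_day: float, days: int) -> float:
    """Per-camera data rate implied by a total footage volume.
    1 TB = 8e12 bits; 1 hour = 3600 seconds."""
    tb_per_camera_hour = total_tb / (cameras * hours_per_day * days)
    return tb_per_camera_hour * 8e12 / 3600 / 1e9

rate = implied_rate_gbit_per_s(total_tb=400, cameras=8,
                               hours_per_day=8, days=7)
# roughly 2 Gbit/s per camera under these assumptions
```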

VP is much more exciting, said Nicholson, because everyone can actually see what is happening in real-time, unlike with green screen. “It’s more immediate and everyone is more involved,” said Nicholson. “It’s like playing a musical instrument and hearing the music in real-time where in green screen you may not know for two weeks that there is a problem, which is a real frustration for visual effects.”

But some visual effects artists won’t like the “rough and tumble” of being on a live set versus doing a color correction in a dark room with complete silence!

There was a learning process on some things, like how to keep the horizon steady: they had to build their own inertial trackers and teach the camera operators how to work differently. In the end, they shot an average of eight pages a day and achieved about 90 percent in-camera, finished pixels without adjustments, which is pretty good!

Nicholson says that the four new positions on set with VP are: (1) the VP supervisor, who works on pre-production and principal photography and serves as a buffer between the director, the producer and the artists; (2) the game engine specialist, who works on the rendering; (3) the colorist, who has to be across several DaVinci Resolves; and (4) a tracker, who wrangles data to track both cameras and objects.

Sam Nicholson, Stargate Studios. Photo: Jacob Hansen.

In the afternoon we began with a look at VP in Denmark, talking first to Allan Luckow, co-founder of The Volume CPH, who is interested in bringing the learnings he took from working on Avatar with James Cameron back to Denmark to see what is possible. The next steps for real-time filmmaking in Denmark are to add LED walls and more motion capture. “We will learn by doing,” said Luckow. “We need some bravery and bold people to take some risks, and we need to pay to learn a little.”

Session 5 was a case study of SAM Productions’ The Conqueror, in which director Per Fly and production designer Niels Sejer talked about using virtual scouting and other virtual production tools to develop the historical drama in collaboration with the newly established virtual film studio RIG21. The two discussed how they felt about new tools like VR while using them, giving the audience some interesting insights into how an established director and production designer approach the new tech and which tools they are most comfortable with. “It’s extremely useful that I can build a 3D model and look at it and make adjustments even before I make the next model,” said Sejer. “It’s a playground, it’s fun!”

Fly added that they felt that they could both capture the “magic of the moment” but within a framework of planning. The classic mantra, however, still persists, said Sejer: “art meets time, meets money, and that is what we are still trying to do.”

Session 6 was with Aishah Hussain, COO and director of AMUNET Studio, a Danish consultancy that uses real-time technology to “hack the production process”. She spoke about a demo made with Thomas Borch Nielsen, director of Captain Bimse (2019). For the demo, AMUNET used an HTC Vive setup for green screening because it is less costly; however, there are accuracy issues, because it is an infrared system and so suffers from noise and jitter. Different game engines are better at different things: Unreal works better if you need something to look good quickly in previz, while Unity is perhaps better if you are coding, because it uses a more common language and so works better if you need to do a lot of customisation. She said that to get VP more established in the Danish market there need to be more case studies, so that people can believe in the process. “The creatives and the techies need to talk to each other more,” she says.
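
The noise and jitter Hussain mentions are typically tamed in software with a low-pass filter before the tracker data drives the virtual camera. A minimal sketch of the common exponential-smoothing approach (our own illustration, not AMUNET’s actual setup):

```python
class TrackerSmoother:
    """One-pole low-pass filter (exponential moving average) over noisy
    tracker samples. alpha near 1 keeps the raw signal (low lag, more
    jitter); alpha near 0 smooths heavily (less jitter, more lag)."""

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha
        self.state = None

    def update(self, sample: float) -> float:
        if self.state is None:
            self.state = sample  # initialise on the first sample
        else:
            self.state = self.alpha * sample + (1 - self.alpha) * self.state
        return self.state
```

In practice one such smoother would run per pose component (position and rotation axes), with alpha tuned against how much lag the camera operator can tolerate.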

Session 7 was a case study of Rebel Nature, a story universe set in the near future and inspired by the 17 UN Global Sustainability Goals. Its creator, concept director and world builder Christian Faber, was joined by producer Thomas Lydholm (both are co-founders of VP company RIG21) and VR sculptor Martin Nebelong, who showed a demo of Dreams, a low-cost creative tool from Sony-owned studio Media Molecule that runs on a PlayStation 4 Pro and lets you create 3D art using the game controller.

The session was about how they are producing virtual prototypes for the Rebel Nature IP, leveraging the Unity game engine with motion capture, LED screens, camera tracking – and a real actor! “These kinds of tools will help us explain what we mean and what we are talking about, and so it will definitely bring us to a more secure and better planned production,” said Lydholm. Faber concluded that all these tools will help with the UN Global Goals as well, because they democratize the possibility of storytelling, and that could be a big change agent for the world, too.

A huge thank you to the speakers and participants who joined us for PICTURE THIS_20. Our gratitude also goes to our collaborators and sponsors: Film Workshop Copenhagen, The National Film School of Denmark and the Danish Producers’ Association.

PICTURE THIS_20 was initiated and funded by NORDISK FILM FONDEN and organized by Vision Denmark.

All rights reserved to the original copyright holders.