SUMMARY_19

By Kate Bulkley

The 2019 edition of PICTURE THIS focused on the transformative power – and increasing accessibility – of evolving filmmaking tools that are reimagining the filmmaking process. From performance capture to new production processes that bring CGI, previz workflows and game engines together, the conference included case studies, presentations and even a live experiment.

The day’s discussions centred on technologies and techniques that allow filmmakers to tell stories in new ways. As Jon Landau, the producer of Avatar, told the audience in a video link, the point of the conference is to “inspire people to use technology to tell stories that can’t otherwise be told.”

LA-based Daniel D. Gregoire, the founder and chairman of Halon Entertainment, opened the day with a look back at how technology has advanced, from the first computer-generated imagery in the 1976 film Futureworld, where “they programmed a virtual hand by manually programming in every point,” to Jurassic Park in 1993, where VFX became an unnoticeable part of storytelling.

Gregoire emphasized that technology alone is not enough to make an impact, recalling that it took George Lucas insisting on digital acquisition to bring it into the mainstream: Star Wars: The Phantom Menace in 1999 was, he said, the first major tentpole film to be acquired digitally. “This was super important for digital production and all the tech we use today on set,” said Gregoire.

A theme that would be emphasized throughout the conference was the growing crossover between video game technology and filmmaking tools. According to Gregoire: “We hire more games makers than film makers these days.” Andy Serkis’s performance as Gollum in The Lord of the Rings is considered the breakthrough for motion capture, but now an iPhone X can be used as a motion capture device, further democratizing the tools needed for performance capture. “It doesn't take huge cameras and giant head cams anymore to pull off facial capture,” said Gregoire. “We set up each of our artists with an iPhone X to do capture for visualization or for game cinematics. As you get the software and the plug-in for Unreal or Unity… the actual capture itself is not hard.”
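
To make that concrete, here is a minimal Python sketch of the kind of remapping such a plug-in performs each frame; the rig target names and the `rig` object are hypothetical stand-ins rather than any specific plug-in’s API.

```python
# A minimal sketch of the remapping a facial-capture plug-in performs each
# frame: ARKit-style blendshape coefficients (0.0-1.0) streamed from the
# phone are applied to a character rig's morph targets. The rig target names
# and the `rig` object are hypothetical stand-ins, not a real plug-in's API.

ARKIT_TO_RIG = {
    "jawOpen":         "mouth_open",
    "eyeBlinkLeft":    "blink_L",
    "eyeBlinkRight":   "blink_R",
    "mouthSmileLeft":  "smile_L",
    "mouthSmileRight": "smile_R",
}

def apply_face_frame(coefficients: dict, rig) -> None:
    """Drive the rig's morph targets from one frame of capture data."""
    for arkit_name, rig_target in ARKIT_TO_RIG.items():
        weight = coefficients.get(arkit_name, 0.0)
        rig.set_morph_target(rig_target, max(0.0, min(1.0, weight)))  # clamp to 0..1
```

In practice, a capture plug-in streams this data from the phone into Unreal or Unity over the network and performs an equivalent mapping on every frame.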

Daniel D. Gregoire, Founder and Chairman of HALON Entertainment. Photo: Kristian Ridder-Nielsen.


The accessibility of advanced technologies is changing both the speed and the kind of storytelling possible, and is fueling the rise of virtual YouTubers, or VTubing. The phenomenon has taken off particularly in Asia, but virtual personalities are now a global trend.

Halon did a project with Mattel bringing Barbie into the VTubing world using performance capture technology. “We were able to work with Mattel and their writers and America Young, the actor who plays Barbie, and write and shoot these (video blogs) in less than six days, and three of those days were for Mattel approvals,” explained Gregoire. “We have the technology now so we can stream these live directly onto YouTube.” The power this gives storytellers and VTubers is remarkable – people can create entire virtual personalities.

Similar huge strides are being made in real-time virtual production, where “it no longer takes an army of people now, a lot of it is on your computer,” said Gregoire. An app from Epic Games can be downloaded to an iPad and used to shoot a 3D scene. “I have shot entire movies of previz using virtual camera. The whole of World War Z I shot on virtual camera,” said Gregoire. “The director and cinematographer wanted a found footage/war documentary/cinematic overlay-style camera and the only way to do that was to set up large set-ups and be able to stand on the corner and capture it in a real, honest way and react to the environment, not make it up, and the virtual camera allows you to do it… The virtual camera is a very powerful tool and is very freeing for the storyteller.”
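
In essence, a virtual camera copies a tracked device pose onto a CG camera every frame. The sketch below, with hypothetical `tracker` and `cg_camera` stand-ins, illustrates that core loop; it is a conceptual illustration, not Epic’s actual app.

```python
# The core loop of a virtual camera: each frame, the handheld device's tracked
# pose is copied onto the CG camera, so the operator's real movement drives
# the virtual shot. `tracker` and `cg_camera` are hypothetical stand-ins for
# an engine's actual API.

from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in metres, world space
    rotation: tuple  # Euler angles in degrees

recorded_take = []  # poses kept so the take can be replayed or edited in previz

def tick(tracker, cg_camera) -> None:
    """Called once per rendered frame during a virtual-camera take."""
    pose = tracker.current_pose()                    # device pose from AR tracking
    cg_camera.set_transform(pose.position, pose.rotation)
    recorded_take.append(pose)
```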

The key is the rise of game engines, primarily Unreal and Unity. Since 2015, when the Unreal engine was made free to use, there has been a huge step change in usage – the images being created are good enough for proving ideas, securing financing and screening with audiences, says Gregoire. Halon has now done 50 projects using game engine technology, and not just for previz: for Borderlands 3, Halon used the Unreal engine to render final visual effects.

Game engines also underpin the rise of the virtual art department, whose job is to find and create digital assets that can be used and then warehoused, in effect creating a “digital back lot”, because digital assets have a value. Gregoire also highlighted some of the newer – and still expensive – technologies, such as holographic capture, which records at high frame rates and generates terabytes of data, and machine learning and AI, which will allow facial replacements that could, for instance, bring a dead actor back to life, not to mention their nefarious use in deepfakes.
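
To illustrate what warehousing such assets might involve, here is a minimal, hypothetical catalogue record in Python; the field names are illustrative rather than drawn from any real asset-management system.

```python
# A hypothetical catalogue record for a "digital back lot"; every field name
# here is illustrative, not taken from a real asset-management system.

from dataclasses import dataclass, field

@dataclass
class BacklotAsset:
    name: str                       # e.g. "city_rooftop_01"
    version: int                    # assets accumulate versions across shows
    source: str                     # "photogrammetry", "hand-modelled", ...
    engine_formats: list = field(default_factory=list)  # e.g. ["unreal", "unity"]
    tags: list = field(default_factory=list)            # search keywords

rock = BacklotAsset("granite_boulder_01", version=2, source="photogrammetry",
                    engine_formats=["unreal"], tags=["exterior", "rock"])
```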

PICTURE THIS_19. Photo: Kristian Ridder-Nielsen


Some of the technologies and techniques described by Gregoire were put to use in a live experiment created by Simon Jon Andreasen, head of the director’s education in animation and interactive at the National Film School of Denmark, and Danish director Rasmus Kloster Bro, who had not used motion tracking or game engine technology before. The aim was to capture a live performance and build a universe over the course of the conference day. The resulting clip came together only at the last minute, but the exercise proved instructive for everyone involved, from the complexities of setting up the game engine technology to the actors’ concerns about what this technology might mean for their profession.

The process and the learnings from the experiment are explained in this video produced by Amunet Studio.

The use of a game engine and low-cost (some free) software in production was the focus of a case study by Kirsten Skytte and Thomas Borch Nielsen, co-directors of the Danish animated children’s film Captain Bimse.

The directing duo gave an insightful account but admitted that their experience was daunting, not least because, they said, this was the first hybrid live-action film to use a game engine. Creatively, they wanted to mix live action and CGI to create a film with a look of “magic realism”. But with 1,200 VFX shots on a budget of $2 million, the Unity game engine was pivotal to lowering render times and costs. They also shot in 4K with a camera whose autofocus could be controlled from a mobile phone.

With a short shooting window, they built a 360-degree green screen but also added ‘real’ items like Captain Bimse’s yellow airplane. They used Unity’s ‘real tree’ software for fast tree building, as well as photogrammetry – reconstructing objects digitally from photographs – to make realistic-looking assets such as rocks. The open-source 3D software Blender was used to animate the teddy bear’s mouth when he speaks, with real-time lip-synching driven by a facial recognition app. Editing, grading and sound were also done in 4K with DaVinci Resolve, a piece of software costing around $200.
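
Since Blender is scriptable in Python, the mouth animation the directors describe can be sketched in a few lines of its bpy API: keyframing a shape key from per-frame mouth-open values. The object name, shape key name and example data below are assumptions; real values would come from the facial recognition app or an audio analysis.

```python
# A minimal Blender (bpy) sketch of the mouth animation described above:
# keyframing a shape key from per-frame mouth-open values. The object name
# "Bimse", the shape key "MouthOpen" and the toy data are assumptions.

import bpy

mouth_open_per_frame = [0.0, 0.2, 0.7, 0.9, 0.4, 0.1]  # toy example values, 0..1

obj = bpy.data.objects["Bimse"]                       # the teddy bear mesh
mouth = obj.data.shape_keys.key_blocks["MouthOpen"]   # pre-made shape key

for frame, value in enumerate(mouth_open_per_frame, start=1):
    mouth.value = value
    mouth.keyframe_insert(data_path="value", frame=frame)
```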

Despite all the technology there was still a lot of sweat equity in the project, and they admitted to mistakes such as mis-calibrating the real 35mm camera lens for the Unity engine. In the end the production ran six months longer than expected and about $1 million over budget. “The end of this story is don't be a first mover,” said Nielsen with a chuckle. “But if you build in the right preproduction time then when you are on the set you are not a first mover, because you know your tools.”

Directors Kirsten Skytte and Thomas Borch Nielsen with moderator Kate Bulkley. Photo: Kristian Ridder-Nielsen.


Virtual production was a theme further explored by Mariana Acuna Acosta, the CPO and co-founder of LA-based Glassbox Technologies. While large studios have long been able to afford real-time virtual production tools, including previz, motion capture, volumetric capture and post-production, the democratization of these tools has now arrived as costs have fallen and accessibility has increased. “The old idea of fixing it in post-production is expensive and you also lose the director’s voice,” explained Acosta.

There is a whole suite of pre-production tools to redefine the production pipeline and creative workflow both on and off set, says Acosta: previs, where you map out what you are going to do before you start production; techvis, where you plan dolly cranes and lens types and capture the data around each camera shot; scoutvis, for capturing locations; stuntvis, where the stunts are rehearsed in the game engine before you get to the set; and postvis, where some shots carry through into the final product. But as Acosta said, there is still a need for professional expertise: assets can be hard to track, especially across different versions, it is not obvious which game engine best suits a particular project, and the tools must integrate with more traditional filmmaking software like Maya.

Mariana Acuna Acosta, CPO and co-founder of LA-based Glassbox Technologies. Photo: Kristian Ridder-Nielsen.


Antonin Baudry, the French former diplomat and film director, spoke about his directorial debut Le Chant du Loup (2019), which opened to critical acclaim in France and was recently bought by Netflix. The film draws on the idea of nuclear deterrence and where the system might fail, in particular aboard a nuclear submarine, where the “golden ear” is the person whose hearing is depended upon to detect threats, and who is clearly fallible.

Baudry was keen to make the film as real as possible and used few special effects tools, though he did rely on CGI and in-camera shooting to create the close quarters of a real submarine, the on-screen graphics, and the explosions and some underwater shots. “I wanted to minimize the digital part of the film because I wanted the accidents to happen on set. I had to use some CGI, but when I spoke to my CGI company, Territory Studios, they showed me a boat that they had created next to a real one and they were super happy that their boat looked more real. And that is exactly what I wanted to avoid, because I want it to be really real,” said Baudry. “Some of the CGI shots were too smooth… I think it is better if you integrate the real movement. The more your camera captures the real world, the more interesting it is to me. It’s a personal choice.”

Moderator Kate Bulkley and director Antonin Baudry. Photo: Kristian Ridder-Nielsen.


Producer Drew Diamond gave a deep dive into performance and volumetric capture and why it matters. Drawing on the Facial Action Coding System (FACS), he noted that certain facial expressions, among them sadness, anger, contempt, disgust and fear, are universal to all cultures: regardless of language, they can be understood by anyone. Technology for creating virtual humans by capturing actors is developing quickly, with some techniques more ‘real’ than others; the question is how authentic they need to be. He outlined processes ranging from a volumetric capture of a particular actor performing all the different expressions, to facial capture where a virtual ‘skin’ of an actor is created and then placed on a facial animation rig with FACS built in. “The higher the fidelity of the image the higher the cost,” said Diamond. “Holographic capture is better for virtual conferences at the moment but if you want to use this for YouTube there are tools that will work for that.”
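
FACS describes expressions as combinations of numbered Action Units (AUs), which is what makes it usable in capture pipelines and animation rigs. The snippet below follows commonly cited EMFACS-style groupings for the expressions Diamond named; it is a simplified illustration, not a production rig specification.

```python
# Illustrative only: FACS describes expressions as combinations of numbered
# Action Units (AUs). These groupings follow commonly cited EMFACS-style
# mappings and are a simplification, not a production rig specification.

UNIVERSAL_EXPRESSIONS = {
    "sadness":  (1, 4, 15),      # inner brow raiser, brow lowerer, lip corner depressor
    "anger":    (4, 5, 7, 23),   # brow lowerer, upper lid raiser, lid tightener, lip tightener
    "contempt": (12, 14),        # lip corner puller + dimpler, on one side of the face
    "disgust":  (9, 15, 16),     # nose wrinkler, lip corner depressor, lower lip depressor
    "fear":     (1, 2, 4, 5, 7, 20, 26),
}

def describe(expression: str) -> str:
    """Return a readable summary of the AUs behind an expression."""
    aus = UNIVERSAL_EXPRESSIONS[expression]
    return f"{expression}: AU {', AU '.join(map(str, aus))}"

print(describe("contempt"))  # contempt: AU 12, AU 14
```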

Producer Drew Diamond. Photo: Kristian Ridder-Nielsen.


The final panel, featuring producer Drew Diamond, Glassbox Technologies CPO Mariana Acuna Acosta, Halon chairman Daniel D. Gregoire, Danish director Per Fly, and screenwriter and director Isabella Eklöf, surfaced some of the themes that stood out across the day.

Among the questions: how to stay authentic, which tools are useful, how to use them, and what all these techniques will cost.

Per Fly is working on an HBO production about Copenhagen in 1870. He is physically building some sets, but is considering virtual set extensions, virtual ‘extras’ and how to move some of the work from post to preproduction. The question of cost is clearly on the minds of all creatives, and Gregoire offered a sage warning: “With great power comes great responsibility. You have to use the tools wisely.”

Screenwriter and director Isabella Eklöf is particularly interested in facial mapping and in creating stylized worlds. “But I don't think the faces are good enough yet except for stylized worlds,” she said. Acosta said it is important to remember that a lot of current facial capture is not meant to be photorealistic; it is about finding the correct use for the technology. She also noted that de-aging technologies have been around for a while but were recently brought into the mainstream by Martin Scorsese’s The Irishman and the upcoming Will Smith film Gemini Man.

Director Per Fly. Photo: Kristian Ridder-Nielsen.


Acosta also had advice for first-time users of motion capture: keep your takes short, because rendering still takes quite a lot of time. And Gregoire noted that motion capture acting is a discipline of its own, with skills that need to be learned to do it properly.

Gregoire offered a fitting final thought: even though he works primarily in digital, he needs to understand what “the ground truth of something is to understand it so I can break it. It’s like knowing the real filmic rules before I break them.” The day ended with ethical questions: as we virtualize reality, does that put a special value on real things and real experiences? Acosta said that actors and celebrities are already drawing up contracts governing how their virtual image can and cannot be used. Drew Diamond added that it is not just a whole image at stake but someone’s smile or the shape of their brow. “So, then we get to deeper issues about what defines you – is it something beyond how you look?”

Added Acosta: “We will all probably have avatars and be able to live in virtual worlds, but it will be our choice. With the rise of AI what makes us valuable as humans is that we are humans!”

Writer/director Isabella Eklöf, Daniel D. Gregoire, Mariana Acuna Acosta, Drew Diamond, and Per Fly. Photo: Kristian Ridder-Nielsen.


A huge thank you to the speakers and participants, who joined us for PICTURE THIS_19. Our gratitude also goes to our supporters and collaborators from Film Workshop Copenhagen, The National Filmschool of Denmark, Samsung Media Innovation Lab for Education (SMILE), Aalborg University Copenhagen, Rokoko Electronics, MAAN Rental, Act3, TGBVFX, and Amunet Studio.

PICTURE THIS_19 was initiated and funded by NORDISK FILM FONDEN with support from Copenhagen Film Fund and organized by Vision Denmark.

All rights reserved to the original copyright holders.