SUMMARY_18

By Kate Bulkley

PICTURE THIS_18, the second annual edition of the conference, took place in Copenhagen on 5 April and brought together speakers from across the film, tech and storytelling disciplines. The event was initiated and funded by Nordisk Film Fonden, part of Egmont. Hot topics ranged from a showcase of VR storytelling to how data and AI are changing the face of scriptwriting, and the democratisation of film production through next-generation technical tools.

There is clearly a paradigm shift under way in film production, led by advances in virtual production and interactive pre-visualisation that are being adopted by world-class filmmakers from Steven Spielberg (Ready Player One) to Denis Villeneuve (Blade Runner 2049) and Jon Favreau (The Jungle Book).

The conference was kicked off by Paramount Pictures futurist Ted Schilowitz, who painted a dynamic picture of the tectonic shifts in the media creation and distribution landscape. He spoke about the fast pace of change, which Paramount stays ahead of by having a Department of Future (DoF). “I insist on a one-way street from senior executives at Paramount, where I and my team live like a start-up inside the studio.” Schilowitz also said that VR is still developing and has not yet hit on anything great. “We are at the CompuServe level of VR devices.” VR has found its way into theme parks, and Nintendo’s wrist-strap approach to VR entertainment does not get enough credit. “I think VR location-based entertainment is next, but we still have a problem with the BOF, or box on face. We aren’t going to get to a mass market until we can move beyond the BOF,” he said.

Schilowitz predicted that by 2020 wearable devices would be “smart and light enough to replace the smartphone and we’ll be wearing something that blends with your humanity”. He also talked about virtual screens, suggesting that anything can be a screen, much like cinema fans saw Tom Cruise using in Minority Report (directed by Spielberg) as far back as 2002. He also recommended some books to read, including Marc Goodman’s Future Crimes.

Ted Schilowitz, Futurist in Residence at Paramount Pictures. Photo: Dorte Tuladhar.

A showcase of interactive storytelling from Christine T Berg, the writer and director of Wonder Buffalo, highlighted the opportunities as well as the hurdles involved in writing and filming for VR. Funded by the Entertainment Technology Center (ETC) in California, which is financially supported as a technical test-bed by the big Hollywood studios, and produced by Drew Diamond, Wonder Buffalo tells Berg’s own story of being a child of Thai immigrants growing up in California.

Wonder Buffalo has won several awards and was a finalist in the Television Academy’s new Emmy category for innovation in interactive programming in 2017. Berg and Diamond discussed how they put the entire project in the cloud, with end-to-end high dynamic range deliverables, and how the VR production was made in tandem with the linear film production. “The ETC was interested in testing a cloud-based workflow and in getting the VR piece done not as a separate project by the marketing department but as just another tier of production written by the filmmaker, and Wonder Buffalo was a good fit for that,” said Diamond. “The whole idea was: if I, as the writer-director, didn’t know anything about VR, what were the questions I would ask?” said Berg. “I told them I was not interested in doing a 360 video because, unless you are an ice skater who loves spinning around, what’s the point of that? So, I got to be real bratty about it and think about how to push past some of the ways things were being done. And that’s how we got involved in volumetric capture and photogrammetry.”

Rainer Gombos, who was VFX supervisor on Game of Thrones and Cosmos, came on board to help Wonder Buffalo with virtual production, and the team also had advice from Alex McDowell, the production designer on Fight Club and Minority Report, who runs the World Building Institute at the University of Southern California. McDowell believes that any story should have a ‘story bible’ that allows it to live across different media. “In discussing what is good for VR, which is about space and exploring space more deeply,” said Diamond, “we realised we could make a companion piece that didn’t require you to see the film first to understand it.”

Moderator Kate Bulkley with writer/director Christine T Berg and producer Drew Diamond. Photo: Dorte Tuladhar.

The benefits of story building, and of creating a world for the story, were discussed in depth by the next two speakers: Simon Jon Andreasen, head of the National Academy of Digital Interactive Entertainment (DADIU), and Jakob Ion Wille, associate professor at the Royal Danish Academy of Fine Arts School of Design.

The two spoke from a more academic perspective about the narratological practice of putting both space and plot into story building. They agreed that story building as an approach is more productive because it must be less hierarchical, more collaborative and less silo-ed in order to succeed in the new multi-media, transmedia world of modern storytelling. They have worked with students on story-world building, including projects using game engines such as Unity, are now evangelists of this approach to storytelling, and have also created a world building club.

Simon Jon Andreasen and Jakob Ion Wille. Photo: Dorte Tuladhar.

Christian Faber described his journey from the commercial advertising world, where he had Lego as a client for 28 years, to founding his own production company, Goodman Rock, to bring to life a story world he calls the Rebel Nature Universe. Since starting in 2015, Faber has taken what he learned from creating and helping launch the original Bionicle toy line, Lego’s first self-produced IP to go beyond being a series of toys, and applied it to a new kind of storytelling that is shared and therefore unpredictable.

Faber’s ambition is to tell a story that is by nature open and unfinished. He illustrated the point at the conference by launching a large orange balloon into the audience and watching how different people responded to it, hitting it up into the air or letting it fall to the ground. For Faber, this is the power and unpredictability of the audience that he wants to include in his storytelling. The main character in the Rebel Nature Universe is Res Waveborn, a human baby raised by a robot. “I want this story to be ongoing so that my children and their children will help create it,” said Faber. Bionicle is the toy that saved Lego; Faber is hoping that Res Waveborn will help save the planet.

“What is the purpose of creativity? I don’t think it’s shop until you drop. We have human rights and human responsibilities for the planet,” he said.

Christian Faber. Photo: Dorte Tuladhar.

Data and AI are impacting how entertainment is made. Netflix and Amazon use big data to tailor their commissioning. IBM’s Watson AI has been used to create movie trailers. In this session, Emmanuel Kuster, the project manager at The Fiction Lab, talked about “the science of audience satisfaction”.

The Fiction Lab is a research programme where cognitive science is applied to the content industry. “We are helping reduce the risk of low audiences for key content,” said Kuster. “We also use cognitive science to help create story arcs that are emotion builders.”
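
Kuster did not go into the Fiction Lab’s methods, but the idea of an “emotion builder” can be made concrete with a toy example. The sketch below scores each scene of a script against a small hand-made valence word list and prints the resulting arc; the lexicon and demo scenes are hypothetical stand-ins, not anything shown at the conference.

```python
# Purely illustrative "emotion arc": one valence score per scene, in story
# order. The word list and demo scenes are hypothetical stand-ins; the
# Fiction Lab's actual cognitive-science models were not described.

VALENCE = {"love": 0.8, "win": 0.6, "hope": 0.5,
           "lose": -0.6, "fear": -0.7, "death": -0.9}

def scene_valence(text: str) -> float:
    """Average valence of the recognised words in one scene."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    scores = [VALENCE[w] for w in words if w in VALENCE]
    return sum(scores) / len(scores) if scores else 0.0

def emotion_arc(scenes: list[str]) -> list[float]:
    """One valence score per scene, in story order."""
    return [scene_valence(s) for s in scenes]

if __name__ == "__main__":
    demo = ["They meet, and there is hope and love.",
            "A rival appears; they fear they will lose everything.",
            "Against the odds, they win."]
    for i, v in enumerate(emotion_arc(demo), start=1):
        print(f"scene {i}: valence {v:+.2f}")
```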

Nadira Azermai, founder and CEO of ScriptBook, and Michiel Ruelens, chief data scientist at ScriptBook, provided details about their AI-powered screenplay analysis and forecasting tool. They told the conference about automating the Bechdel Test, which checks whether a work of fiction features at least two women who talk to each other about something other than a man. The tool can also track violence and map a ‘likeability’ index for characters. “We wanted as data scientists to dig deep and see what we could correlate with financial performance and what resonates with audiences,” said Azermai.
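
ScriptBook did not show its implementation, but the rule itself is simple to state in code. The sketch below is a minimal illustration that assumes the screenplay has already been parsed into scenes of speaker-tagged lines and that a character-to-gender lookup exists; the real tool works on raw screenplays with NLP and is far more nuanced.

```python
# Minimal sketch of an automated Bechdel Test. It assumes the screenplay has
# already been parsed into scenes of (speaker, line) pairs and that a
# character -> gender lookup exists; ScriptBook's real tool works on raw
# screenplays with NLP, so this only illustrates the rule being automated.

MALE_TERMS = {"he", "him", "his", "boyfriend", "husband", "father", "man"}

def mentions_a_man(line: str) -> bool:
    words = {w.strip(".,!?").lower() for w in line.split()}
    return bool(words & MALE_TERMS)

def passes_bechdel(scenes, gender) -> bool:
    """scenes: list of scenes, each a list of (speaker, line) tuples.
    gender:  dict mapping character name -> "F" or "M"."""
    for scene in scenes:
        women = {speaker for speaker, _ in scene if gender.get(speaker) == "F"}
        if len(women) < 2:
            continue  # conditions 1 and 2: at least two named women in the scene
        # condition 3 (crudely approximated): a woman's line not about a man
        if any(gender.get(s) == "F" and not mentions_a_man(line)
               for s, line in scene):
            return True
    return False

if __name__ == "__main__":
    gender = {"ANA": "F", "MAI": "F", "TOM": "M"}
    scene = [("ANA", "Did you finish the experiment?"),
             ("MAI", "Yes, the results look promising.")]
    print(passes_bechdel([scene], gender))  # -> True
```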

Azermai pointed to the need for better commercial understanding of how a script will play with audiences before it’s finalised. “The film industry in Europe is in deep shit. We need to figure out more commercial projects that might travel, rather than depending on public money to fund culturally relevant European projects,” she said.

In the ensuing panel discussion with Nikolaj Scherfig, chairman of the Danish Writers Guild and a scriptwriter himself, there was general agreement that this kind of new tool might help writers produce better episodic series, but that it might have a “dark side” for human writers. “What I am afraid of is that you can mandate things that may not be in the best interest of the story or of society,” said Scherfig.

Moderator Kate Bulkley with Nadira Azermai, Nikolaj Scherfig and Emmanuel Kuster. Photo: Dorte Tuladhar.

Thomas Borch Nielsen, film director, writer and VFX supervisor, presented a case study of the upcoming Danish children’s movie Captain Bimse, showcasing how new filmmaking tools can help lower costs without compromising the story. These included the game engine Unity to create virtual film scenography; the 3D software Blender to build fixtures; DaVinci Resolve for editing; and Fusion 9 for compositing, all of which help lower the barriers to visual expression.

Borch Nielsen said that having access to these tools, which are largely free, provides greater visual and narrative scope to any filmmaker.
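
Borch Nielsen did not show his project files, but one reason Blender suits this kind of low-budget pipeline is that it can be scripted with its bundled Python API, so repetitive set fixtures can be built procedurally. The snippet below is a generic illustration only (it must be run inside Blender, where the bpy module is available) and is not taken from the Captain Bimse production.

```python
# Generic illustration of building a simple set fixture procedurally with
# Blender's Python API. Run inside Blender (the bpy module ships with it);
# this is not taken from the Captain Bimse production itself.
import bpy

def build_colonnade(count=6, spacing=2.0, radius=0.3, height=4.0):
    """Add a row of cylindrical pillars along the X axis."""
    for i in range(count):
        bpy.ops.mesh.primitive_cylinder_add(
            radius=radius,
            depth=height,
            location=(i * spacing, 0.0, height / 2.0),  # stand on the floor
        )
        bpy.context.active_object.name = f"Pillar_{i:02d}"

build_colonnade()
```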

Thomas Borch Nielsen. Photo: Dorte Tuladhar.

Real-time pre-visualisation is becoming increasingly affordable and easy to apply in virtual production processes, according to Mirko Lempert, assistant professor for visual media at the Stockholm Academy of Dramatic Arts, and Martin Christensen, an artist, creative director and teacher in Sweden.

Together, they have built a system that uses game engines to let directors, writers and cinematographers interact with CG environments and objects in real time. These tools can be applied at an early stage of script development and include a VR sculpting tool, all of which helps inform the narrative with visual feedback. “These tools help prepare the actual shoot in a much more efficient and visual manner,” said Lempert.
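
Lempert and Christensen did not publish their code, but the core idea is easy to illustrate: every rendered frame, the pose of a tracked handheld device is copied onto a virtual camera, so framing decisions are performed by hand rather than keyframed. The engine-agnostic sketch below uses hypothetical names and stands in for what, in their system, the game engine does natively.

```python
# Engine-agnostic sketch of the interactive previs loop: each rendered frame,
# a tracked device's pose (hypothetical input here) is copied onto a virtual
# camera, so shots are framed by hand rather than keyframed. In Lempert and
# Christensen's system the game engine provides the tracking and rendering.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple[float, float, float]   # metres
    rotation: tuple[float, float, float]   # Euler angles, degrees

@dataclass
class VirtualCamera:
    pose: Pose
    focal_length_mm: float = 35.0

def previs_update(device_pose: Pose, camera: VirtualCamera) -> None:
    """Called once per frame: the handheld device drives the camera directly."""
    camera.pose = device_pose

if __name__ == "__main__":
    cam = VirtualCamera(Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
    # Two simulated tracking samples standing in for real device data.
    for sample in (Pose((0.0, 1.7, -3.0), (0.0, 10.0, 0.0)),
                   Pose((0.5, 1.7, -2.5), (0.0, 15.0, 0.0))):
        previs_update(sample, cam)
        print(cam.pose)
```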

The interactive pre-visualization system in action. Photo: Dorte Tuladhar.

The closing keynote was presented by Habib Zargarpour, chief creative officer of Digital Monarch Media, who has worked as a VFX supervisor, production designer and art director on films including Blade Runner 2049, A Series of Unfortunate Events, Ready Player One and Tom Hanks’ current film Greyhound. He sees the introduction of game-engine-based cinematic tools as part of a paradigm shift in film production. Co-presenting with him was Julia Lou, an R&D software developer at Double Negative in London.

They gave a series of very dynamic demos, in which Zargarpour showed how a virtual camera was linked to the Spinner flying cars in Blade Runner 2049. Director Denis Villeneuve used this to solve a couple of problems, not least a tight time frame for finishing the film. “When a few shots of the Spinner flying needed to be put together, this was going to be tough animation-wise, and these things aren’t easy to pin down using traditional tools, aka keyframing,” said Zargarpour. “We needed a real-time solution.” Zargarpour hooked the Spinner to a virtual camera using the cinematography tool he created, called Expozure, so that wherever the director pointed, the Spinner would fly. “The finished shot has a real dynamic feel; it feels alive, because both the flying and the camera were performed.”
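
Expozure is Digital Monarch Media’s proprietary tool, so the following is only a toy illustration of the “point and it flies” behaviour Zargarpour described: each frame the Spinner advances along the direction a handheld controller points, producing a performed flight path that a rigged virtual camera can record. All names and numbers here are assumptions made for the sketch.

```python
# Toy illustration of the "point and it flies" behaviour described for the
# Blade Runner 2049 Spinner shots: each frame the craft advances along the
# direction a handheld controller points, so the flight path is performed
# live. Expozure is Digital Monarch Media's proprietary tool; nothing here
# comes from it, and all values are made up for the sketch.
import math

def pointing_direction(yaw_deg: float, pitch_deg: float):
    """Convert controller yaw/pitch (degrees) into a unit direction vector."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),   # x
            math.sin(pitch),                   # y (up)
            math.cos(pitch) * math.cos(yaw))   # z (forward)

def fly_step(position, yaw_deg, pitch_deg, speed_mps, dt):
    """Advance the Spinner one frame in the pointed direction."""
    d = pointing_direction(yaw_deg, pitch_deg)
    return tuple(p + speed_mps * dt * c for p, c in zip(position, d))

if __name__ == "__main__":
    pos = (0.0, 100.0, 0.0)                      # start 100 m above ground
    for frame in range(3):                       # three frames at 24 fps
        pos = fly_step(pos, yaw_deg=30.0, pitch_deg=5.0, speed_mps=80.0, dt=1 / 24)
        print(f"frame {frame}: {tuple(round(c, 2) for c in pos)}")
```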

Moderator Kate Bulkley with Habib Zargarpour and Julia Lou. Photo: Prami Larsen.

Zargarpour also gave a live demo of the tool, which essentially replaces the motion capture stage. It is basically two game controllers hooked to a rack holding a Lenovo camera phone, running software written by Lou. He ran the tool live, capturing shots and changing the scale of the shot in real time. After wowing the audience with this, he showed how he built up the waterhole shot in The Jungle Book, on which Zargarpour oversaw the use of real-time game engines to help render scenes. “Rendering vs real-time production is a completely different thing. On The Jungle Book, what we could do was a eureka moment for the director,” said Zargarpour.

Habib Zargarpour demonstrating the Virtual Film Tools used by Denis Villeneuve to create some of the shots in Blade Runner 2049. Photo: Mariann Nederby.

A big thank you to the speakers and participants, who joined us for PICTURE THIS_18. Our gratitude also goes to our supporters and collaborators from Creative Europe MEDIA Desk Denmark, Copenhagen Film Fund, The Royal Danish Academy of Fine Arts Schools of Architecture, Design and Conservation (KADK), The National Academy of Digital Interactive Entertainment (DADIU) and Aalborg University, Copenhagen Campus.

PICTURE THIS_18 was initiated and funded by the NORDISK FILM FOUNDATION and organized by FILM WORKSHOP/COPENHAGEN.

All rights reserved to the original copyright holders.