Get to Know... Felix & Paul Studios

Team Shotgun    ●    Jun 4, 2019
We recently caught up with Hans Payer, Lead Pipeline Developer, and Sebastian Sylwan, CTO and Creative Partner, at award-winning immersive entertainment company Felix & Paul Studios to discuss how their teams across departments use Shotgun on cutting-edge virtual reality, augmented reality, and mixed reality experiences. Felix & Paul Studios combines technological innovation with artistic storytelling to create groundbreaking experiences ranging from the original comedy MIYUBI, to feature film tie-ins with blockbusters such as “Jurassic World,” to documentary projects with leaders such as President Barack Obama and First Lady Michelle Obama. Most recently, the studio released the documentary “Traveling While Black,” the stop-motion animation “Gymnasia,” and the first two installments of their “Space Explorers” series. Read on to find out more about how Shotgun helps the studio streamline a variety of projects, including the unprecedented ISS Experience currently in production.


Tell us a bit about Felix & Paul Studios.


Sebastian: Felix & Paul Studios is an EMMY® Award-winning immersive entertainment studio headquartered in Montreal with an additional office in Los Angeles. We are the world’s only full-spectrum immersive entertainment studio with end-to-end creative capabilities, technological know-how, and proprietary tools all within one company, and we work with leading tools, from spherical 3D camera systems to spatial audio capture, to deliver the best quality experience in all of our storytelling.




When did Felix & Paul Studios start using Shotgun and how are you primarily using it?


Hans: Shotgun was already in use when I joined the company in December 2015. At the time, it was used for task management in the compositing department and to track editorial changes. Since then, we have expanded Shotgun into other departments such as I/O, Integration, Audio, and CG. We have also leveraged and customized functionality from the Shotgun Toolkit and broadened the range of workflows we track. Shotgun is used by producers, coordinators, editors, compositors, integrators, I/O technicians, audio engineers, and quality assurance analysts: approximately 25 of our 60 employees.


What other tools are used in your pipeline? Do you develop any proprietary tools, and if so, how do these tools integrate with Shotgun?


Hans: What's unique about Felix & Paul Studios is that we are a home for all types of creators: directors, editors, compositors, interactive designers, technologists, and more. We create our experiences from end to end, and this presents interesting pipeline challenges. We think of our edits as different types of stories (e.g. teasers, trailers, special featurettes), which can change constantly, involve many people, and carry multiple associated contexts. Because of this complexity, Shotgun helps us streamline projects for all the different players who need to use it.


Most often we begin projects in Adobe Premiere with the selection of key shots for a story. This selection is then integrated into Shotgun for the rest of the team to use. Throughout production, stories change multiple times; timing or content may shift.


For example, in our live-action experiences, our compositing department delivers the final rendered elements (i.e. the content), but artists are not constrained or directly impacted by editorial changes: the content they create may be used in many stories with different timing without their direct involvement. Once a story is finished, we link Shotgun to a Story Encoder tool that we developed in house. We deliver our projects to multiple platforms such as Oculus Go, Oculus Rift, Samsung Gear VR, Facebook 360, and YouTube 360, and for each platform we have to generate a single encoded movie that spans the entire length of the experience. The Story Encoder tool allows us to produce encodes of our content with our versioned elements and to dynamically link timing based on a chosen story version. Compositors do not need to worry about specific editorial changes; they simply deliver their content. Prior to final encoding, we segment the experience into shots; when changes happen in editorial, we can easily identify which shots have changed and avoid re-encoding those that did not. Story Encoder and Shotgun help us organize this information, minimize errors, and save time.
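The interview doesn't describe how Story Encoder actually decides which shots need re-encoding, but the idea it outlines (fingerprint each shot's delivered version and cut timing, then re-encode only shots whose fingerprint changed since the last encode) can be sketched in a few lines. All field and function names here are hypothetical, not the studio's:

```python
import hashlib
import json

def shot_fingerprint(shot):
    """Stable hash of the fields that affect a shot's encode:
    the delivered element version and its cut timing."""
    payload = json.dumps(
        {"version": shot["version"],
         "cut_in": shot["cut_in"],
         "cut_out": shot["cut_out"]},
        sort_keys=True,
    )
    return hashlib.sha1(payload.encode("utf-8")).hexdigest()

def shots_to_reencode(story, last_manifest):
    """Compare the current story against the manifest saved at the
    previous encode; return only shots whose content or timing changed."""
    return [
        shot["code"]
        for shot in story
        if last_manifest.get(shot["code"]) != shot_fingerprint(shot)
    ]
```

After each platform encode, the tool would save `{code: fingerprint}` as the new manifest, so an editorial change that touches only one shot triggers exactly one re-encode.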




What are your favorite features of Shotgun?


Hans: The Toolkit’s Standalone Publisher. We were already considering developing our own standalone publisher prior to the release of tk-multi-publish2, so it saved us a lot of development time when it was released. It also opened the door to departments that were not using traditional digital content creation applications. We use the Standalone Publisher extensively to validate and secure our data; in doing so, we minimize human and technical errors and avoid losing time to back-and-forth communication between departments.
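The interview doesn't say what Felix & Paul's validation rules are, but the kind of check a publish-time validate step performs, in the spirit of a tk-multi-publish2 plugin's validate phase, can be sketched as a standalone function. The naming convention and allowed extensions below are invented for illustration:

```python
import re

# Hypothetical naming convention: <shot>_<task>_v<version>.<ext>
NAME_RE = re.compile(r"^(?P<shot>\w+)_(?P<task>\w+)_v(?P<version>\d{3})\.(?P<ext>\w+)$")

def validate_publish(filename, allowed_exts=("exr", "mov", "wav")):
    """Return (ok, message). Rejects files that break the naming
    convention or use an unexpected extension, catching errors before
    the file enters the pipeline instead of downstream."""
    match = NAME_RE.match(filename)
    if not match:
        return False, "filename does not match <shot>_<task>_v###.<ext>"
    if match.group("ext") not in allowed_exts:
        return False, "unexpected extension: " + match.group("ext")
    return True, "ok"
```

Failing fast at publish time is what removes the back-and-forth: the publishing artist sees the error message immediately rather than hearing about it from another department later.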


Can you describe a recent project or projects where using Shotgun was essential?
 


Sebastian: Shotgun is essential for every project. Our directors, whether in-house or external, can change the edit at any time, right up to the last minute, so tracking editorial changes and being able to update information for downstream processes is essential. Organizing our versioning conventions and keeping everyone on the same page is also important. We also use Shotgun to centralize configurations for many processes and applications; having this information in Shotgun allows anyone to modify or query it easily. For our recent VR project going behind the scenes on Wes Anderson’s stop-motion film “Isle of Dogs,” we also used Shotgun to validate the ingest that was being synced daily from the studio in London. We often incorporate metadata from the set even before the footage gets ingested.


We also recently announced the ISS Experience, in partnership with TIME, which is currently in production. For this VR experience, astronauts from several countries aboard the International Space Station are spending a year filming their missions, including spacewalks, to provide an unprecedented firsthand look inside the ISS. I believe this project presents the most complex logistical and technical challenges Felix & Paul Studios has faced so far. We designed a workflow centered around Shotgun that streamlines the tracking, selection, and downlink of footage from the ISS. It’s great because Shotgun enables the entire approval chain to easily review specific sections and only downlink relevant footage, which not only saves us time on the ground but also lets the astronauts continue with their mission with minimal disruption.
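The selection step described here, review every section but transmit only what the approval chain signs off on, can be sketched as a simple filter over reviewed sections. The status values and size field are illustrative assumptions, not details from the production workflow:

```python
def select_for_downlink(sections):
    """Given reviewed footage sections (each with an approval status and
    an estimated size in GB), return the approved sections to transmit
    and the bandwidth saved by skipping everything else."""
    approved = [s for s in sections if s["status"] == "approved"]
    skipped_gb = sum(s["size_gb"] for s in sections if s["status"] != "approved")
    return approved, skipped_gb
```

Because downlink bandwidth from orbit is scarce, even a coarse approved/omit gate like this is what turns "ship everything and sort it out on the ground" into a workflow the crew barely notices.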




Why is it important to focus on building a strong pipeline? How have you found that VR pipelines differ from traditional VFX pipelines?


Hans: The strength of your pipeline translates into studio-wide performance. The better the workflow design, the more it minimizes errors and unnecessary communication, and therefore the more it improves productivity.


There are many similarities between VR and VFX pipelines. The main difference, in the case of Felix & Paul Studios, comes from the fact that the studio provides a complete solution, not only post-production. At every step (shooting, data transfer, post, integration, the player application), the pipeline is called on to solve problems and design workflows that have no equivalent in VFX. Every project also has its own challenges, whereas VFX workflows change less from one project to another.


What inspires you/your teams in your work?


Hans: Immersive media is not like watching TV or going to the movies; it's a different experience. The emotional connection is different than in any other storytelling medium. You can put on a headset and feel like you travelled for 15 minutes to the other side of the world and back. The novelty of creating this type of immersive content is very inspiring.


I am also proud to be part of a studio that John Carmack, the CTO of Oculus, has called the “gold standard of cinematic virtual reality.” The studio sets the quality of our experiences at the highest level in every step of our workflow, and the reaction from people viewing our experiences is nothing less than amazement. It is also exciting to challenge people's assumptions about VR. When people think about VR, they don't instinctively think of cinematic storytelling, but through our work we've opened the door to a different type of VR experience. I believe everyone in the company is proud of that.


I am also driven by a studio that is constantly evolving and changing. There is always another door to open. Yesterday it was VR (and there are still many doors we plan to open there), but today it also includes AR and MR. Felix & Paul Studios is all about enriching narrative experiences through interactive content, and we will apply the same finesse and level of narrative from cinematic VR to our 3D interactive work as well. There's always something to look forward to.




What’s next for Felix & Paul Studios? Are there any upcoming projects you can mention?


Sebastian: In addition to the ISS Experience mentioned earlier, in the next few weeks we will announce our first mixed reality project targeting Magic Leap, which I believe will be a game changer. We’re proud to be selected by Magic Leap as part of their inaugural class of content creators, and look forward to using augmented reality to further expand how stories are told.