Color Pipeline: Virtual Roundtable

This is a great article on the latest color technologies and workflows, featuring not just colorists but also software developers, vendors, and device manufacturers. I’ve pulled some quotes from it that I found fascinating.


On Marvel Studios’ finishing setup:

“We use Blackmagic DaVinci Resolve as our central hub of finishing tools, along with Boris FX’s Sapphire and Neat Video. We have also developed our own proprietary tool called Jarvis that can take control of Resolve to perform many editorial tasks, CDL management, and media management, and can pull on the vast amount of metadata that exists within Marvel to streamline our workflow.”
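
Jarvis itself is proprietary, but Resolve does ship a Python scripting API that exposes exactly this kind of remote control. As a rough idea of what “taking control of Resolve” for CDL management can look like, here is a minimal sketch. It assumes Resolve is running with external scripting enabled; the track index and CDL numbers are placeholders, not anything from Marvel’s tool.

```python
# Sketch: driving a running DaVinci Resolve session from Python via its
# scripting API, the kind of control an in-house tool layers logic on top of.
# Assumes external scripting is enabled and DaVinciResolveScript is importable;
# the CDL values below are placeholders.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
timeline = resolve.GetProjectManager().GetCurrentProject().GetCurrentTimeline()

# Walk the clips on video track 1 and push a per-shot CDL onto node 1.
for item in timeline.GetItemListInTrack("video", 1):
    print("Applying CDL to", item.GetName())
    item.SetCDL({
        "NodeIndex": "1",
        "Slope": "1.0 1.0 1.0",
        "Offset": "0.0 0.0 0.0",
        "Power": "1.0 1.0 1.0",
        "Saturation": "1.0",
    })
```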


Using LUTs for projects:

“Yes, we create show LUTs for every project. To be specific, we design our looks within the ACES framework as LMTs, which represent the visual characteristics that the filmmakers want — contrast, tonality, gamut mapping, etc. — while CDLs are used and tracked throughout the project for per-shot color choices.”
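
Worth spelling out what a CDL actually carries per shot: ten numbers (slope, offset and power per RGB channel, plus one saturation value), applied as out = (in × slope + offset)^power followed by a saturation blend around Rec.709 luma. A minimal NumPy sketch with illustrative values, not taken from any real show:

```python
# Minimal sketch of the ASC CDL transform that gets tracked per shot:
# slope, offset and power per RGB channel, plus a single saturation value.
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation):
    """rgb: float array of shape (..., 3) in a 0-1 working range."""
    out = rgb * slope + offset
    out = np.maximum(out, 0.0) ** power              # clamp negatives before the power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])  # Rec.709 weights, per the ASC CDL spec
    return luma[..., None] + saturation * (out - luma[..., None])

# Illustrative values only.
grey = np.array([[0.18, 0.18, 0.18]])
print(apply_cdl(grey,
                slope=np.array([1.1, 1.0, 0.95]),
                offset=np.array([0.02, 0.0, -0.01]),
                power=np.array([1.0, 1.0, 1.05]),
                saturation=0.9))
```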


On using AI:

“Resolve has some interesting tools that have a machine learning component to them, but Marvel Finishing is not using AI in the way people are talking about it today… Our focus is on quality, and I have not yet seen any AI tools that have helped improve the quality of work. Speed maybe, but not quality.”

“Lately, I have been tinkering with an open-source library called YOLO (You Only Look Once). This software was originally developed for autonomous driving, but I found it useful for what we do in color. Basically, it’s a very fast image segmenter. It returns a track and a matte for what it identifies in the frame. It doesn’t get everything right all the time, but it is very good with people, thankfully. You wouldn’t use these mattes for compositing, but they are great for color, especially when used as a garbage matte to key into.”
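
The quote doesn’t say which YOLO variant or wrapper is in play; as one concrete way to get the kind of rough person matte described above, here is a sketch using the open-source Ultralytics package and a COCO-trained segmentation model. The weights file, frame path and person class id are assumptions for illustration, not details from the article.

```python
# Sketch: pulling a rough "person" garbage matte out of a YOLO segmentation
# model. Model weights, file names and the person class id (0 in the
# COCO-trained models) are assumptions for illustration.
import numpy as np
from ultralytics import YOLO

model = YOLO("yolov8n-seg.pt")           # small COCO-trained segmentation model
result = model("frame_0001.png")[0]      # one frame; a clip path also works

matte = None
if result.masks is not None:
    masks = result.masks.data.cpu().numpy()               # (N, H, W) soft masks
    classes = result.boxes.cls.cpu().numpy().astype(int)  # class id per detection
    person = masks[classes == 0]                          # keep only "person" hits
    if len(person):
        # Union of all person masks gives one 8-bit matte. Note it comes back
        # at the model's inference resolution, so it would be scaled to the
        # frame size before being used as a garbage matte to key into.
        matte = (person.max(axis=0) > 0.5).astype(np.uint8) * 255
```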

“Some of the AI tools that colorists can use with DaVinci Neural Engine include our new Relight FX tool, which lets you add virtual lighting to a scene to creatively adjust environmental lighting, fill dark shadows or change the mood. There is also the new SuperScale upscaling algorithm that creates new pixels when increasing the resolution of an image. And there are a number of other AI features around facial recognition, object detection, smart reframing, speed warp retiming, auto color and color matching.”

“Right now, we are not offering any AI-enhanced tools for virtual production inside Live FX. Reason for that being that in many cases, you want 100% reproducible scenarios, which is often not possible with AI — especially on the creative side of things… We really see AI as being a technology that does not replace storytelling. It advances capabilities.”


On remote monitoring:

“We use ClearView Glow, which allows us to send an HDR feed to filmmakers all over the world. We recommend people watch using the latest iPad Pro, which is a very good match for our reference displays. However, I will always prefer to have people in the room. So much of our job is reading the room; it’s much harder when that room is a small box on a screen.”

“At the moment, my preference is Streambox. It checks all the boxes, from 4K to HDR. For quick approvals, ClearView is great because all we need on the client side is a calibrated iPad Pro.”


On cloud workflow:

“I can envision a future where we send a calibrated Sony X3110 to a client and then use Baselight in the cloud to send JPEG XS straight to the display for remote approvals. It’s a pretty slick workflow, and it also gets us away from needing the big iron to live on-prem.”


On beyond HDR:

“The XDR technology that Apple is driving with their Pro Display XDR is very interesting. Seeing an image’s true color and working in a format that is similar to how human eyes see an object is very exciting. But HDR and SDR workflows are still going to be around and needed for a long time.”


On color in virtual production:

“Color is becoming a much broader aspect in virtual production. It is not only about the color of the digital content provided anymore. It is also about the color capabilities of the LED wall, the LED processor, the monitoring on-set and, of course, accurate color representation of the stage lighting.

“Where it becomes challenging is when you have to connect all the different hardware and software parties in a virtual production workflow into one coherent color pipeline.”
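
The quote doesn’t name a mechanism for tying those pieces together. In practice, a shared OpenColorIO config is one common way to keep the rendered content, the LED processor and the on-set monitoring in the same pipeline; here is a minimal sketch with PyOpenColorIO. The config path and the colorspace and display names are assumptions that would come from the show’s actual config.

```python
# Sketch: one shared OpenColorIO config as the single source of truth for a
# virtual production color pipeline. The config path and the colorspace and
# display names are placeholders; a real show would define its own.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("show_config.ocio")

# Same config, two endpoints: content rendered in ACEScg, one processor for
# the LED wall and one for the on-set reference monitor.
wall_proc = config.getProcessor("ACEScg", "LED Wall - Rec.2020 PQ")
mon_proc = config.getProcessor("ACEScg", "Rec.709 - Display")

# CPU processors for a quick sanity check on a single pixel.
wall_cpu = wall_proc.getDefaultCPUProcessor()
mon_cpu = mon_proc.getDefaultCPUProcessor()

pixel = [0.18, 0.18, 0.18]                # scene-linear mid grey
print("wall:", wall_cpu.applyRGB(list(pixel)))
print("monitor:", mon_cpu.applyRGB(list(pixel)))
```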