DaVinci Resolve 18.6 is OUT!

In less than a month since the 18.5 release, Resolve 18.6 is out (see the PDF for the list of new features).

Many of the quality-of-life improvements are welcome, but I'm most pleasantly surprised by how quickly Blackmagic Design incorporated Apple ProRes Log into this release.

It's not a new software release without AI these days. These seven AI (excuse me, Neural Engine) features will save you time and enhance your colors and looks.

It's great to see Fusion getting attention with new features and improvements in this release, because I've always felt that Fusion was an afterthought, a checkbox, in DaVinci Resolve's toolset. I'm trying to do more in Fusion that I would previously have done in After Effects, for example.

It's a lot to chew on for a .1 release, but that's the nature of this cutting-edge technology/business. Soldier on!

AI in Avid MediaCentral

Avid is swiftly developing and incorporating the latest AI capabilities into its software, and one of its demo videos caught my attention.

Although it's not for Media Composer yet, I can see these features porting over soon.

AI Color Matching (using AtomX Cast from ATOMOS)

With all the hype around AI for post-production, AI for color correction is evolving slower than expected. As we all know, Shot Match to This Clip in DaVinci Resolve rarely works, if ever. For all the AI capabilities added to recent DaVinci Resolve releases, this function has hardly been improved upon, or even mentioned. (I still try it from time to time, just for kicks.)

So I was pleasantly surprised when ATOMOS unveiled AI color matching in AtomX Cast. Although I'm only taking their word for it from the demo videos below, this feature is desperately needed in DaVinci Resolve.

I wonder if working with raw camera signals helps AI understand color better than video data wrapped in various media codecs does.

Have you found that this app works as the videos claim? Please share your thoughts if you've used it.

Color Pipeline: Virtual Roundtable

This is a great article on the latest color technologies and workflows, featuring not just colorists but also software developers, vendors, and device manufacturers. I've pulled some quotes from the article that I found fascinating.


On Marvel Studios’ finishing setup:

“We use Blackmagic DaVinci Resolve for our central hub of finishing tools as well as Boris FX’s Sapphire and Neat Video. We have also developed our own proprietary tool called Jarvis that can take control of Resolve to perform many editorial tasks, CDL management and media management and can pull on the vast amount of metadata that exists within Marvel to streamline our workflow.”


Using LUTs for projects:

“Yes, we create show LUTs for every project. To be specific, we design our looks within the ACES framework as LMTs, which represent the visual characteristics that the filmmakers want — contrast, tonality, gamut mapping, etc. — while CDLs are used and tracked throughout the project for per-shot color choices.”


On using AI:

“Resolve has some interesting tools that have a machine learning component to them, but Marvel Finishing is not using AI in the way people are talking about it today… Our focus is on quality, and I have not yet seen any AI tools that have helped improve the quality of work. Speed maybe, but not quality.”

“Lately, I have been tinkering with an open-source library called YOLO (You Only Look Once.) This software was originally developed for autonomous driving, but I found it useful for what we do in color. Basically, it’s a very fast image segmenter. It returns a track and a matte for what it identifies in the frame. It doesn’t get everything right all the time, but it is very good with people, thankfully. You wouldn’t use these mattes for compositing, but they are great for color, especially when used as a garbage matte to key into.”
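The idea above, using a segmentation mask not as a compositing matte but as a forgiving garbage matte to key into, can be sketched in a few lines. This is my own illustration, not code from the article: I assume the person mask has already come out of a YOLO-style segmenter as a 2-D array of 0/1 values, and the hypothetical `garbage_matte` helper simply grows and feathers it so a color key falls off gently past the segmentation edge.

```python
import numpy as np

def garbage_matte(mask: np.ndarray, grow: int = 8) -> np.ndarray:
    """Expand a binary subject mask into a soft garbage matte.

    `mask` is a 2-D array of 0/1 values, e.g. one frame's person mask
    from an image segmenter; `grow` is the dilation radius in pixels.
    A grown, feathered matte keys more forgivingly than the raw
    segmentation edge, which is what matters for color work.
    """
    m = mask.astype(float)
    h, w = m.shape

    # Naive dilation: max over a (2*grow+1)-pixel neighborhood.
    padded = np.pad(m, grow, mode="edge")
    dilated = np.zeros_like(m)
    for dy in range(-grow, grow + 1):
        for dx in range(-grow, grow + 1):
            dilated = np.maximum(
                dilated,
                padded[grow + dy:grow + dy + h, grow + dx:grow + dx + w],
            )

    # Simple box-blur feather so the key falls off gradually.
    k = 2 * grow + 1
    padded = np.pad(dilated, grow, mode="edge")
    soft = np.zeros_like(dilated)
    for dy in range(-grow, grow + 1):
        for dx in range(-grow, grow + 1):
            soft += padded[grow + dy:grow + dy + h, grow + dx:grow + dx + w]
    return soft / (k * k)
```

In practice you would run this per frame on the segmenter's output and feed the result to the grade as a power window or external matte; the sloppy edges are fine precisely because, as the quote says, these mattes are for color, not compositing.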

“Some of the AI tools that colorists can use with DaVinci Neural Engine include our new Relight FX tool, which lets you add virtual lighting to a scene to creatively adjust environmental lighting, fill dark shadows or change the mood. There is also the new SuperScale upscaling algorithm that creates new pixels when increasing the resolution of an image. And there are a number of other AI features around facial recognition, object detection, smart reframing, speed warp retiming, auto color and color matching.”

“Right now, we are not offering any AI-enhanced tools for virtual production inside Live FX. Reason for that being that in many cases, you want 100% reproducible scenarios, which is often not possible with AI — especially on the creative side of things… We really see AI as being a technology that does not replace storytelling. It advances capabilities.”


On remote monitoring:

“We use ClearView Glow, which allows us to send an HDR feed to filmmakers all over the world. We recommend people watch using the latest iPad Pro, which is a very good match for our reference displays. However, I will always prefer to have people in the room. So much of our job is reading the room; it’s much harder when that room is a small box on a screen.”

“At the moment, my preference is Streambox. It checks all the boxes, from 4K to HDR. For quick approvals, ClearView is great because all we need on the client side is a calibrated iPad Pro.”


On cloud workflow:

“I can envision a future where we send a calibrated Sony X3110 to a client and then use Baselight in the cloud to send JPEG XS straight to the display for remote approvals. It’s a pretty slick workflow, and it also gets us away from needing the big iron to live on-prem.”


On beyond HDR:

“The XDR technology that Apple is driving with their Pro Display XDR technology is very interesting. Seeing an image’s true color and working in a format that is similar to how human eyes see an object is very exciting. But HDR and SDR workflows are still going to be around and needed for a long time.”


On color in virtual production:

“Color is becoming a much broader aspect in virtual production. It is not only about the color of the digital content provided anymore. It is also about the color capabilities of the LED wall, the LED processor, the monitoring on-set and, of course, accurate color representation of the stage lighting.

Where it becomes challenging is when you have to connect all the different hardware and software parties in a virtual production workflow into one coherent color pipeline.”