Premiere Pro v24.3 is out (& AI preview!)

Adobe released a new version of Premiere Pro, and Premiere Gal did an excellent overview of its new features, as always.

However, Adobe seems more interested in teasing generative AI (everyone's favorite tech buzzword), similar to what it did with Photoshop.

The demo looks promising, but will it be production-ready? If you have used Photoshop's Generative Fill as of today, you know the answer.

It may be good enough initially for ideation, previews, or quick fixes for minor issues that used to require time or money. Generative AI is ultimately another tool in visual effects, but people seem to struggle with the idea of it, let alone defining its use beyond generating pretty images that look almost real (uncanny valley, anyone?). In my opinion, this technology needs more time & money spent on development than on adoption at this stage.

On a related note, I've always thought of Morph Cut as a generative AI function; it creates frames where there were none. It also wades into a morally gray area, like creating a Frankenbite, especially in documentaries, yet not many seem to be bothered by it. I foresee the same happening with generative AI once it's more widely adopted and used.

In conclusion, we’re entering a whole new world/era of visual storytelling where you can’t be certain the image you see is 100% real, just like when Photoshop was introduced decades ago.

How Animated Movies Are Edited (featuring Batman)

Editor Andy Young presents informative & hilarious takes on how he utilized Adobe Productions for Premiere Pro to edit the feature animation, Merry Little Batman. It's a must-watch if you're an editor aspiring to edit animation, like yours truly.

I'm a little surprised that he didn't mention other Adobe programs like Photoshop or After Effects, which are often cited as another strong reason to edit in Premiere Pro, thanks to Dynamic Link.

However, his interview on editing Harley Quinn season 3 does mention them, so he may have left them out of this presentation to focus on the editing itself.

Editing the Billions Club episode featuring Jung Kook #TimelineTuesday

As you may know, I recently edited the latest episode of Billions Club: The Series for Spotify, featuring Jung Kook. It was a fun editing experience that posed interesting challenges because it's a 9x16, vertical video. (I hadn't worked in this format for a while, despite its popularity on Instagram, TikTok, YouTube Shorts, etc.)

So it’s my pleasure to share my first #TimelineTuesday of 2024, below.

The final Premiere Pro timeline of Spotify | Billions Club: The Series featuring Jung Kook

I figured this timeline setup requires some context because it's not just cuts of video & audio.

  • V1: A solid red background to check whether a video covers the full 9x16 frame. If it doesn't, the blaring red is obvious at a glance.

  • V4: An adjustment layer for the temp Alexa LUT. All the graphic elements sit above this track.

  • Top track: The English subtitle track. The dialogue is entirely in Korean, so I translated & subtitled it. Underneath the subtitle track were four tracks of Instagram & TikTok UI guides, used to position the subtitles as well as the graphic elements; the guides were removed before the turnover to the colorist.


I knew this would be the most popular video I've edited by far, and yet the numbers surprised me.

44.2 million views on Instagram in 2 weeks! (14.2 million views on TikTok as of today, 01/02/2024)

Fun fact: As I was working on the first cut of this episode, Lisa's episode dropped, and I saw that it starts with a shot similar to the one I had at the moment: the star pops out from the left side of the frame. Because Jung Kook's was the next episode in the series, I couldn't start it with that shot. Haha…

So the first shot is his thumbs-up, as you see now, and his entrance shots come later in the episode, after the establishing NYC shots. I had used more of his thumbs-up shots to indicate the end of a segment, like a period at the end of a sentence, but they were removed as they felt repetitive and redundant.


All in all, I look forward to more shots at editing 9x16 vertical videos, as well as more projects with K-Pop stars, in 2024. Fighting!

An Apple Event Shot on iPhone

As you've probably heard/seen already, Apple held its first evening keynote/event on October 30th to announce the M3 Macs.

I’m already asking: M3 Ultra, when?

But the real surprise, or One More Thing in Apple parlance, was the last card at the end of the event stream.

It almost overshadowed the new M3 chips & Macs and proceeded to generate discourse on what Apple really meant by “shot on iPhone.”

Apple, the master marketer of technology, released the following video as if it were expecting that discourse. The video reveals the production & post-production workflows of its events for the first time, because it highlights Apple's biggest product, even if it steals the spotlight away from the new Macs. The accompanying article goes into more detail with behind-the-scenes photos.

The biggest takeaway for many, most notably post-production professionals using Final Cut Pro X, was that Apple didn't use its own NLE software to make its biggest videos. Haha…

Its post-production workflow appears to be (and remains, I might add) Premiere Pro + After Effects (not shown in the video), with color + finishing in DaVinci Resolve.

However, I wasn't the least bit surprised, because I worked at RadicalMedia, the production company behind the Apple virtual keynotes & presentations since their inception (due to the pandemic). It may not have been public knowledge until Apple acknowledged & credited the production company in the video. No, unfortunately I didn't get a chance to work on it, but I've talked to coworker(s) about their experience (without knowing that I would be working on an Apple TV+ project later).

I know that the production chose Premiere Pro to edit, not FCP X or Avid, because it offered faster GFX & VFX turnarounds with After Effects integration.

A real surprise to me was the new Blackmagic Camera app, because the first Shot on iPhone 15 commercial was the Olivia Rodrigo music video, which used the default iPhone camera app.

Either the Blackmagic Camera app wasn't available at the time of filming, or Apple didn't want to show another company's app in a commercial aimed at average consumers.

All in all, this has been an interesting turn of events for video professionals, as Apple embraces and incorporates more professional features, gear, and apps to highlight its growing platforms, rather than pushing/forcing its own solutions.

Because at the end of the day, Not Invented Here is a dangerous path for a big tech company like Apple.

p.s.

Stu Maschwitz wrote a more in-depth article about what does and doesn't matter about Apple shooting the event with the iPhone 15 Pro Max.

“Apple wants to sell iPhones, and to accomplish this, they spared no expense to put the very highest image quality on the screen.”

Yes.

Latest Updates to Premiere Pro & Avid Media Composer

The end of 2022 is upon us, but the post (software updates) doesn't stop!

You can learn about the latest updates to Premiere Pro & Avid Media Composer at your leisure with the following videos.

As always, Premiere Gal has a great overview of the latest updates to Premiere Pro.

Avid pushed a great update for Media Composer before the end of 2022.

Jack, the Avid Assistant, has a more detailed look into the latest Avid Media Composer update.