4:3 programs upscaled to HD or 4K: technical & aesthetic decisions
https://www.provideocoalition.com/43-programs-upscaled-to-hd-or-4k-technical-aesthetic-decisions/
Wed, 01 Jan 2025

I have previously covered some of the options when upscaling 4:3 programs to HD or 4K while retaining the original aspect ratio. One of those decisions is whether to eliminate headswitching via cropping or masking (or not). Another is whether to deliver the 4:3 material with a presumed pillarbox… or to place it inside the image of a classic CRT TV set. In this new article, I’ll cover the remaining decision when a presumed pillarbox is desired: the actual resolution of the deliverable file. For example, if a 4:3 program is to be upscaled to 1080p while retaining the original aspect ratio with a presumed pillarbox, the 1080 refers to 1080 vertical pixels, not horizontal pixels. We actually have two options in this case: 1440×1080 with all pixels active and square on a borderless canvas… or a 1920×1080 canvas where the pillarbox is actually part of the canvas. Let’s explore the pros and cons of each.
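To make the geometry concrete before comparing the two, here is a minimal Python sketch (my own illustration, not part of any particular tool’s workflow) that computes both delivery options for a given vertical resolution; running it with 2160 lines gives the equivalent 4K numbers.

```python
# Minimal sketch: compute the two delivery geometries for a 4:3 program
# delivered with square pixels at a given vertical resolution.

def pillarbox_options(target_height):
    active_width = target_height * 4 // 3    # 4:3 image width, e.g. 1080 -> 1440
    canvas_width = target_height * 16 // 9   # 16:9 canvas width, e.g. 1080 -> 1920
    bar_width = (canvas_width - active_width) // 2   # pillarbox bar per side
    return active_width, canvas_width, bar_width

for height in (1080, 2160):
    active, canvas, bar = pillarbox_options(height)
    print(f"{height} lines: borderless {active}x{height}, "
          f"pillarboxed {canvas}x{height} with {bar}-pixel bars per side")
```

For 1080 lines this prints 1440×1080 versus 1920×1080 with 240-pixel bars per side; for 2160 lines it prints 2880×2160 versus 3840×2160 with 480-pixel bars per side.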

Option A: 1440×1080 (all pixels active, all square) with a borderless canvas
Option B: 1920×1080 canvas, where the pillarbox is part of the canvas, plus the 1440×1080 image in the center of the canvas

File size
  • Option A: lower (more efficient)
  • Option B: higher (less efficient)

Compatible with YouTube?
  • Option A: Not officially documented, but currently works, per the great example below.
  • Option B: Of course! Also adds the option of a logo in the pillarbox, like the wonderful example below.

Compatible with Netflix?
  • Option A: Many complaints: Netflix currently stretches 4:3 shows to 16:9.
  • Option B: Of course!

Compatible with Amazon Prime Direct?
  • Option A: Per the official documentation, 4:3 1440×1080 is not supported (by omission), although 4:3 640×480 is. I have not tested it personally.
  • Option B: Of course!

Compatible with set-top boxes and HDTV sets playing from a USB memory stick?
  • Option A: It varies. Some play it correctly; others misinterpret it as anamorphic and stretch it to 16:9.
  • Option B: Of course! Also adds the option of a logo in the pillarbox.

 

Great example of 1440×1080 (4:3) on YouTube

Despite the 16:9 thumbnail, the above video’s YouTube stats indicate that it is 1440×1080. When you play it, you’ll see that it’s 4:3.

 

Wonderful example of 4:3 (1440×1080) on a 1920×1080 canvas on YouTube

[Embedded YouTube video]

The above video has a Norman Lear Effect logo in the canvas, in the pillarbox.

 

Feasibility of getting a deliverable file at 4:3 (1440×1080) w/square pixels

As I have explained, there are many advantages and disadvantages to delivering either 4:3 content on top of a wider 1920×1080 canvas with a pillarbox… or a pure 4:3 (1440×1080) file with square pixels, depending upon the delivery venue. If you would like to use a pure 4:3 (1440×1080) square-pixel file, hopefully your software will be able to deliver it properly, without ruining it. However, if your software (like much software) won’t give you that and you are forced to create an intermediary file with the 4:3 content on top of a wider 1920×1080 canvas, you may consider using the Apple Photos app on macOS to crop that intermediary video file to the desired dimensions. At first, it might seem wrong to use the Apple Photos app on macOS for video files, but Gary from MacMostVideo proves that it works quite well in the following video!

[Embedded video: MacMostVideo tutorial on cropping video in Apple Photos]

At 3:26 of the above video, you will see that there is even a 4:3 cropping preset in Apple Photos for macOS to facilitate this process. (Gary mentions the square cropping preset, but the 4:3 cropping preset is also visible there.) If you click on the 4:3 cropping preset, your cropping tool will be locked to that aspect ratio to make it extremely simple.

If you know of any similar app that does true video cropping for Linux or Windows (truly changing the dimensions), please share its name in the comments below this article.
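In the meantime, if a command-line route is acceptable on Linux or Windows, one hedged option is ffmpeg’s crop filter. The sketch below is only an illustration: the file names are hypothetical, it assumes the intermediary file is a 1920×1080 export with the 4:3 image centered (240-pixel bars on each side), and the encoding settings should be adjusted to match your delivery spec.

```python
# Minimal sketch: call ffmpeg to crop a 1920x1080 pillarboxed intermediary
# down to a pure 1440x1080 square-pixel file. Assumes ffmpeg is installed.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "intermediary_1920x1080.mp4",   # hypothetical input name
    "-vf", "crop=1440:1080:240:0,setsar=1",         # keep the centered 4:3 image, flag square pixels
    "-c:v", "libx264", "-crf", "18",                # example encode settings; adjust to taste
    "-c:a", "copy",                                 # pass the audio through untouched
    "output_1440x1080.mp4",                         # hypothetical output name
], check=True)
```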

 

Related articles

 

(Re-)Subscribe for upcoming articles, reviews, radio shows, books and seminars/webinars

Stand by for upcoming articles, reviews, books and courses by subscribing to my bulletins.

In English:

En castellano (in Spanish):

Most of my current books are at books.AllanTepper.com, and also visit AllanTepper.com and radio.AllanTepper.com.

FTC disclosure

RØDE has not paid for this article. RØDE has sent Allan Tépper units for review. Some of the manufacturers listed above have contracted Tépper and/or TecnoTur LLC to carry out consulting and/or translations/localizations/transcreations. So far, none of the manufacturers listed above is/are sponsors of the TecnoTur, BeyondPodcasting, CapicúaFM or TuSaludSecreta programs, although they are welcome to do so, and some are, may be (or may have been) sponsors of ProVideo Coalition magazine. Some links to third parties listed in this article and/or on this web page may indirectly benefit TecnoTur LLC via affiliate programs. Allan Tépper’s opinions are his own. Allan Tépper is not liable for misuse or misunderstanding of information he shares.

After Effects News December 2024
https://www.provideocoalition.com/after-effects-news-december-2024/
Wed, 01 Jan 2025

After Effects 25.1 is the current version, with no big recent announcements. The December 2024 (25.1) release includes important fixes, and Adobe listed remaining Known issues in After Effects.

New features include:

  • Accepts Lights switch for 3D layers
  • 3D model preview thumbnail
  • Cinema 4D 2025 upgrade in After Effects

After Effects 25.2 is in Beta; current features include:

  • Customize Transparency Grids
  • Customize panel background colors
  • Change anchor point in a more efficient way

 

PVC staff posted Looking back on 2024 and ahead to 2025 – A PVC Roundtable Discussion. After Effects specialist Chris Zwar bucked the crowdsourced opinion on HDR, saying that the spread of HDR & ACES has helped After Effects become increasingly accepted as a tool for high-end VFX. Unfortunately, HDR adoption is slow.

Chris shared related information in his video article Color Management Part 25: Corporate Brand Colors and ACES.

 

Envato Video looked at Motion Design Trends for 2025, hosting four designers to learn what’s inspiring them, what’s driving them crazy, and how motion graphics designers can stay at the top of their game.

 

The School of Motion yearly summary is up, links and all. See the 15-minute sample, but it’s worth catching the entire 6 1/2 hours. Here’s Highlights from our EPIC 2024 Year-End Roundup Podcast! Who are those pixies working for? Could it be…..?

 

 

Jake In Motion posted 2 substantial tutorials in December: Advanced CC Ball Action Techniques /// After Effects Tutorial and Advanced Grunge Texture /// After Effects Tutorial.

 

 

Noble Kreative used CC Ball Action for 3D Procedural Audio Visualizer in After Effects and Procedural Fiery Flames and Wavy Backgrounds in After Effects.

 

 

Film Riot posted Insane Asteroid Impact in After Effects. They heavily massaged stock footage to get there.

 

 

ukramedia is back with The After Effects Marker Trick Nobody Talks About! See how to control multiple elements across After Effects compositions using a single marker. You can set up markers to manage text, colors, and other properties efficiently with expressions, to simplify updating multiple items at once, saving time on complex projects. By automating changes with markers, you can streamline your workflow and avoid repetitive manual adjustments across compositions.

 

 

Stephan Zammit makes headway on creating 3D glass in Realistic 3D Motion Graphics in After Effects Advanced Tutorial.

 

 

One of several recent tutorials on the theme is Premiere Gal’s AWESOME Futuristic HUD Effects in After Effects.

 

 

Boone Loves Video adds a quick 3D airplane for a map animation with Working with 3D Models in Adobe After Effects 2025! 🗺.

 

 

After Effects Basics shared 10 After Effects HACKS Every Motion Designer NEEDS!

Still going strong, Eran Stern upped the ante with 10 Ae Techniques You’ve Never Seen!

 

 

And, Texturelabs is back with 21 PHOTOSHOP TIPS – Easy Through Advanced! Meanwhile, in dekeNow’s 7 Techniques That Will Change How You Use Photoshop Forever, the first tip (Auto Color Options) immediately impressed.

 

 

Check out ObjPop Beta from the Plugin Play toolset for AE in Instant Image to 3d model in After Effects. There are already tools elsewhere that build 3D animations from video footage, but this one works inside AE.

 

 

If you’ve dragged your feet on this, Default Cube says It’s Time For Gaussian Splatting // Tutorial. That one uses Blender, but there’s more on Gaussian splats in AE in After Effects Roundup October 2024.

 

 

Javier Mercedes shows you how to Reveal Text From Behind Object (Premiere Pro and After Effects).

 

 

Adobe Spectrum is the name for the new UI style in Creative Cloud. VideoRevealed has the justification for Premiere Pro (without nagging details) in Spectrum – Adobe’s Design System!

 

 

VFX and Chill hosted Talking Animals in After Effects (with JASON MURPHY). From talking dogs to practical puppets to CGI aliens, they chat about how Jason Murphy found himself directing movies full of so many visual effects shots. “How did After Effects fit into a production pipeline?”

 

 

Josh Toonen explains Why I Quit After Effects (to work in Hollywood VFX). You’re supposed to delete AE and learn Nuke for free (course fees not included). But really, you can use Nuke for free and, apparently, Unreal Engine is useful.

Looking back on 2024 and ahead to 2025 – A PVC Roundtable Discussion
https://www.provideocoalition.com/looking-back-on-2024-and-ahead-to-2025-a-pvc-roundtable-discussion/
Wed, 01 Jan 2025


While the hosts of the Alan Smithee podcast already discussed the evolving landscape in media and entertainment as 2024 draws to a close, there’s so much more to say about what happened in 2024 and what 2025 has in store for individual creators and the entire industry. Generative AI for video is everywhere, but how will that pervasiveness impact actual workflows? What were some of the busts in 2024? How did innovations in cameras and lenses make an impact? And what else are we going to see in 2025 beyond the development of AI animation/video tools?

Below is how various PVC writers explored those answers in a conversation that took shape over email. Keep the conversation going in the comments or on LinkedIn.

 

Scott Simmons

2024? Of course, the first thing you think about when recapping 2024 (and looking ahead to 2025) is that it was all about artificial intelligence. Everywhere you look at technology, media creation, and post-production, there is some mention of AI. The more I think about it, though, the more I feel like it was just a transitional year for AI. Generative AI products feel like old hat at this point. We’ve had “AI” built into our editing tools for what feels like a couple of years now. While we have made a lot of useful advancements, I don’t feel like any earth-shattering AI products shipped within any of our editing tools in 2024. Adobe’s Generative Extend in Premiere Pro is probably the most useful and most exciting AI advancement I’ve seen for video editors in a long time. But it’s still in beta, so Generative Extend can’t count until it begins shipping. The AI-based video search tool Jumper did ship, and it is a truly useful third-party tool. Adobe also dropped their “visual search” tool within the Premiere beta, so we know what’s coming next year as well, but it’s far from perfect at the time of this writing, and still in beta. I appreciate an AI tool that can help me search through tons of media, but if that tool returns too many results, I’m yet again flooded with too much information.

The other big AI-based video advancement is text-based generative video coming into its own this year. Some products shipped, albeit at quite a high price to make them truly usable. And even more went into preview or beta. 2025 will bring us some big advancements in generative video, and that’s because we’re going to need them. What we saw shipping this year was underwhelming. A few major brands released AI-generated commercial spots or short films, and they were both unimpressive and creepy. I saw a few generative AI short films make the rounds on social media, and all I could do after watching them was yawn. The people who seemed most excited by generative video (and most trumpeting its game-changing status) were a bunch of tech bros and social media hawks who didn’t really have anything to show other than more yelling on social media from their paid and verified accounts or their promoted posts.

Undoubtedly, AI will continue to infiltrate every corner of media creation. And if it can do things like make transcription more accurate or suggest truly usable rough cuts, then I think we can consider it a success. But for every minor workflow improvement that is actually useful in post-production, we’ll see two or three self-proclaimed game-changing technologies that end up just being … meh.

In the meantime, I’ll happily use the very affordable M4 Mac mini in the edit suite or a powerful M4 Mac Studio that speeds up the post-production process overall. We can all be thankful for cloud-based “hard drives,” like LucidLink, that do more for post-production workflow than most AI tools that have been thrown our way. Maybe 2025 will be the year of AI reality over AI hype.

Nikki Cole

While I’m aware that the issues we’ve been facing on the writing/producing/directing side don’t affect many of the tech teams on the surface, it has been rather earth-shattering on our side of things. Everyone fears losing their jobs to AI, and with good reason. I am in negotiations with a European broadcaster who really likes the rather wacky travel series I’m developing but, like so many broadcasters now, simply doesn’t have enough cash to fully commission it. They flat out told me that I would write the first episode, and they would simply feed the rest of the episodes into AI so they wouldn’t have to pay me to write more episodes. I almost choked when they said that matter-of-factly, and I responded by saying that this show is comedy and AI can’t write comedy! Their response was a simple shrug of the shoulders. Devastating for me, and with such an obvious lack of integrity on their part, I’m now concerned that they are currently going ahead with that plan and we don’t even have a deal in place. So, from post’s perspective, that is one more project that isn’t being brought into the suite, because I can’t even get it off the ground to shoot it.

As members of various guilds and associations, we are all learning about the magnificent array of tools we now have at our fingertips for making pitches, sizzle reels, visual references, etc. It really is astonishing what I can now do. I’m learning as much as I can, but I just can’t shake the guilt of knowing that I’ll be putting the great graphic designers I used to hire out of work. If budgets were a little better, I would, of course, hire a graphic designer with those skills, but as things stand today, I can’t afford to do that.

It’s definitely a fascinating and perplexing time!

Oliver Peters

AI in various forms gets the press, but in most cases it will continue to be more marketing hype than anything else. Useful? Yes. True AI? Often, no. However, tools like Jumper can be quite useful for many editors, although some aspects, like text search, have already existed for years in PhraseFind (Avid Media Composer).

There are many legal questions surrounding generative AI content creation. Some of these issues may be resolved in 2025, but my gut feeling is that legal claims will only just start rolling for real in this coming year. Vocal talent, image manipulation, written scripts, and music creation will all be areas requiring legal clarification.

On the plus side, many developers – especially video and audio plugin developers – are using algorithmic processes (sometimes based on AI) to combine multiple complex functions into simple one-knob style tools. Musik Hack and Sonible are two audio developers leading the way in this area.

One of the less glitzy developments is the future (or not) of post. Many editors in major centers for film and TV production have reported a lack of gigs for months. Odds are this will continue and not reverse in 2025. The role of (or even need for) the traditional post facility is being challenged. Many editors will need to find ways to reinvent themselves in 2025. As many businesses enforce return-to-office policies, will editors find remote work to be less acceptable to directors?

When it comes to NLEs, Adobe Premiere Pro and Avid Media Composer will continue to be the dominant editing tools when collaboration or project compatibility is part of the criteria. Apple Final Cut Pro will remain strong among independent content creators. Blackmagic Design DaVinci Resolve will remain strong in color and finishing/online editorial. It will also be the tool for many in social media as an alternative to Premiere Pro or Final Cut Pro.

The “cloud” will continue to be the big marketing push for many companies. However, for most users and facilities, the internet pipes still make it impractical to work effectively via cloud services with full-resolution media in real time. Full-resolution media is also getting larger, not lighter, which is not conducive to cloud workflows.

A big bust in this past year has been the Apple Vision Pro. Like previous attempts at immersive, 3D, and 360-degree technologies, there simply is no large, sustainable, mass market use case for it, outside of gaming or special venues. As others have predicted, Apple will likely re-imagine the product into a cheaper, less capable variant.

Another bust is HDR video. HDR tools exist in many modern cameras and even smartphones. HDR is a deliverable for Netflix originals and even optional for YouTube. Yet the vast majority of content that’s created and consumed continues in good old Rec 709. 2025 isn’t going to change that.

2025 will be a year when the rubber meets the road. This is especially true with Adobe, who is adding generative AI for video and color management into Premiere Pro. So far, the results are imperfect. Will it get perfect in 2025? We’ll see.

Iain Anderson

The last twelve months have brought a huge amount of change. Generative AI might have had the headlines, but simple, short clips don’t just magically scale up to a 30-second spot, nor anything longer. One “insane, mind-blowing” short that recently became popular couldn’t even manage consistent clothing for its lead, let alone any emotion, dialogue or plot. Gen AI remains a tool, not a complete solution for anything of value.

On the other hand, assistive AI has certainly grown a little this year. Final Cut Pro added automatic captioning (finally!) and the Magnetic Mask, Premiere Pro has several interesting things in beta, Jumper provides useful visual and text-based media search today, Strada looks likely to do the same thing in the new year, and several other web-based tools offer automatic cutting and organizing of various kinds. But I suspect there’s a larger change coming soon — and it starts with smarter computer-based assistants.

Google Gemini is the first of a new class of voice-based AI assistants which you can ask for help while you use your computer, and a demo showed it (imperfectly) answering questions about DaVinci Resolve’s interface. This has many implications for anyone learning complex software like NLEs, and as I make a chunk of my income from teaching people that, it’s getting personal. Still, training has been on the decline for years. Most people don’t take full courses, but just jump in and hit YouTube when they get stuck. C’est la vie.

While assistant AIs will become popular, AIs will eventually control our computers directly, and coders can get a taste of this today. Very recently, I’ve found ChatGPT helpful for creating a small app for Apple Vision Pro, for writing scripts to control Adobe apps, and also for converting captions into cuts in Final Cut Pro, via CommandPost. Automation is best for small, supervised tasks, but that’s what assistants do.

Early in 2025, an upgraded Siri will be able to directly control any feature that a developer exposes, enabling more complex interactions between apps. As more AIs become able to interpret what they see on our screens, they’ll be able to use all our apps quicker than we can. In video production, the roles of editor and producer will blur a little further, as more people are able to do more tasks without specialist help.

But AI isn’t the whole story here, and in fact I think the biggest threat to video professionals is that today, not as many people need or want our services. High-end production stalled with the pandemic and many production professionals are still short of work. As streaming ascends (even at a financial loss) broadcast TV is dying worldwide, with flow-on effects for traditional TV advertising. Viewing habits have changed, and will keep changing.

At the lower end, demand for quick, cheap vertical social media video has cut into requests for traditional, well-made landscape video for client websites or YouTube. Ads that look too nice are instantly recognised as such and swiped away, leading to a rise in “authentic” content, with minimal effort expended. It’s hard to make a living as a professional when clients don’t want content that looks “too professional”, and hopefully this particular pendulum swings back around. With luck, enough clients will realise that if everyone does the same thing, nobody stands out.

Personally, the most exciting thing this year for me is the Apple Vision Pro. While it hasn’t become a mainstream product, that was never going to happen at its current high price. Today, it’s an expensive, hard-to-share glimpse into the future, and hopefully the state-of-the-art displays inside become cheaper soon. It’ll be a slow road, and though AR glasses cannot bring the same level of immersion, they could become another popular way to enjoy video.

In 2024, the Apple Vision Pro was the only device to make my jaw drop repeatedly, and most of those moments have come from great 3D video content, in Immersive (180°) or Spatial (in a frame) flavors. Blackmagic’s upcoming URSA Cine Immersive camera promises enough pixels to accurately capture reality —  8160 x 7200 x 2 at 90fps — and that’s something truly novel. While I’m lucky to have an Apple Vision Pro today, I hope all this tech is in reach of everyone in a few years, because it really does open up a whole new frontier for us to explore.

P.S. If anyone would like me to document the most beautiful places in the world in immersive 3D, let me know?

Allan Tépper

In 2024, we saw more innovation in both audio-only and audio-video switchers/mixers/streamers and the democratization of 32-bit float audio recording and audio DSP in microphones and mixers. I expect this to continue in 2025. In 2024, both Blackmagic and RØDE revolutionized ENG production with smartphones. In 2024, I began my series about ideal digitization and conversion of legacy analog color-under formats including VHS, S-VHS, 8mm, Hi-8mm and U-Matic. I discussed the responsibility of proper handling of black level (pedestal-setup/7.5 or zero IRE) at the critical analog-to-digital conversion moment, proper treatment and ideal methods to deinterlace while preserving the original movement (temporal resolution). That includes ideal conversion from 50i to 50p or 59.94i to 59.94p as well as ideal conversion from non-square pixels to square pixels, upscaling to HD’s or 4K’s vertical resolution with software and hardware, preservation of original 4:3 aspect ratio (or not), optional cropping of headswitching, noise reduction and more. All of this will continue in 2025, together with new coverage of bitcoin hardware wallets and associated services.

In 2024, at TecnoTur we helped many more authors do wide distribution of their books, ebooks and audiobooks. We guided them, whether they wanted to use the author’s own voice, a professional voice or an AI voice. We  produced audiobooks in Castilian, English and Italian. We also helped them to deliver their audiobooks (with self distribution from the book’s own website) in M4B format with end-user navigation of chapters. I expect this to expand even more in 2025.

Brian Hallett

This past year, we saw many innovations in cameras and lenses. For cameras, we are now witnessing the market begin the movement to make medium-format acquisition easier for “everyday filmmakers” rather than just the top-tier. Arri announced its new ALEXA 265, a digital 65mm camera designed to be compact and lightweight. The new ALEXA 265 is one-third the size of the original ALEXA 65 while slightly larger than the still-new ALEXA 35. Yet, the ALEXA 265 is only available as a rental.

Regarding accessibility for filmmakers, the ALEXA 265 will not be easier to get one’s hands on; that will be reserved for the Blackmagic URSA CINE 17K 65. The Blackmagic URSA CINE 17K 65 is exactly the kind of camera Blackmagic Design and CEO Grant Petty want to get into the hands of filmmakers worldwide. Blackmagic Design has a long history of bringing high-level features and tools to cameras at inexpensive prices. It is the company that bought DaVinci Resolve and then gave it away for free with the purchase of a camera. They brought raw recording to inexpensive cameras early on in the camera revolution. Now, Blackmagic Design sees 65mm as the next feature, once reserved for the top-tier exclusive club of cinematographers, that it can deliver to everyone at a relatively decent price of $29,995.00, so expect to see the Blackmagic URSA CINE 17K 65 at rental houses sooner rather than later. I also wouldn’t let the 17K resolution bother you too much. Blackmagic RAW is a great codec that is a breeze to edit compared to processor-heavy compressed codecs.

We also saw Nikon purchase RED, but we have not yet seen cross-tech innovation between those companies. In years to come, we will see Nikon add RED tech to its Nikon cameras and vice versa.

Sony delivered the new Sony BURANO and I’m seeing the camera at rental houses now. More so, though, I see more owners/operators with the BURANO than anything else. It appears Sony has a great camera that will last a long time for owners/operators.

I feel like I saw a ton of new lenses in 2024, from ARRI’s new Ensō Primes to Viltrox’s anamorphic lenses. We see Chinese lenses coming in from every direction, which is good. More competition benefits all of us and keeps prices competitive. Sigma delivered their new 28-45mm f/1.8, the first full-frame f/1.8 maximum-aperture zoom lens. I tested this lens on a Sony mirrorless, and it felt like the kind of lens you can leave on all day and have everything covered. The depth of field was great in every shot. Sigma has delivered a series of lenses for mirrorless E-Mount and L-Mount cameras at an astounding pace, from 500mm down to 15mm.

Canon was miserly with their RF mount. To me, Canon is protecting its investment in its lens innovation by restricting who can make RF-mount lenses. I wish they wouldn’t do such a thing. It seems counter-intuitive to me to block others from making lenses that work on their new cameras. What has happened is that all those other lens makers are making lenses specific to E-Mount and L-Mount. In essence, if you are a BURANO shooter, you have more lenses available than a Canon C400 shooter. The story I tell myself is that if I had to buy a camera today, which lenses I could use would be a part of that calculus.

On artificial intelligence, we cannot discount how manufacturers use AI to innovate more quickly and shorten the timeframe from concept to final product while saving money. As a creative, I use AI and think of AI in this way: there will be creatives who embrace AI and those who don’t or won’t, and that will be the differentiator in the long run. I already benefit from AI, from initial script generation (which is only a starting point) to caption generation and transcription, to using it in Lightroom for photo editing.

Damien Demolder

The production of high-quality video used to be restricted by the cost of the kit needed and the skills required to operate that equipment. Those two things helped to regulate the number of people in the market. The last year, though, has seen a remarkable acceleration in the downward pricing trend of items that used to cost a lot of money, as well as an increase in the simplification and convenience of their handling. I tend to review equipment at the lower end of the price scale, an area that has seen a number of surprising products in the last twelve months. These days, anamorphic lenses are almost commonplace, hovering around the $1000 price point, and LED lights have simultaneously become cheaper and much more advanced.

Popular camera brands that until recently only dipped their toes into the video market now offer their own Log profiles, encourage their users to record raw footage to external devices and provide full 35mm-frame recording as though it is the expected norm. LUTs can be created in a mobile phone app and uploaded to cameras to be baked into the footage, and care-free 32-bit float audio can be recorded directly to the video soundtrack for a matter of a few hundred dollars and a decent mic. Modern image stabilisation systems, available in very reasonably priced mirrorless cameras, mean we can now walk and film without a Steadicam, and best-quality footage can be streamed to a tiny SSD for long shoots and fast editing. Earlier this year I reviewed a sub-$500 Hollyland wireless video transmitter system that, with no technical set-up, can send 4K video from an HDMI-connected camera to four monitors or recorders – or to your phone, where the footage can be recorded in FHD. I also reviewed the Zhiyun Molus B500 LED light that now provides 500W worth of bi-coloured illumination for less than $600, and the market is getting flooded with small, powerful bi- and full-colour LED lights that run on mini V-Lock batteries – or their own internal rechargeable batteries.

Now, a lot of these products aren’t perfect and have limitations, but no sooner have the early adopters complained about the faults in the short-run first batch than the manufacturers have altered the design and fixed the issue to make sure the second phase will meet, and often exceed, expectations. We can now have Lidar AF systems for manual lenses, autofocus anamorphics, cheap gimbals so good you’d think the footage was recorded with a drone – even lighting stands, heavy duty tripods and rigging gear are getting cheaper and better at the same time.

Of course, all this is great news for low-budget productions, students and those just starting out, but it also means anyone can believe they are a film maker. With the demand for video across social media platforms and websites ever increasing, you’d think that would be great news for the industry, but much of that demand is being eaten up by those with no formal learning and some clever kit. Not all the content looks or sounds very good, but often that matters less than keeping to a tiny budget. Those who think they are film makers can easily convince clients who can’t imagine what their film could look like that they are.

I expect 2025 will bring us more of this – better, more advanced and easier-to-use kit at lower prices, and more people using it. I didn’t need to consult my crystal ball for this prediction. Every year has brought the same gradual development since I joined the industry 28 years ago, but once again it has taken us to places I really hadn’t expected. I expect to be surprised again.

Nick Lear

I found 2024 to involve a lot of waiting. Waiting for the industry to right itself, waiting for the latest AI tool to come out of Beta, waiting for companies to put stability before bells and whistles. That, I fear, may be rather a long wait. I also found I had very mixed feelings about AI – on the one hand I was excited to see what advances technology could bring and on the other hand saddened by the constant signs that profit is put before people – whether in plagiarising artists’ work or the big Hollywood studios wanting the digital rights of actors.

Generative AI impresses me whenever I see it – and I think we have to acknowledge the rate of improvement in the last few years – but I also struggle to see where it can fit in my workflow. I am quite looking forward to using it in pre-production – testing out shots before a shoot or while making development sizzles. To that end, it was great to see OpenAI’s text-to-video tool Sora finally come into the public’s hands this month, albeit not in the UK or Europe. Recently Google’s Veo 2 is getting hyped as being much more realistic, but it’s still in beta and you have to live in the US to get on the waiting list. Adobe’s Firefly is also waitlist only – so there’s more waiting to be done – yet it could well be that 2025 brings all of these tools into our hands and we get to see what people will really do with them outside of a select few.

On the PC hardware front, marketing teams went into overdrive this year to sell us on new “AI” chips. Intel tried to convince us that we needed an NPU (neural processing unit) to run machine learning operations when there were marginal gains over using the graphics card we already had. And Microsoft tried to push people in the same direction – requiring specific new hardware to qualify for Copilot+. Both companies are trying to catch up with Apple on battery life, which I’m all for, but I wish they could be more straightforward about how they presented it.

I continued to get a lot out of the machine learning based tools, whether it was using a well trained voice model in Eleven Labs or upscaling photos and video with Topaz’s software. I also loved the improvements that Adobe made in their online version of Enhance Speech which rescued some bad audio I had to work with. Some of these tools are starting to mature – they can make my life easier and enable me to present better work to my clients which is all I want at the end of the day.

Jeff Foster

For me, 2024 was met with lots of personal life challenges, which precluded the kind of deep-dive involvement in the world of AI that I managed in the previous two years, but I did catch up on some of the more advanced generative AI video/animation tools to explore and demo for the CreativePro Design + AI Summit in early December. I created the entire 40-minute session with generative AI tools, including my walking intro chat and the faux TED Talk presentation, using tools like HeyGen, ElevenLabs, Midjourney and Adobe After Effects. As usual, I did a complete breakdown of my process as a reveal toward the end of my session and will be sharing this process in a PVC-exclusive article/video in January 2025.

The rate of development of AI animation/video tools such as Runway, Hailuo, Leonardo and others that are developing their text/image to video tools is astounding. I think we’re going to see a major shift in development in this area in 2025.

I’m also exploring music and audio AI generative tools including hardware/software solutions in the coming months and expect to see some amazing growth in quality, production workflows and accessibility to the public who are virtually non-musicians.

As usual, I’m only exploring the tools and seeing how they can be utilized, but also am concerned for the direction all of this is heading and how it affects content creators and producers in the end. I always take the position that these are tools we can utilize in our workflows (or at least have an understanding of how they work) or choose to ignore them and hope they’ll just go away like other fads… which they won’t.

Michelle DeLateur

Christmas morning is a flurry of impatience; usually an explosion of expectation, surprise, and wrapping paper scraps. I used to describe NAB as “camera Christmas”  to newcomers. But with announcements coming by email and NAB turning into primarily events, meetings, and conversations, that giddy elf feeling I used to have to see the new floor models has turned into excitement to see familiar faces.

So where has our impatience shifted? It seems we now find ourselves in a waiting game for presents in 2025.

That new Adobe Gen-AI video model? Hop on the waitlist. Hoping to see more content on the Vision Pro, perhaps with the Blackmagic URSA Cine Immersive? Not yet. Excited about Sigma making RF lenses? They started with APS-C first.

Patience is not one of our virtues. With shorter camera shelf lives and expected upgrade times, we assume we will hold onto our gear for less time than ever and are always ready for a change. Apple’s releases make our months-old laptops seem slow. A new AI tool comes out nearly every day.

Video editors are scrambling for the few job openings, adding to their skill sets to be ready for positions, or transitioning to short term jobs outside of video alongside the anxiety of AI threatening to take over this realm. We rejoiced when transcription was replaced by a robot. We winced when an AI program showed it could make viral-ready cuts.

Just because we are forced to wait does not mean we are forced to be behind. It is cheaper than ever to start a photography journey. Mastering the current tools can make you a faster editor. Teaching yourself and others can help create new stories. While I personally don’t fully believe in the iPhone’s filmmaking abilities, there ARE plenty of tools to turn the thing-that’s-always-on-you into a filmmaking device.

In 2024, we were forced to wait. But we are not good at waiting. That’s the same tenacity and ambition that makes us good at storytelling. It’s only a matter of time. It’s all a matter of time. So go forth and use your time in 2025.

Chris Zwar

For me, 2024 was a case of ‘the more things change, the more they stay the same’. I had a busy and productive year working for a range of clients, some of whom I hadn’t worked with for many years. It’s nice to reconnect with former teams and I found it interesting that briefs, pitches and deliveries hadn’t changed a great deal with time.

The biggest change for me was finally investing in a new home workstation. Since Covid, I have been working from home 100%, but I was using an older computer that was never intended for daily projects. Going through the process of choosing components, ordering them and then assembling a dedicated work machine was very rewarding, and something I should have done sooner. Now that my home office has several machines connected to a NAS with 10-gig ethernet, I have more capacity at home than some of the studios I freelance for – something I would have found inconceivable only a few years ago!

Technically, it seems like the biggest impact that AI has had so far has been providing a topic for people to write about. Although AI tools continue to improve, and I use AI-based tools like Topaz & Rotobrush regularly, I’m not aware of AI having had any impact on the creative side of the projects I’ve worked on.

From my perspective as an After Effects specialist, the spread of HDR & ACES has helped After Effects become increasingly accepted as a tool for high-end VFX. The vast majority of feature films and premium TV shows are composited with Nuke pipelines, but with ACES having been built-in to AE for over a year, I’m now doing regular cleanup, keying and compositing on projects in After Effects that wouldn’t have been available to me before.

Tool Tip Tuesday for Adobe Premiere Pro: What we’re most looking forward to in the next year in Premiere
https://www.provideocoalition.com/tool-tip-tuesday-for-adobe-premiere-pro-what-were-most-looking-forward-to-in-the-next-year-in-premiere/
Tue, 31 Dec 2024

 


Welcome to Tooltip Tuesday for Adobe Premiere Pro on ProVideoCoalition.

Last week, Scott and Jeff showed what features they were most thankful for in the 2024 updates.

In this last post on New Year’s Eve, we are doubling up again on what we’re most excited about and looking forward to in the next year.

Much of the Adobe future is plotted out in their public betas, available through the Creative Cloud application. You can install these concurrently with your existing version of Adobe apps. Both of us are updating the betas of Premiere Pro (and Media Encoder, Audition, Photoshop, and After Effects) on a daily basis.


You can find a list of the What’s New in Beta by clicking on the Laboratory Beaker icon in the top right of Premiere Pro.

What's new in Beta

We don’t feel these versions are safe for production environments but rather allow you to play with the new tools early.

What is Jeff looking forward to?

This was a very hard choice for me.

I’m a hundred percent going to cheat and tell you that my list of what I’m excited about includes key functional improvements (multi-threaded audio conforming, OpenTimelineIO (OTIO) import and export, and the new color management) along with the potential of AI features (the new Search panel with Visual Search, Generative Clip Extension and Smart Masking).

But I have to pick one – and my criterion is to pick the one that I think will make the greatest difference in everyday editing. (This is my rule – Scott may pick a feature for different reasons!)

If Adobe can nail it, it’s going to be Generative Extend.

There are tricks we’ve learned when the footage is just too short. Generative Extend can add up to 2 seconds at the front and end of a clip. Any clip.

It could be a moment before an actor speaks. It could be extending that pan just a little longer. It could be taking that small snippet of room tone and extending it. Generative Extend will handle audio with some key restrictions around music and speech.

This is the one feature I see myself using in every edit to get the timing of moments right to give me that extra beat that the source footage doesn’t have. Whether it’s for narrative or documentary work, I can’t wait to make this part of my everyday editorial.

Left image: original end frame, right image: extended "perfect" end frame.

Nevertheless, every single one of the features I mentioned above is crazy useful. But this is the one that keeps making me want to go back to old edits to make adjustments.

The caveat will be how clients/audiences will feel knowing that the material will have some AI-generated content and the authenticity of editorial intention. But that’s a topic for a different discussion.

What is Scott looking forward to?

While I have to agree with Jeff that Generative Extend is a pretty amazing offering coming in the next year (Did you read about my Burning Questions about Generative Extend?), the particular item that I’m looking forward to is rather small in execution but large in functionality. I’m really looking forward to dynamic waveforms making their appearance in the shipping version of Premiere. Dynamic waveforms are now available in the Premiere beta, and they give you real-time feedback on the waveform scaling of your audio clips in the Premiere timeline.

This makes the most sense when you see them in action.

[Animated gif: dynamic waveforms scaling as audio keyframe levels change]

As you see in the gif above, as you adjust audio keyframe levels, the waveforms will scale up or down to reflect the volume change that you’ve made. And this doesn’t work with just “rubber band” keyframing. Dynamic waveforms are reflected in most places that you can change the volume in Premiere, such as the Essential Sound panel, Effect Controls, the new Properties panel, or when applying clip-level audio gain. It’s simple, useful, very responsive and will be a much-welcome addition to Adobe Premiere Pro when it ships in the full release in 2025.

This series is courtesy of Adobe. 

HDMI 2.2 will be announced at CES 2025
https://www.provideocoalition.com/hdmi-2-2-will-be-announced-at-ces-2025/
Sat, 28 Dec 2024

The announcement of a new specification for HDMI, offering higher bandwidth, will be made at CES 2025, and some believe the new NVIDIA RTX 50 series will include it in the specifications.

Information is scarce, but according to the HDMI Forum, the new HDMI specification – which will deliver higher bandwidths and resolutions – will be revealed during a press conference at CES 2025 on January 6, the same day that NVIDIA founder and CEO Jensen Huang delivers his keynote, at 6:30 p.m.

CES 2025, which will take place in Las Vegas from January 7-10, 2025, is the stage where NVIDIA will announce its new RTX series, Blackwell. There is some speculation that the new RTX 50 series (starting with the RTX 5080 and RTX 5090) will already include HDMI 2.2 (if that’s the final name for the new HDMI version). At CES 2025, AMD is also introducing its new graphics card, which reports indicate will be named Radeon RX 8000, a GPU that may also use the new HDMI specification.

The announcement from the HDMI Forum indicates that the new version will support higher resolutions, refresh rates and enhanced transmission quality… and a new cable. The current HDMI 2.1 transmits data at up to 48Gbps, allowing for refresh rates of up to 120Hz at 4K resolution. It is expected that HDMI 2.2 will compete with DisplayPort 2.1, which can reach 80Gbps.
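For a rough sense of why extra bandwidth matters, the back-of-the-envelope sketch below (my own illustration, not from the HDMI Forum) computes raw pixel data rates for a few formats. It ignores blanking intervals and link-encoding overhead, which push the real link requirement higher still.

```python
# Rough sketch: raw pixel data rate in Gbps for 10-bit 4:4:4 video,
# ignoring blanking intervals and link-encoding overhead.

def raw_gbps(width, height, fps, bits_per_channel=10, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

for label, w, h, fps in [("4K 120Hz", 3840, 2160, 120),
                         ("8K 60Hz",  7680, 4320, 60),
                         ("8K 120Hz", 7680, 4320, 120)]:
    print(f"{label}: ~{raw_gbps(w, h, fps):.1f} Gbps raw")
```

That works out to roughly 30 Gbps for 4K at 120Hz, 60 Gbps for 8K at 60Hz and 119 Gbps for 8K at 120Hz, which is why anything much beyond 4K120 quickly outgrows a 48Gbps link without compression.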

Time to stop hating Avid Titler+!
https://www.provideocoalition.com/time-to-stop-hating-avid-titler-plus/
Sat, 28 Dec 2024

So, are we? You know, done hating on Avid’s Titler+? A quick history lesson on how we got here: Apple updated their system architecture, making the standard Title Tool (and Marquee, for that matter) useless for all Mac users. Windows users, however, could still hold onto their beloved Title Tools for longer, but I’ll ask the question: if you were serious about making titles for your content, were you really using the legacy Title Tools? Probably not. You were probably using Continuum’s Title Studio (if you wanted to stay inside of Media Composer), or you were simply using After Effects, or having a third party create all of your titles. Let’s be honest right out of the gate. Avid really messed this one up. They released a tool that was, well, terrible. But we had a problem. Not only were the Title Tool and Marquee Title Tool end of life, but Mac users now had, really, no Title Tool to use, and Avid needed to fix that as quickly as possible. Well, they’ve been talking about it, and they have been teasing it, and now finally, with the 2024.10 update to Media Composer, the long-awaited and much-talked-about “new” Titler+ has been released. So in this article I thought I’d jump in and show you why you need to learn to stop worrying (and hating) and enjoy Titler+.

The first thing that’s important to keep in mind when getting yourself rolling with Titler+ is that it’s now more like titles in other applications. Titles in Premiere and Resolve are what you would call “generators”, meaning that no actual physical media is being created when adding titles to your timelines. This is a different concept for Media Composer editors, who have been used to their titles always being actual media. What this means for you is that if you remove a title from your timeline, it’s gone forever. You can’t match frame it back, or go back to your bin to find it, so that’s something exceptionally important to keep in mind. The second thing to keep in mind is that, since titles are no longer clips to drop in your timeline, I highly recommend NOT adding titles on top of media/clips that currently live in your timeline. If you want to add a title, you’re going to select the region you want the title to appear in, add a couple of edits, and drop it in that way. Now, with all of that said, there’s something else important to keep in mind when it comes to Titler+, and that is that since your title is not an actual piece of media anymore, you won’t be able to add effects to it from Continuum or Sapphire. You get the title, and that’s it. This, for me, is the real downside of Titler+, but again, if you’re serious about your titling, you’re doing it all in Title Studio or After Effects anyway, so in the grand scheme of things, it’s more of an annoyance than anything.

Your “enjoyment” of Titler+ will vary based on which version of Media Composer you’re using. With 2024.10 of the application, there has been a major enhancement and rewriting of the effect, as it was very cumbersome to use in previous versions. Now, I’m not going to list all the updates in one shot. Let’s start using the effect, and we’ll talk about updates as we go.

The easiest way to search for the effect is to simply head to the Effect Palette and, in the search window, just hit “+”. You’ll notice two effects appear: the pre-2024.10 version of the effect, as well as the updated version.

Old and New Titler+

Apparently you can’t update any older instances of Titler+ (but apparently no one was using it anyways), so the legacy version is only there for projects that happen to use it. Anyways, don’t choose the “Legacy” version; only use the current one. Once you drag the effect and drop it into the space you’ve made in your timeline, the Titler+ dashboard, as well as the Effect Editor, will open up and you’ll be ready to use Titler+.

Titler+ Dashboard

DASHBOARD VS EFFECT EDITOR

If you haven’t used Titler+ before, the dashboard that pops up when you apply the effect will be something a little different for most Media Composer editors. We’re accustomed to only having these types of windows appear when we tell Media Composer that we want them there: Audio Tool, Audio Mixer, etc. When it comes to your workflow inside of Titler+, most of us will probably want to stick with the Dashboard only (which we’ll talk more about a little later), but I want to point out that you can easily toggle the Dashboard on and off through the Effect Editor by hitting the grid icon in the upper right corner. I’ll say that you can do 90% of your work in either the Dashboard or the Effect Editor, but at some point you will need to use both if you plan on animating your titles.

Alright, let’s get back into it! Once the windows open, nothing is really happening yet. You’re going to need a title to get yourself rolling, so let’s create one. With the Text Tool selected, simply click anywhere in the Sequence window to add your title. Keep in mind that you’ll probably want to make sure that you’re viewing the topmost layer in your timeline, so you see the text appear right away.

Simple Titler+ Title

What is important to keep in mind in Titler+, as opposed to how you titled before, is that you can add as many titles to the frame as you need, and the Effect Editor will only show you the parameters of the title you have selected, not all of the titles at the same time. Your titles, even though there may be many of them, can always be selected and animated (if you want) as individual elements, and not as one all-encompassing title, as it has been in the past. Now, there is a way to manipulate all the titles at the same time to animate them on and off, but we’ll discuss that later in the article when we talk about the “Foreground” parameter of our title.

NEW ANCHOR POINT TOOL

If you know me, or my tutorials, you know we’re going with Impact, as it’s a good, thick font to start out with. I want to mention, before we move any further forward, that there really are two thought processes when working with Titler+. You can work in the interface entirely, until you’re ready to animate (if you’re going to animate at all), or you can do a combination of working in the interface and in the Effect Editor, which is what we’ll be doing in this article. Once you have your first set of text input into the interface, you can switch over to the Selection Tool (M) to reposition it. With that said, here’s where we can start to find some simple, yet effective, differences between the old Title Tool (TT) and Titler+. You’ll notice, first of all, that there is a little crosshair sitting at the lower left corner of your text. This is the Anchor Point, and it’s quickly adjustable in one of two ways. If you want precision, you can move to the Effect Editor, to SELECTED OBJECTS>TRANSFORM>ANCHOR POINT, and adjust its position with a selection in the dropdown menu.

Titler+ Dynamic Anchor Point

Or, if you click on the actual Anchor Point located in the lower left corner of the text, you’ll notice the Selection Tool turn red, and now you’re adjusting the Anchor Point dynamically, and you can now position it anywhere you want it to go.

Titler+ More Dynamic Anchor Point

Now, with that said, there are a couple of issues with the way the Anchor Point behaves. First, when the text is first input in the interface, the Anchor Point is locked to the lower left corner of the text, not the bounding box, which makes perfect sense.

Titler+ Anchor Point Issue 1

However, if you head to the Effect Editor and set the Anchor Point to “Bottom Left”, it places it at the corner of the bounding box, not at the corner of the text that was entered, which is inconsistent to the point where I find myself never using this way of setting it up.  The Anchor Point should always be relative to the text it’s affecting, not the bounding box the text sits in.

Titler+ Anchor Point Issue 2

Speaking of the bounding box, I found it odd that there is so much space at the top and bottom of it, as opposed to the left and right sides.  I’m of the feeling that it really should have been set up like After Effects’ bounding box, which is tight to the text, for more precise functionality.

Titler+ Anchor Point Issue 3

The last thing I think is important to talk about when it comes to the Anchor Point is that when it’s adjusted via the Effect Editor, your text position will adjust with it.   By default, the Anchor Point is in the lower left corner, so the last thing you want to do is get everything set up and ready to go, only to realize that you now want to adjust your Anchor Point, as it will throw everything off.  It would be super helpful to have a User Setting that let you pick the default Anchor Point behavior, but hey, this little engine has come pretty far from its humble beginnings, so I’ll keep my fingers crossed for an update that adds this feature.  As I said before, you can always adjust the Anchor Point dynamically, which won’t impact the text’s position.

MUCH BETTER TEXT SIZING AND SCALING

With the new, dynamic Anchor Point comes another mind-bending feature for Media Composer editors, and that’s scaling.  I know that sounds odd, but here’s what I mean: with the legacy TT, once you had set the size of your text, scaling it up through the title’s 3D Warp capabilities made it look worse the larger you went, because the text was not vector based.  Well, those days are now gone.  Titler+ lets you not only set your text size pretty large (100 Avid point size – whatever that means), but then adjust the scaling of the text up to a factor of 1000, keeping the text nice and crisp along the way.

Titler+ Scaling
Look how crisp that text is at 100pt size and 1000% scale!
CONSISTENT COORDINATE SYSTEM

One thing that you’ll notice across applications like Media Composer, and even DaVinci Resolve is the way the Coordinate system is set up.  Unlike in After Effects, where the coordinates start in the upper left corner at 0,0 and the dead center of an HD 1080p frame is 960×540, Media Composer, across all its effects, has always treated the dead center of the frame as 0,0, and with the update to Titler+, this has now become the same for this effect, to match all the other standard effects, inside of the application.

WORKING WITH THE FOREGROUND

I talked about this concept earlier in the article, and I want to swing back and talk about the Foreground now.  I mentioned that each title in Titler+ is its own element that can be selected and animated individually.  But what do you do when you’d like all the titles to fade on, or wipe on, at the same time?  Trying to match up animations across all the different titles would be a bit of a logistical nightmare.  That’s where the Foreground parameter comes into play.  It’s an interesting concept in Titler+ that I really like, and one I’m hoping gets a slight update in the future.  Here’s how it works: all of the objects that sit in your frame populate the “Foreground” of the title, no matter what the stacking order is.  Let’s say we want all of the elements to fade on at the same time.  With a title selected, or even with nothing selected at all, you can navigate to the “Foreground” section of the Effect Editor, twirl it down, and you’ll find three parameters: Fill Overlay, which controls the opacity of the “fill” portion of your title; Opacity, which controls the opacity of your title(s); and a Crop parameter to have titles wipe on from the top, bottom, left or right.  I LOVE this feature, with one major omission: there is no feather for the Crop parameter.  Foreground is a great, simple feature that works whether or not a title is selected, as it affects the frame the titles sit in more than the titles themselves.  Fingers crossed that crop feather shows up in a future update.

A FEW OTHER FEATURES

I want to make sure I mention a few other features included in this update to Titler+.  There’s better font support, meaning you’ll have a better overall experience with your TrueType, OpenType and FreeType fonts in Titler+, as well as better cross-platform compatibility.  To be honest, I’ve always found the legacy Title Tool pretty solid on its font support, so I would expect the same here in Titler+.  Tracking is included in Titler+, but I’ll be 100% transparent: I hate the tracker in Media Composer and would never use it, so if you like using it, you have access to it in Titler+.  Last, but certainly not least, there is a much simpler workflow for Rolls and Crawls in Titler+.  You now have access to a linear time graph in the Effect Editor, and you can also scrub through your timeline in Edit mode to make sure that all of your crawl/roll layouts look exactly the way you need them to look.  Again, 100% transparency: I don’t do rolls and/or crawls in Media Composer, as there is no support for including logos, which is a deal breaker for me, so I do all my credits in After Effects, which gives me everything I need and much, much more. 

LET’S BUILD ONE!

Alright, so for kicks, let’s build a quick, good looking title inside of Titler+.

  1. Let’s select a seven second range of our timeline for our title to appear in.  One second fade up and out, five seconds for the content to be on screen.
  2. Head to the Effect Palette, punch in “+” in the search window, and drag and drop Titler+ to the seven second hole that’s in our timeline.
  3. Once the effect has been applied and the Titler+ Dashboard has appeared, select the Text Tool if it isn’t already selected, double click in your timeline, and let’s add our first title.  Our boxer needs a name, so we’ll enter the name “IVAN GRAGO”.  What’s always important to keep in mind is that Titler+ will default to creating a title that matches the look of the previous title you created.  If it was simple white with a black drop shadow, that’s what you get.  If it looked like the thumbnail of this article, that’s what it’s going to look like.

    Titler+ Ivan Grago 1
  4. Now, “IVAN GRAGO” is a little bit big.  We can use the Size parameter in the Dashboard to adjust it, or we can do something I haven’t mentioned yet and adjust it dynamically, by holding SHIFT and grabbing one of the corner bounding box points and dragging the bounding box smaller.  This keeps the aspect ratio of the title the same and just adjusts its overall scale.  If you wanted to adjust the X/Y scale independently of each other, you could hold OPT/ALT and drag to achieve that look.  
  5. Once I have the size correct, I’m going to move “IVAN GRAGO” towards the lower left corner of the frame.  Then, much like in the legacy Title Tool, I’m going to use the rectangle tool to draw a rectangle around “IVAN GRAGO”, add a blue outline to it, and then turn off the fill, so I only have a blue rectangular border around it. 

    Titler+ Ivan Grago 2
    What’s important to keep in mind is that stacking order is still important, much like in the legacy Title Tool. I now have a transparent rectangle on top of my text, so I can just quickly hit the “send to back” button on the dashboard to move the rectangle behind my text, so if I need to make an adjustment, I’m good to go.
  6. Now, a quick selection of both title elements and a Copy/Paste will give us a duplicate of what we have.  I’m going to update the text to say “Professional Boxer”, use the technique we talked about in step 4 to adjust the size of the text dynamically, give the outline a quick adjustment, and we can now move that title below “IVAN GRAGO” to give us a very cool title look.

    Titler+ Ivan Grago 3
  7. Now, with that said, we have a bit of a problem: the entire title is too big.  Well, again, much like in the legacy Title Tool, we can quickly group elements together and make them smaller, so they fit exactly where we need them to fit.  One thing that’s important to keep in mind is that the engineers at Avid have kept all the icons for things like Stacking, Grouping, Select Next Object, etc. the same, so that your transition to Titler+ is as smooth as possible.

    Titler+ Ivan Grago 4
  8. Now that we have the Title placed where we want it to go, we can simply ungroup, head up to the Foreground parameter, and animate the titles fading in and out for 30 frames, and this title will be ready to go!

    Titler+ Ivan Grago 5

I’d say that was quick and simple, and it took me WAY longer to write that out, then it did for me to actually create it.

Let’s be honest.  Avid completely fumbled the ball on Titler+ when it was first released, and have had to play catchup getting it back up to where it needs to be.  Have they done it?  I’d say they have.  My biggest gripe is the fact that now I can’t use any third party effects on my titles, but that’s a very minor gripe that just means that for those specific titles, I’ll have to create them in After Effects, otherwise, good-bye legacy Title Tool and Marquee Title Tool.  It was fun while it lasted, but I’ll stick with Titler+ moving forward.

Tool Tip Tuesday for Adobe Premiere Pro: What are we thankful for from updates this past year https://www.provideocoalition.com/tool-tip-tuesday-for-adobe-premiere-pro-what-are-we-thankful-for-from-updates-this-past-year/ Tue, 24 Dec 2024 20:00:21 +0000


Welcome to Tool Tip Tuesday for Adobe Premiere Pro on ProVideo Coalition.

To wrap up 2024 and Tool Tip Tuesday for Adobe Premiere Pro on ProVideo Coalition, Jeff and Scott thought they would double up over the last two weeks of the year with what they are thankful for from the past year of Adobe Premiere Pro releases and what they are most looking forward to next year.

What is Scott thankful for?

One of my favorite things that came along in the August 2024 update to Premiere is Linear Timecode support when creating Multicam Source Sequences. Many of us call Linear Timecode “audio timecode”, and it is that awful sound you hear when a timecode generator pumps timecode into the audio channel of a camera that doesn’t support a traditional timecode-in signal.

But while the audible sound is horrible, it’s actually a glorious sound, as it means you have timecode going to a camera. And if you hear it in one camera file, that usually means it’s going to other cameras too, for easy multicam sync.

From Adobe’s new features support documents:

Linear Timecode support
With native Linear Timecode (LTC) support, you can quickly and accurately sync multiple video and audio sources from devices that encode timecode in the audio signal. You can use LTC to synchronize clips or create multi-camera source sequences.
LTC has become more popular and useful since many newer, more affordable devices don’t support recording timecode as metadata in the video file. A common solution is to store timecode data in an audio track. While this track is audible to the human ear, it’s a high-pitched sound that contains data instead of a “normal” sound. Many lower-cost timecode syncing devices can generate LTC.

How do you use Linear Timecode? It’s simple: If you have clips with Linear audible timecode, when you select those clips and choose “Create Multi-Camera Source Sequence,” there’s a new option for Linear Time Code.

Tool Tip Tuesday for Adobe Premiere Pro: What are we thankful for from updates this past year 42

Don’t confuse Linear Timecode with Sound Timecode – the sound timecode option has been there for awhile and has to do with high-end dual system sound workflows. Linear Timecode is a relatively inexpensive way to timecode sync cameras using external audio generators. Use them on your next multicam production, and your editor will thank you.

What is Jeff thankful for?

Skip Import Mode

There’s nothing better than a tiny fix that gets the unnecessary bits of the user interface out of the way.

This, though, hurt when it came out.

Tool Tip Tuesday for Adobe Premiere Pro: What are we thankful for from updates this past year 43 

I get why Adobe added this giant new import box. I hated it, but could work past it. After all, you shouldn’t be seeing this box anyway. Professionals should be working from a prebuilt template project with all the assets/named timelines already part of the project. 

Every now and again, I need to do a small new project – maybe it’s a scratch project, maybe it’s a project to just hold temporary elements. And the import box kills me.

In the October 2024 release, Adobe added this tickbox called Skip import mode.  

Tool Tip Tuesday for Adobe Premiere Pro: What are we thankful for from updates this past year 44

It does exactly what I want: it gets the Import out of the way. I can name a project, and get to work. Great for when I need one of these scratch projects. 

It’s like Adobe heard the experienced users and gave us a way to work around something built for new software adoptees. Thanks, Adobe.

This series is courtesy of Adobe. 

Color Management Part 25: Corporate Brand Colors and ACES https://www.provideocoalition.com/color-management-part-25-corporate-brand-colors-and-aces/ Sun, 22 Dec 2024 01:38:18 +0000

For certain sectors of the industry, working with corporate branding colors is a critical aspect of the job. In Part 24, I looked at how we can deal with corporate branding guidelines when working with regular Standard Dynamic Range projects. The key takeaway is that corporate colors are always defined using the sRGB colorspace, and we need to convert those RGB values accordingly if we are outputting to Rec.709.

Because Prores Quicktimes are always assumed to be Rec.709, ignoring color management can result in color and brightness shifts, often incorrectly blamed on a “Quicktime gamma bug”.
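
To make that Part 24 takeaway concrete – converting an sRGB brand value for a Rec.709 deliverable – here’s a minimal Python sketch of the idea: decode the 8-bit sRGB value to linear light with the sRGB curve, then re-encode it with the Rec.709 camera curve. The two colorspaces share primaries and white point, so only the transfer functions differ. The brand color used here is hypothetical, and this is only an illustration of the math; in practice you’d let a properly color managed pipeline do this for you:

def srgb_decode(v):
    # Normalized sRGB code value -> linear light (inverse sRGB transfer function)
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def rec709_encode(l):
    # Linear light -> Rec.709 code value (BT.709 OETF)
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

def srgb8_to_rec709_8(r, g, b):
    # Convert an 8-bit sRGB brand color to 8-bit Rec.709 code values
    return tuple(round(rec709_encode(srgb_decode(c / 255)) * 255) for c in (r, g, b))

# A hypothetical brand orange defined in a style guide as sRGB (230, 100, 20):
print(srgb8_to_rec709_8(230, 100, 20))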

In this video, a direct follow-on from Part 24, I look at how we deal with sRGB branding colors when we are working with High Dynamic Range and ACES.

I’ve spent a fair bit of time pondering whether or not this is a niche topic. But having spent the previous week dealing with exactly the issues outlined in this video, it’s certainly going to become increasingly relevant.

This video is tying together several topics that have been covered previously.  Firstly, ACES was introduced back in Part 12, and it provides an industry standard for working with High Dynamic Range video. In Part 19 we introduced High Dynamic Range video, and then in Part 20 we looked at a number of scenarios that all demonstrate how and why working with HDR simply looks better. In Parts 22 & 23 we looked at how HDR video is tone mapped for Standard Dynamic Range outputs. Most recently, Part 24 emphasised that corporate brand colors are always defined using sRGB values, and we need to use color management so they remain accurate when they are output to Rec.709 Prores.

So what’s the problem with ACES?

The answer is partly to do with the state of the corporate industry in 2024. After Effects is a hugely popular application, but it’s dominant in the areas of the industry where High Dynamic Range and ACES are yet to make an impact. Every motion designer I know is still working with Standard Dynamic Range projects – that’s 8 or 16 bit, using sRGB or Rec.709. There are many areas where After Effects has a dominant market share – advertising, live events, product launches, conferences, large scale projections, broadcast promotions, POS displays and that’s before we even start to consider social media platforms and the massive amount of content uploaded every day. All of these areas are still using regular SDR video – often Prores Quicktimes for “masters” and MP4s for streaming.

The types of productions that are most likely to use After Effects are also the types of productions where corporate brand guidelines are critically important.

ACES was designed for large scale, high-end productions including feature films and premium TV shows.  I don’t think they had motion design at the top of the list when it was being developed, and I don’t think the average VFX artist needs to worry about RGB brand colors when they’re compositing Hollywood blockbusters.

But the technology for HDR production is here, and compositing using HDR & ACES just looks better. So After Effects users are going to see an increase in HDR and ACES productions – maybe not overnight, but certainly over the next few years. On a personal level, the majority of advertising projects I’ve worked on over the past 4 or possibly 5 years have been completed using ACES. It’s just a matter of time.

So tying all these previous threads together: it’s easy to make the decision to work in an ACES project, because it simply looks better. And we can composite in High Dynamic Range while rendering and delivering a Standard Dynamic Range Prores master. But this process will result in changes to the brightness of the final result, and this includes any branding colors used. Obviously branding colors need to be accurate in the final, delivered masters – and so here we are.
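
One common way to handle the working-space side of this (and I’m not claiming it’s the exact recipe used in the video) is to convert the brand color explicitly: decode the sRGB value to linear light, then transform from sRGB/Rec.709 primaries into ACEScg (AP1). Here’s a rough Python sketch; the matrix values are rounded from the commonly published Bradford-adapted sRGB-linear-to-ACEScg matrix, so treat them as illustrative and verify against your OCIO config. The brightness shift introduced by the output transform on the way back to SDR is a separate issue, and that’s what the video digs into:

def srgb_decode(v):
    # Normalized sRGB code value -> linear light
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# Rounded, Bradford-adapted matrix: linear sRGB (D65) -> ACEScg / AP1 (D60)
SRGB_LIN_TO_ACESCG = [
    [0.6132, 0.3395, 0.0474],
    [0.0702, 0.9163, 0.0135],
    [0.0206, 0.1096, 0.8696],
]

def srgb8_to_acescg(r, g, b):
    # 8-bit sRGB brand color -> linear ACEScg working-space values
    lin = [srgb_decode(c / 255) for c in (r, g, b)]
    return tuple(sum(m * c for m, c in zip(row, lin)) for row in SRGB_LIN_TO_ACESCG)

# The same hypothetical brand orange, sRGB (230, 100, 20):
print(srgb8_to_acescg(230, 100, 20))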

This is part 25 in a long series on color management.  If you’ve missed the other parts, you can catch up here:

Part 1: The honeymoon is over

Part 2: Newton’s Prisms

Part 3: It’s all in the brain

Part 4: Maxwell’s spinning discs

Part 5: Introducing CIE 1931

Part 6: Understanding the CIE 1931 chromaticity diagram

Part 7: Introducing gamma

Part 8: Introducing Colorspaces

Part 9: The theory of a color managed workflow

Part 10: Using After Effects built-in color management

Part 11: Introducing OpenColor IO

Part 12: Introducing ACES

Part 13: OpenColorIO and After Effects

Part 14: Combining OCIO with After Effects

Part 15: Logarithmic file formats

Part 16: RAW video files

Unscripted: Looking at ACES and OCIO in After Effects 2023

Part 17: Linear Compositing

Part 18: Bit Depth

Part 19: Introducing High Dynamic Range

Part 20: High Dynamic Range Compositing just looks better!

Part 21: HDR Formats, Colorspaces and TLAs

Part 22: Introducing Tone Mapping

Part 23: HDR Tone Mapping

Part 24: Corporate Branding Colors with Standard Dynamic Range

AND – I’ve been writing After Effects articles and tutorials for over 20 years. Please check out some of my other ProVideo Coalition articles.

 

The Angry Editor: Template Search is the new Music Search time suck https://www.provideocoalition.com/the-angry-editor-template-search-is-the-new-music-search-time-suck/ Fri, 20 Dec 2024 15:00:11 +0000

Welcome to The Angry Editor. 

For years I’ve teased the idea of The Angry Editor as a regular column here on The Editblog as a way to complain, gripe, nitpick, bellyache as well as air grievances out loud and in public when 280 characters on Twitter doesn’t work. 

Today’s topic:

The massive time suck when it comes to searching stock sites for a good graphics template.

When editors are talking shop, we often talk about how much of a time suck it is to search for good stock music. The stock music sites are plentiful, with some good music in them, and a few of them even have some really good search tools. But it can still be a dark abyss as you audition tracks and try to narrow a few good ones down to what you want to use.

But I’ll add a new time suck that is even worse than looking for good music. It’s scouring the stock sites for a good graphics template. That template might be as simple as a single line lower third all the way up to a multi-second opening title package. It’s a pretty amazing time we live in, (especially for those like myself who aren’t overly talented in motion graphic design) that there are template sites and stock sites that let us pay a reasonable rate for graphics tailored to our specific editing platform of choice. But as many of these sites have been around for years, and new content streams in on a regular basis, it’s getting to where many of them are a mess.

The Angry Editor: Template Search is the new Music Search time suck 45
Hey look Envato, there are only 16,000 lower thirds for me to sort through to try to find one I might like.

I know what you’re saying, “Hey angry editor, be less of a dumb editor and use some of the filtering options. That’ll help you narrow your selection down.” And that’s true. By choosing a few filters, I did get it down to just over 3,000.

The Angry Editor: Template Search is the new Music Search time suck 46
3000 is better than 16000. I’ll give you that.

The problem now is that so many of the options returned have multiple graphics within a single result. So if even half of these 3,000 results contain 9 different lower third options apiece, I’m back to my 16,000 number. And I guess a “TITLES PACK” might have what I’m looking for, because technically a lower third is a title, but the preview thumbnail is so bad I won’t know that until I click into that “Corporate Titles Pack.”

But wouldn’t this be a great place for AI to help solve this voluminous and monotonous search nightmare?

It just so happens Envato Elements has an “AI Labs” tab with an AI search! That should make it much quicker to find a simple two-line lower third that has an elegant animation on and off the screen.

Let’s go!

The Angry Editor: Template Search is the new Music Search time suck 47
Isn’t AI supposed to help with the boring busywork?

Nice job Envato AI labs.

Perhaps I’ll have better luck with Motion Array.

The Angry Editor: Template Search is the new Music Search time suck 48
Hey that’s less than 13,000!

Well that’s a whole lot of results also returned. I actually expect a lot of results with such a simple term as “lower thirds,” as there’s a whole lot that can encompass. Let’s get a bit more specific because I know exactly what I want, a simple two-line lower third that has an elegant animation on and off the screen.

The Angry Editor: Template Search is the new Music Search time suck 49
Hey that’s 0!

I think you get the idea here.

Honestly, this grievance is less about the number of search results returned when looking for video templates and more about the fact that these websites have pretty bad granular search and filtering options. While there are plenty of search filters, it still seems difficult to narrow down findings. I know they are trying; they just need to be better.

In the meantime, perhaps we’re just resigned to a world of having to spend hours upon hours searching and digging.

Now don’t get me started on the quality of a lot of what these stock sites are selling these days, or the instructions on how to use some of this stuff.

If you have a stock template site that you just love, or want to defend Envato or Motion Array (I’m picking on them because those are ones I’ve used extensively) then please let us know in the comments below.

I asked around on social media how others might feel about this topic.

Yup absolutely. I almost wonder if it’s a marketing thing. You look for something, get a ton of results and think “oh there’s heaps of stuff here!”

I mean it’s fine, I found some good stuff, but it seemed harder than maybe it needed to be.

— Dylan Reeve (@dylanreeve.com) December 18, 2024 at 1:50 PM

Music search has always been a horrible time suck. Clients never appreciate it. It’s always wonderful when I have someone else who will do it for me.

— Robb, with 2 b’s (@nledit.bsky.social) December 18, 2024 at 1:58 PM

On that subject – why do stock footage companies never have any mechanism to re-conform the watermarked preview clip with the paid-for full quality version (like actual matching timecode or something)!?

— dickij10.bsky.social (@dickij10.bsky.social) December 18, 2024 at 2:09 PM

It might be worthwhile to build a good relationship with a Motion Designer. The need is often somewhat repetitive (titles, l3rds) and shouldn’t incur a massive cost. It used to be my job, and I still make all the graphics I need. I was always happy to do quick gigs while working on longer projects.

— Jukka Tallinen (@jukkatallinen.com) December 19, 2024 at 3:09 AM

Tool Tip Tuesday for Adobe Premiere Pro: The Good Column https://www.provideocoalition.com/tool-tip-tuesday-for-adobe-premiere-pro-the-good-column/ Tue, 17 Dec 2024 13:00:51 +0000

Welcome to Tool Tip Tuesday for Adobe Premiere Pro on ProVideo Coalition.

Every week, we will share a new tooltip to save time when working in Adobe Premiere Pro.

Tool Tip Tuesday for Adobe Premiere Pro: The Good Column 56

Have you ever wanted a way to do something as simple as mark a single clip as good? You can do just that by using the ‘Good’ metadata column – it’s a checkbox that you simply click to check and label a clip as good.

First, set the bin to list view, as you need list view to display the Good metadata column. Then go to the panel menu in the tab of the panel your media is in.

Tool Tip Tuesday for Adobe Premiere Pro: The Good Column 57

From this menu, open the Metadata Display dialog box. Once it’s open, type the word “good” into the search bar at the top. There are two Good metadata column options, and the one you want to check is the one at the top, under Premiere Pro Project Metadata.

Tool Tip Tuesday for Adobe Premiere Pro: The Good Column 58

When that metadata column is turned on, you will have a new Good metadata column that is just a checkbox. By clicking on that Good checkbox, you can mark a clip as “good.”

Tool Tip Tuesday for Adobe Premiere Pro: The Good Column 59

Double-click on a clip to load it in the source monitor. Scan around to see if it’s one you like, and then check the box to mark it as “good.”

To make this even more useful, create a new Search Bin that will gather all the clips marked with this “good” metadata.

Go under File > New > Search Bin (or click the little Search Bin icon next to the search box at the top of any bin) to bring up the “Create Search Bin” dialog box. Then go under the top pop-up menu and choose “Good” from the list of metadata options. In order to create a new search bin that collects just the clips you’ve marked as good, type ‘true’ in the Find field.

Tool Tip Tuesday for Adobe Premiere Pro: The Good Column 60

With that Search Bin created, you will have a new bin in your project that only includes those clips where the ‘Good’ column has been checked on.

Tool Tip Tuesday for Adobe Premiere Pro: The Good Column 61

As you check or uncheck clips as “Good” throughout your edit, they will populate this new Search Bin.

This series is courtesy of Adobe. 
