ProVideo Coalition – A Filmtools Company
https://www.provideocoalition.com

Samsung ALoP: a revolutionary smartphone lens structure
https://www.provideocoalition.com/samsung-alop-a-revolutionary-smartphone-lens-structure/
Fri, 03 Jan 2025 15:59:21 +0000

Samsung’s ALoP introduces a large-aperture lens that promises low-noise portrait images in night shots, along with a lower-profile camera bump and a slimmer smartphone. The upcoming Galaxy S25 Ultra may be the first to use the technology.

In November 2024, Samsung Electronics won a total of 29 ‘CES Innovation Awards 2025’ ahead of the world’s largest and most influential technology event, CES 2025. One Samsung technology honored with a CES 2025 Innovation Award is All Lenses on Prism (ALoP). This revolutionary structure, which enables smaller telephoto camera modules while capturing bright and clear photos, will be showcased at the Las Vegas Convention Center from January 7-10, 2025.

As mobile phone users demand better image capture from their cameras, smartphone makers have added cameras for each zoom ratio (e.g. wide, ultra-wide, telephoto). Over time, the phone camera array has become quite crowded, with an ever-larger camera bump. That may be about to change thanks to All Lenses on Prism (ALoP), developed by Samsung’s Sensor Solutions Team.

The technology, officially announced in November 2024, may debut in the upcoming Samsung Galaxy S25 Ultra, the company’s next flagship model. According to rumors, that phone will have the same lens set as the S24 Ultra except for the ultrawide camera, which will drop the current 12MP unit in favor of a 50MP sensor with a 1/1.57″ optical format and 1.0 µm pixels: the Samsung ISOCELL S5KJN3. There is also a chance that ALoP will first appear on a rumored Galaxy S25 Slim, a new addition to the family, which would be used to show ALoP’s potential to incorporate a set of large telephoto lenses in a smartphone without incurring the penalty of a huge camera bump.

Limitations of the folded telephoto camera

Telephoto cameras have become a point of differentiation for smartphone manufacturers because they offer high magnification, compress backgrounds, reduce distortion thanks to their narrow field of view, and even create a pleasing background blur, ideal for portraits… but as they reach ratios of up to 5x, a taller module and an obtrusive camera bump are unavoidable.

The common solution is a folded telephoto camera structure, which bends the path of light by 90 degrees like a periscope to allow longer focal lengths horizontally without increasing the thickness of the smartphone. The lenses stand vertically with respect to the plane of the smartphone body, so their diameter determines the height of the camera bump. This made higher-magnification telephoto cameras in smartphones possible… but introduced limitations in light gathering.

In fact, the folded telephoto camera structure limits the improvements that can be made in image brightness. A wider lens diameter is required for brighter images, but moving to a larger telephoto lens and a brighter camera with a larger image sensor increases both the module height and length, to the point where the user would find the resulting bulky camera bump objectionable.

To address these issues, the industry has been looking for a solution that can achieve better results than traditional folded zoom structures. That’s ALoP, a technology that employs a clever optical structure in which the lenses sit horizontally upon the prism, remaining in the plane of the smartphone body. With this approach, increasing the effective lens size (the entrance pupil diameter, or EPD) by increasing the lens diameter yields a brighter image without affecting the camera module’s shoulder height. Moreover, it allows a shorter module length by reducing the space needed for lenses in the folded camera module.

ALoP: features and benefits

Here are the key features and benefits ALoP brings to smartphones:

Brightness. The novel optics design of ALoP accommodates an f/2.58 lens aperture at a focal length of 80mm. Differing from conventional folded camera optics, the lens in this case is placed ahead of the prism. In this way, ALoP can use a large aperture lens that promises low-noise portrait images in night shots.
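As a rough aside (my own back-of-the-envelope arithmetic, using only the f/2.58 and 80mm figures quoted above, plus the standard optics relation EPD = focal length ÷ f-number), the implied entrance pupil diameter can be sketched as:

```python
# Back-of-the-envelope entrance pupil diameter (EPD) for the quoted
# ALoP telephoto numbers: f/2.58 at 80 mm. Standard optics relation:
# EPD = focal length / f-number. Note: if the 80 mm figure is a
# 35mm-equivalent focal length, the physical pupil is smaller.

def entrance_pupil_diameter(focal_length_mm: float, f_number: float) -> float:
    """Return the entrance pupil diameter in millimetres."""
    return focal_length_mm / f_number

print(f"{entrance_pupil_diameter(80.0, 2.58):.1f} mm")  # about 31.0 mm
```

A wider EPD at the same focal length is precisely what the horizontal lens placement makes possible without raising the camera bump.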

Compact size. Thanks to the ALoP architecture, the module length can be shortened by 22% with respect to conventional folded camera optics. More importantly, ALoP achieves an especially low module height because it employs a 40˚-tilted prism reflection surface and a 10˚-tilted sensor assembly. Taken together, these reduced dimensions make for a lower-profile camera bump and a slimmer smartphone.

Aesthetics and ergonomics. Users generally find a thick smartphone camera bump objectionable. It not only makes the smartphone design unappealing but also harder to use when laid on a flat surface. Additionally, the shape of the lenses within the camera bump can be off-putting. In a smartphone using conventional folded camera optics, users see a rectangular prism that is cosmetically somewhat jarring to an otherwise sleek camera appearance. By contrast, in a smartphone adopting ALoP optics, users see only the expected circular lens shapes.

According to Younggyu Jeong, from Samsung’s Sensor Solutions Team, “smartphone cameras are rapidly evolving, but they haven’t yet surpassed DSLR capabilities due to size constraints. I believe new technologies will continue to emerge to close that gap, with technologies aimed at improving the image quality of telephoto cameras, which are disadvantaged in terms of form factor, at the forefront.”

“As a business,” he added, “we anticipate the commercialization of various technologies to reduce F/#, increase zoom magnification, and reduce module size. We will continue to combine the differentiated hardware solutions of ISOCELL with AI-based software solutions to expand users’ mobile camera experiences.”

New year’s resolutions for crew
https://www.provideocoalition.com/a-victorious-2025/
Thu, 02 Jan 2025 20:07:24 +0000

The aftermath of a party, with glitter on the floor and empty wine glasses on the coffee table.

Resolution is a word that gets more airtime than it should in a world where pocket-money cameras have four times the sharpness of classic cinema. In fact, ending that preoccupation with numbers should go on a list of things we’ll try to do in 2025. A new year’s not-resolution, perhaps.

Here are a few others.

Take user-generated content more seriously

If you’re perpetually engaged in senior positions on high-end projects at union rates, it’s easy to overlook changes in the wider industry. Of course, a lot of people are conspicuously not in that position at the moment, and there are a lot of things to blame: the peak and decay of streaming, the hangover of the pandemic, pricey-but-mediocre franchise films and streaming series, and industrial action which, whether we like it or not, certainly kicked an industry when it was down.

The fact that this all happened at the point where user-generated content was ascendant is no coincidence, but certain markets have been able to ignore that reality because YouTube has not so far been capable of funding an Alexa 65 and a set of DNAs. That probably hasn’t changed yet, although some shockingly high-end work is being done. The Dust channel has been putting out user-generated sci-fi for a while, and while much of Dust’s output might not quite satisfy Netflix subscribers, it is naive to assume that the status quo is eternal.

Snobbery is involved, though as a business consideration the rise of user-generated content is a question for the c-suite more than camera crews. Other things, though, are more in the hands of the craftspeople.

 

Young woman sitting in front of a ring light applying makeup.
Here we see the entire production, directorial and post team at work. Yes, when you and your one million buddies can put the wind up Disney using ten-dollar Aliexpress ring lights and iPhones, you are worth taking seriously. By Pexels user George Milton.

Recognise production design

Given film is so much a team sport, the lack of communication between departments is often slightly shocking. Perhaps that’s because it is also a very expensive artform, provoking a nervousness which tends to keep people firmly in lane. A film set is a place where it is often better to keep silent and be thought a fool. In the abstract, most people are keenly aware that there is no good cinematography without good production design, but that’s easily forgotten in the midst of pixel peeping the latest camera release (of which more anon).

Sometimes, production design means months of preparation. Sometimes, it just means picking the right time and place. Still, interdepartmental collaboration is sometimes more competitive than it should be. That’s particularly true on less financially replete productions, where it may be accepted that the show will not compete with blockbusters but that nobody wants that outcome to be their fault. So, camera refuses to unbend for the location manager, or vice versa, and the result is unnecessarily compromised.

We could equally assign a couple of new year’s resolutions to other departments, encouraging them to recognise the need to, say, put the camera somewhere it can see both actors at once. Ultimately, though, we should admit that too many people put too much importance on the camera, and not enough on what’s in front of it.

Be bold

Even lay audiences have started to notice that a certain proportion of mainstream film and TV has adopted a rather cautious approach to high contrast and saturated colour. Some of the accused productions have been comic book or animation adaptations, which probably ought to be the opposite. What’s even more counterintuitive is that this is invariably the product of digital cinematography, which was long held to be lacking in dynamic range – and low dynamic range means high contrast.

Grey concrete support pillars under a bridge, in grey mist.
This atmospheric photo by Pexels user Markus Spiske is pretty, but a lot of modern film and TV looks a bit like this even when it isn’t foggy.

Engineers have since given us fifteen-stop cameras, but there seems to be a lasting societal memory of early, less-capable electronic cinematography which makes people afraid of the extremes. It’s at least as likely that fiscal conservatism is leading to artistic conservatism around the sheer cost of nine-figure blockbusters. Nobody ever got in trouble for not crushing the blacks.

The result is an identifiable lack of punch in movies and TV shows that even determinedly nontechnical people are starting to notice. There’s a whole discussion to have about history, how things once looked, and how they look now, but with modern grading we can have anything. The solution is easy, if the producers will stand it: be not afraid of minimum and maximum densities – unless you’re grading for HDR, in which case absolutely be afraid, but that’s another issue.

Stop pixel peeping

And yes, like a lecturing parent frustrated with a chocolate-smeared child’s perpetual tendency to steal cookies, we do have to talk about that obsession with numeric specifications. It is the camera department’s equivalent of bargain vodka. Everyone knows it’s a bad idea, but it starts off fun and we can stop whenever we like. Soon, though, we realise that cameras are now almost too good and pixel peeping has facilitated a generation which thinks that swear-box words like “cinematic” and “painterly” are objectively measurable. Then it turns out that our attractively-priced metaphorical booze was mostly brake fluid, and people end up spending time counting megabits that should have been spent working out a mutually-beneficial compromise with the location manager.

Everyone knows that good equipment is necessary. Everyone knows it isn’t sufficient. Everyone also knows that pixel peeping is a bad habit and complaining about it almost feels redundant. But if we can make 2025 the year when film students use social media to discuss technique more than they discuss technology, that’ll be a minor victory.

The world’s first bendable monitor wins three CES 2025 awards
https://www.provideocoalition.com/the-worlds-first-bendable-monitor-wins-three-ces-2025-awards/
Thu, 02 Jan 2025 19:43:34 +0000

The new lineup includes the 45GX990A monitor – winner of three CES 2025 Innovation Awards, including the prestigious ‘Best of Innovation’ – and the world’s first 45-inch 5K2K OLED gaming monitor.

Don’t let the “gaming monitor” classification fool you. The two models announced may not be your next video or photo editing monitor, but the technology used will, no doubt, be added to future models and other segments. The 45GX990A and 45GX950A are 45-inch, 21:9 gaming monitors featuring ultra-high 5K2K resolution (5,120 x 2,160) – a first for OLED monitors, as of December 2024. According to LG, “their 21:9 aspect ratio offers a more immersive gaming experience than standard 16:9 displays, while maintaining better content compatibility than 32:9 monitors.” The LG UltraGear OLED Gaming Monitor (model 45GX950A), for example, is a “gaming monitor” that, as LG notes, offers “generous screen real estate”, making it “a great choice not only for gaming but for various different uses.”

Both products feature LG’s second-generation Dual-Mode, offering customizable aspect ratios (21:9 or 16:9) and picture sizes (39-, 34- or 27-inches) with one-touch switching between preset screen-resolution and refresh-rate combos.

The LG UltraGear OLED Bendable Gaming Monitor (model 45GX990A) is the world’s first 5K2K-resolution bendable OLED display. The 45-inch monitor can smoothly transition from completely flat to a 900R curvature within seconds, offering users incredible flexibility and more control over their gaming experience. Its upgraded Dual-Mode feature allows users to switch effortlessly between resolution and refresh rate presets, and customize the aspect ratio and picture size. With an ultra-fast 0.03ms (GtG) response time, the 45GX990A ensures smooth gameplay and heightened immersion.


Another standout model from the new GX9 lineup is the LG UltraGear OLED Gaming Monitor (model 45GX950A), which LG announces as “the World’s First 45-Inch 5K2K OLED Gaming Monitor with 800R Curvature and DisplayPort 2.1”. With its curved (800R), 21:9 format 5K2K-resolution self-lit 45-inch panel, this display delivers sharp, lifelike images with the stunning colors and exceptional contrast LG OLED products are known for. Its 4-side Virtually Borderless design and slim bezels help boost users’ sense of immersion while adding a sleek aesthetic to any setup. Boasting 125 pixels per inch (PPI) and an RGWB subpixel layout, the monitor improves the readability of in-game text and makes productivity tasks, such as editing documents or website content, that much easier.

Like its bendable sibling, the 45GX950A features Dual-Mode functionality with eight customizable configurations and supports DisplayPort 2.1, HDMI 2.1 and USB-C with 90W power delivery. This ensures seamless compatibility with the latest graphics cards and features such as variable refresh rate (VRR) while enabling convenient device charging. Certified by NVIDIA G-SYNC and AMD FreeSync Premium Pro, the monitor reduces screen tearing for a smoother, more responsive gaming experience.

For seamless streaming and immersive gaming

The brand-new LG UltraGear 39GX90SA is designed to deliver, LG claims, “stellar gaming and content-streaming experiences. Powered by webOS, it functions as a home entertainment hub, enabling users to access all their go-to streaming services without a PC or set-top box. Its 39-inch, 21:9 aspect ratio curved (800R) OLED display produces brilliant, nuanced colors and deep, dark blacks, making it perfect for AAA games and HDR movies and series. Equipped with USB Type-C ports, it offers convenient connectivity, and incorporates LG’s ergonomic and space-saving L-shaped stand for a clutter-free desk setup.”

“The UltraGear GX9 series sets a new standard for OLED gaming monitors, combining groundbreaking display technology with smart features that expand and enhance the user experience,” said YS Lee, vice president and head of the IT Business Unit, LG Media Entertainment Solution Company. “From the world’s first 5K2K OLED gaming monitors with second-generation Dual-Mode to smart gaming monitors with built-in webOS, the GX9 lineup pushes the boundaries to deliver maximum value and enjoyment for our customers.”

4:3 programs upscaled to HD or 4K: technical & aesthetic decisions
https://www.provideocoalition.com/43-programs-upscaled-to-hd-or-4k-technical-aesthetic-decisions/
Wed, 01 Jan 2025 20:31:14 +0000

I have previously covered some of the options when upscaling 4:3 programs to HD or 4K while retaining the original aspect ratio. One of those decisions is whether to eliminate headswitching via cropping or masking (or not). Another is whether to deliver the 4:3 material with a presumed pillarbox… or to place it inside the image of a classic CRT TV set. In this new article, I’ll cover the remaining decision when you want a presumed pillarbox: determining the actual resolution of the deliverable file. For example, if a 4:3 program is to be upscaled to 1080p while retaining the original aspect ratio with a presumed pillarbox, the 1080 number refers to 1080 vertical pixels, not horizontal pixels. We actually have two options in this case: 1440×1080 with all active square pixels on a borderless canvas… or a 1920×1080 canvas where the pillarbox is actually part of the canvas. Let’s explore the pros and cons of each.
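To make the arithmetic concrete, here is a minimal sketch (my own illustration, not taken from any editing tool) that computes the active 4:3 width and the pillarbox bar width for a given square-pixel canvas:

```python
# Pillarbox geometry for 4:3 content on a wider canvas, square pixels.
# For 1080p: active width = 1080 * 4/3 = 1440, so a 1920-wide canvas
# leaves (1920 - 1440) / 2 = 240-pixel bars on each side.

def pillarbox_geometry(canvas_width: int, canvas_height: int):
    """Return (active_width, bar_width) for 4:3 content centered on the canvas."""
    active_width = canvas_height * 4 // 3
    bar_width = (canvas_width - active_width) // 2
    return active_width, bar_width

print(pillarbox_geometry(1920, 1080))  # (1440, 240)
print(pillarbox_geometry(3840, 2160))  # (2880, 480), the UHD equivalent
```

The same relation explains both delivery options: a 1440×1080 file is just the active area alone, while a 1920×1080 file bakes the two 240-pixel bars into the canvas.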

|  | 1440×1080 (all pixels active, all square) with a borderless canvas | 1920×1080 canvas where the pillarbox is part of the canvas, plus 1440×1080 in the center of the canvas |
| --- | --- | --- |
| File size | Lower (more efficient) | Higher (less efficient) |
| Compatible with YouTube? | Not officially documented, but currently works, per the great example below. | Of course! Also adds the option of a logo in the pillarbox, like the wonderful example below. |
| Compatible with Netflix? | Many complaints: Netflix currently stretches 4:3 shows to 16:9. | Of course! |
| Compatible with Amazon Prime Direct? | Per the official documentation, 4:3 1440×1080 is not supported (by omission), although 4:3 640×480 is. I have not tested it personally. | Of course! |
| Set-top boxes and HDTV sets from a USB memory stick? | It varies. Some do. Others misinterpret it as anamorphic and stretch it to 16:9. | Of course! Also adds the option of a logo in the pillarbox. |

 

Great example of 1440×1080 (4:3) on YouTube

Despite the 16:9 thumbnail, the above video’s YouTube stats indicate that it is 1440×1080. When you play it, you’ll see that it’s 4:3.

 

Wonderful example of 4:3 (1440×1080) on a 1920×1080 canvas on YouTube

[Embedded video example]

The above video has a Norman Lear Effect logo in the canvas, in the pillarbox.

 

Feasibility of getting a deliverable file at 4:3 (1440×1080) w/square pixels

As I have explained, there are many advantages and disadvantages of delivering either 4:3 content on top of a wider 1920×1080 canvas with pillarbox… versus delivering a pure 4:3 (1440×1080) file with square pixels, depending upon the delivery venue. If you would like to use a pure 4:3 (1440×1080) square pixel file, hopefully your software will be able to fulfill your wish and deliver it properly, without ruining it.  However, if your software (like many) won’t give you that and you are forced to create an intermediary file with 4:3 content on top of a wider 1920×1080 canvas, then you may consider using the Apple Photos app on macOS to crop that intermediary video file properly and as desired. At first, it might seem wrong to do this with the Apple Photos app on macOS for video files, but Gary from MacMostVideo will prove that it is quite correct with the following video!

[Embedded video: MacMostVideo cropping tutorial]

At 3:26 of the above video, you will see that there is even a 4:3 cropping preset in Apple Photos for macOS to facilitate this process. (Gary mentions the square cropping preset, but the 4:3 cropping preset is also visible there.) If you click on the 4:3 cropping preset, your cropping tool will be locked to that aspect ratio to make it extremely simple.

If you know of any similar app that does true video cropping for Linux or Windows (truly changing the dimensions), please share its name in the comments below this article.
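One hedged suggestion for Linux and Windows readers (my own sketch, not something tested in this article’s workflow): the free, cross-platform FFmpeg command-line tool can do true cropping, actually changing the stored dimensions. Its crop filter takes `crop=out_w:out_h:x:y`, where x,y is the top-left corner of the region to keep. A minimal sketch that builds such a command, with placeholder file names:

```python
# Sketch: build an FFmpeg command that truly crops a pillarboxed
# 1920x1080 file down to its centered 1440x1080 (4:3) active area.
# File names here are placeholders; requires FFmpeg to actually run.
import subprocess

def crop_command(src: str, dst: str, out_w=1440, out_h=1080,
                 canvas_w=1920, canvas_h=1080) -> list[str]:
    x = (canvas_w - out_w) // 2   # 240 for a centered pillarbox
    y = (canvas_h - out_h) // 2   # 0 when the heights match
    return ["ffmpeg", "-i", src,
            "-vf", f"crop={out_w}:{out_h}:{x}:{y}",
            "-c:a", "copy", dst]

cmd = crop_command("pillarboxed_1920x1080.mov", "pure_4x3_1440x1080.mov")
print(" ".join(cmd))
# To actually run it (with FFmpeg installed):
# subprocess.run(cmd, check=True)
```

Note that this re-encodes the video stream (only the audio is copied), so whether it delivers your file “without ruining it” depends on the output codec settings you add.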

 

Related articles

 

(Re-)Subscribe for upcoming articles, reviews, radio shows, books and seminars/webinars

Stand by for upcoming articles, reviews, books and courses by subscribing to my bulletins.

In English:

En castellano:

Most of my current books are at books.AllanTepper.com, and also visit AllanTepper.com and radio.AllanTepper.com.

FTC disclosure

RØDE has not paid for this article. RØDE has sent Allan Tépper units for review. Some of the manufacturers listed above have contracted Tépper and/or TecnoTur LLC to carry out consulting and/or translations/localizations/transcreations. So far, none of the manufacturers listed above is/are sponsors of the TecnoTurBeyondPodcastingCapicúaFM or TuSaludSecreta programs, although they are welcome to do so, and some are, may be (or may have been) sponsors of ProVideo Coalition magazine. Some links to third parties listed in this article and/or on this web page may indirectly benefit TecnoTur LLC via affiliate programs. Allan Tépper’s opinions are his own. Allan Tépper is not liable for misuse or misunderstanding of information he shares.

After Effects News December 2024
https://www.provideocoalition.com/after-effects-news-december-2024/
Wed, 01 Jan 2025 06:54:45 +0000

After Effects 25.1 is the current version, with no big recent announcements. The December 2024 (25.1) release includes important fixes, and Adobe listed the remaining Known issues in After Effects.

New features include:

  • Accepts Lights switch for 3D layers
  • 3D model preview thumbnail
  • Cinema 4D 2025 upgrade in After Effects

After Effects 25.2 is in Beta; current features include:

  • Customize Transparency Grids
  • Customize panel background colors
  • Change anchor point in a more efficient way

 

PVC staff posted Looking back on 2024 and ahead to 2025 – A PVC Roundtable Discussion. After Effects specialist Chris Zwar bucked the crowdsourced opinion on HDR, saying that the spread of HDR and ACES has helped After Effects become increasingly accepted as a tool for high-end VFX. Unfortunately, HDR adoption is slow.

Chris shared related information in his video article Color Management Part 25: Corporate Brand Colors and ACES.

 

Envato Video looked at Motion Design Trends for 2025, hosting four designers to learn what’s inspiring them, what’s driving them crazy, and how motion graphics designers can stay at the top of their game.

 

The School of Motion yearly summary is up, links and all. See the 15-minute sample, but it’s worth catching the entire 6½ hours. Here’s Highlights from our EPIC 2024 Year-End Roundup Podcast! Who are those pixies working for? Could it be…..?

 

 

Jake In Motion posted 2 substantial tutorials in December: Advanced CC Ball Action Techniques /// After Effects Tutorial and Advanced Grunge Texture /// After Effects Tutorial.

 

 

Noble Kreative used CC Ball Action for 3D Procedural Audio Visualizer in After Effects and Procedural Fiery Flames and Wavy Backgrounds in After Effects.

 

 

Film Riot posted Insane Asteroid Impact in After Effects. They heavily massaged stock footage to get there.

 

 

ukramedia is back with The After Effects Marker Trick Nobody Talks About! See how to control multiple elements across After Effects compositions using a single marker. You can set up markers to manage text, colors, and other properties efficiently with expressions, to simplify updating multiple items at once, saving time on complex projects. By automating changes with markers, you can streamline your workflow and avoid repetitive manual adjustments across compositions.

 

 

Stephan Zammit makes headway on creating 3D glass in Realistic 3D Motion Graphics in After Effects Advanced Tutorial.

 

 

One of several recent tutorials on the theme is Premiere Gal’s AWESOME Futuristic HUD Effects in After Effects.

 

 

Boone Loves Video adds a quick 3D airplane for a map animation with Working with 3D Models in Adobe After Effects 2025! 🗺.

 

 

After Effects Basics shared 10 After Effects HACKS Every Motion Designer NEEDS!

Still going strong, Eran Stern upped the ante with 10 Ae Techniques You’ve Never Seen!

 

 

And, Texturelabs is back with 21 PHOTOSHOP TIPS – Easy Through Advanced! Meanwhile, in dekeNow’s 7 Techniques That Will Change How You Use Photoshop Forever, the first tip (Auto Color Options) immediately impressed.

 

 

Check out ObjPop Beta from the Plugin Play toolset for AE in Instant Image to 3d model in After Effects. There are already tools elsewhere that generate 3D animations from video footage, but this one works inside AE.

 

 

If you’ve dragged your feet on this, Default Cube says It’s Time For Gaussian Splatting // Tutorial. That one uses Blender, but there’s more on Gaussian splats in AE in After Effects Roundup October 2024.

 

 

Javier Mercedes shows you how to Reveal Text From Behind Object (Premiere Pro and After Effects).

 

 

Adobe Spectrum is the name for the new UI style in Creative Cloud. VideoRevealed has the justification for Premiere Pro (without nagging details) in Spectrum – Adobe’s Design System!

 

 

VFX and Chill hosted Talking Animals in After Effects (with JASON MURPHY). From talking dogs to practical puppets to CGI aliens, they chat about how Jason Murphy found himself directing movies full of so many visual effects shots. “How did After Effects fit into a production pipeline?”

 

 

Josh Toonen explains Why I Quit After Effects (to work in Hollywood VFX). You’re supposed to delete AE and learn Nuke for free (course fees not included). But really, you can use Nuke for free and, apparently, Unreal Engine is useful.

Looking back on 2024 and ahead to 2025 – A PVC Roundtable Discussion
https://www.provideocoalition.com/looking-back-on-2024-and-ahead-to-2025-a-pvc-roundtable-discussion/
Wed, 01 Jan 2025 04:00:23 +0000


While the hosts of the Alan Smithee podcast already discussed the evolving landscape in media and entertainment as 2024 draws to a close, there’s so much more to say about what happened in 2024 and what 2025 has in store for individual creators and the entire industry. Generative AI for video is everywhere, but how will that pervasiveness impact actual workflows? What were some of the busts in 2024? How did innovations in cameras and lenses make an impact? And what else are we going to see in 2025 beyond the development of AI animation/video tools?

Below is how various PVC writers explored those answers in a conversation that took shape over email. Keep the conversation going in the comments or on LinkedIn.

 

Scott Simmons

2024? Of course, the first thing you think about when recapping 2024 (and looking ahead to 2025) is that it was all about artificial intelligence. Everywhere you look at technology, media creation, and post-production, there is some mention of AI. The more I think about it, though, the more I feel like it was just a transitional year for AI. Generative AI products feel like old hat at this point. We’ve had “AI” built into our editing tools for what feels like a couple of years now. While we have made a lot of useful advancements, I don’t feel like any earth-shattering AI products shipped within any of our editing tools in 2024. Adobe’s Generative Extend in Premiere Pro is probably the most useful and most exciting AI advancement I’ve seen for video editors in a long time. But it’s still in beta, so Generative Extend can’t count until it begins shipping. Jumper, an AI-based video search tool that did ship, is a truly useful third-party tool. Adobe also dropped their “visual search” tool within the Premiere beta, so we know what’s coming next year as well, but it’s far from perfect at the time of this writing, and still in beta. I appreciate an AI tool that can help me search through tons of media, but if that tool returns too many results, I’m yet again flooded with too much information.

The other big AI-based video advancement is text-based generative video coming into its own this year. Some products shipped, albeit with quite a high price to make it truly usable. And even more went into preview or beta. 2025 will bring us some big advancements in generative video and that’s because we’re going to need them. What we saw shipping this year was underwhelming. A few major brands released AI-generated commercial spots or short films, and they were both unimpressive and creepy. I saw a few Generative AI short films make the rounds on social media, and all I could do after watching them was yawn. The people that seemed most excited by generative video (and most trumpeting its game-changing status) were a bunch of tech bros and social media hawks who didn’t really have anything to show other than more yelling on social media from their paid and verified accounts or their promoted posts.

Undoubtedly, AI will continue to infiltrate every corner of media creation. And if it can do things like make transcription more accurate or suggest truly usable rough cuts, then I think we can consider it a success. But for every minor workflow improvement that is actually useful in post-production, we’ll see two or three self-proclaimed game-changing technologies that end up just being … meh.

In the meantime, I’ll happily use the very affordable M4 Mac mini in the edit suite or a powerful M4 Mac Studio that speeds up the post-production process overall. We can all be thankful for cloud-based “hard drives,” like LucidLink, that do more for post-production workflow than most AI tools that have been thrown our way. Maybe 2025 will be the year of AI reality over AI hype.

Nikki Cole

While I’m aware that the issues we’ve been facing on the writing/producing/directing side don’t affect many of the tech teams on the surface, it has been rather earth-shattering on our side of things. Everyone fears losing their jobs to AI, and with good reason. I am in negotiations with a European broadcaster who really likes the rather whacky travel series I’m developing but, like so many broadcasters now, simply doesn’t have enough cash to fully commission it. They flat out told me that I would write the first episode, and they would simply feed the rest of the episodes into AI so they wouldn’t have to pay me to write more. I almost choked when they said that matter-of-factly, and I responded by saying that this show is comedy and AI can’t write comedy! Their response was a simple shrug of the shoulders. Devastating for me, and with such an obvious lack of integrity on their part, I’m now concerned that they are going ahead with that plan even though we don’t have a deal in place. So, from post’s perspective, that is one more project that isn’t being brought into the suite, because I can’t even get it off the ground to shoot it.

As members of various guilds and associations, we are all learning about the magnificent array of tools we now have at our fingertips for making pitches, sizzle reels, visual references, etc. It really is astonishing what I can now do. I’m learning as much as I can, while I just can’t shake the guilt of knowing that I’ll be putting the great graphic designers I used to hire out of work. If budgets were a little better, I would, of course, hire a graphic designer with those skills, but as things stand today, I can’t afford to do that.

It’s definitely a fascinating and perplexing time!

Oliver Peters

AI in various forms gets the press, but in most cases it will continue to be more marketing hype than anything else. Useful? Yes. True AI? Often, no. However, tools like Jumper can be quite useful for many editors, though some aspects, like text search, have already existed for years in PhraseFind (Avid Media Composer).

There are many legal questions surrounding generative AI content creation. Some of these issues may be resolved in 2025, but my gut feeling is that legal claims will only just start rolling in for real in this coming year. Vocal talent, image manipulation, written scripts, and music creation will all be areas requiring legal clarification.

On the plus side, many developers – especially video and audio plugin developers – are using algorithmic processes (sometimes based on AI) to combine multiple complex functions into simple one-knob style tools. Musik Hack and Sonible are two audio developers leading the way in this area.

One of the less glitzy developments is the future (or not) of post. Many editors in major centers for film and TV production have reported a lack of gigs for months. Odds are this will continue and not reverse in 2025. The role of (or even need for) the traditional post facility is being challenged. Many editors will need to find ways to reinvent themselves in 2025. As many businesses enforce return-to-office policies, will editors find that remote work becomes less acceptable to directors?

When it comes to NLEs, Adobe Premiere Pro and Avid Media Composer will continue to be the dominant editing tools when collaboration or project compatibility is part of the criteria. Apple Final Cut Pro will remain strong among independent content creators. Blackmagic Design DaVinci Resolve will remain strong in color and finishing/online editorial. It will also be the tool for many in social media as an alternative to Premiere Pro or Final Cut Pro.

The “cloud” will continue to be the big marketing push for many companies. However, for most users and facilities, the internet pipes still make it impractical to work effectively via cloud services with full-resolution media in real time. Of course, full-resolution media is also getting larger, not lighter, which is hardly conducive to cloud workflows.

A big bust in this past year has been the Apple Vision Pro. Like previous attempts at immersive, 3D, and 360-degree technologies, there simply is no large, sustainable, mass market use case for it, outside of gaming or special venues. As others have predicted, Apple will likely re-imagine the product into a cheaper, less capable variant.

Another bust is HDR video. HDR tools exist in many modern cameras and even smartphones. HDR is a deliverable for Netflix originals and even optional for YouTube. Yet the vast majority of content that’s created and consumed continues in good old Rec 709. 2025 isn’t going to change that.

2025 will be a year when the rubber meets the road. This is especially true with Adobe, who is adding generative AI for video and color management into Premiere Pro. So far, the results are imperfect. Will they be perfected in 2025? We’ll see.

Iain Anderson

The last twelve months have brought a huge amount of change. Generative AI might have had the headlines, but simple, short clips don’t just magically scale up to a 30-second spot, nor anything longer. One “insane, mind-blowing” short that recently became popular couldn’t even manage consistent clothing for its lead, let alone any emotion, dialogue or plot. Gen AI remains a tool, not a complete solution for anything of value.

On the other hand, assistive AI has certainly grown a little this year. Final Cut Pro added automatic captioning (finally!) and the Magnetic Mask, Premiere Pro has several interesting things in beta, Jumper provides useful visual and text-based media search today, Strada looks set to do the same thing in the new year, and several other web-based tools offer automatic cutting and organizing of various kinds. But I suspect there’s a larger change coming soon — and it starts with smarter computer-based assistants.

Google Gemini is the first of a new class of voice-based AI assistants that you can ask for help while you use your computer, and a demo showed it (imperfectly) answering questions about DaVinci Resolve’s interface. This has many implications for anyone learning complex software like NLEs, and as I make a chunk of my income from teaching people that, it’s getting personal. Still, training has been on the decline for years. Most people don’t take full courses, but just jump in and hit YouTube when they get stuck. C’est la vie.

While assistant AIs will become popular, AIs will eventually control our computers directly, and coders can get a taste of this today. Very recently, I’ve found ChatGPT helpful for creating a small app for Apple Vision Pro, for writing scripts to control Adobe apps, and also for converting captions into cuts in Final Cut Pro, via CommandPost. Automation is best for small, supervised tasks, but that’s what assistants do.

Early in 2025, an upgraded Siri will be able to directly control any feature that a developer exposes, enabling more complex interactions between apps. As more AIs become able to interpret what they see on our screens, they’ll be able to use all our apps quicker than we can. In video production, the roles of editor and producer will blur a little further, as more people are able to do more tasks without specialist help.

But AI isn’t the whole story here, and in fact I think the biggest threat to video professionals is that today, not as many people need or want our services. High-end production stalled with the pandemic, and many production professionals are still short of work. As streaming ascends (even at a financial loss), broadcast TV is dying worldwide, with flow-on effects for traditional TV advertising. Viewing habits have changed, and will keep changing.

At the lower end, demand for quick, cheap vertical social media video has cut into requests for traditional, well-made landscape video for client websites or YouTube. Ads that look too nice are instantly recognised as such and swiped away, leading to a rise in “authentic” content, with minimal effort expended. It’s hard to make a living as a professional when clients don’t want content that looks “too professional”, and hopefully this particular pendulum swings back around. With luck, enough clients will realise that if everyone does the same thing, nobody stands out.

Personally, the most exciting thing this year for me is the Apple Vision Pro. While it hasn’t become a mainstream product, that was never going to happen at its current high price. Today, it’s an expensive, hard-to-share glimpse into the future, and hopefully the state-of-the-art displays inside become cheaper soon. It’ll be a slow road, and though AR glasses cannot bring the same level of immersion, they could become another popular way to enjoy video.

In 2024, the Apple Vision Pro was the only device to make my jaw drop repeatedly, and most of those moments have come from great 3D video content, in Immersive (180°) or Spatial (in a frame) flavors. Blackmagic’s upcoming URSA Cine Immersive camera promises enough pixels to accurately capture reality — 8160 x 7200 x 2 at 90fps — and that’s something truly novel. While I’m lucky to have an Apple Vision Pro today, I hope all this tech is in reach of everyone in a few years, because it really does open up a whole new frontier for us to explore.

P.S. If anyone would like me to document the most beautiful places in the world in immersive 3D, let me know?

Allan Tépper

In 2024, we saw more innovation in both audio-only and audio-video switchers/mixers/streamers, and the democratization of 32-bit float audio recording and audio DSP in microphones and mixers. I expect this to continue in 2025. In 2024, both Blackmagic and RØDE revolutionized ENG production with smartphones. In 2024, I began my series about ideal digitization and conversion of legacy analog color-under formats including VHS, S-VHS, 8mm, Hi-8mm and U-Matic. I discussed the responsibility of properly handling black level (pedestal-setup/7.5 or zero IRE) at the critical analog-to-digital conversion moment, and the proper treatment and ideal methods to deinterlace while preserving the original movement (temporal resolution). That includes ideal conversion from 50i to 50p or 59.94i to 59.94p, as well as ideal conversion from non-square pixels to square pixels, upscaling to HD’s or 4K’s vertical resolution with software and hardware, preservation of the original 4:3 aspect ratio (or not), optional cropping of headswitching, noise reduction and more. All of this will continue in 2025, together with new coverage of bitcoin hardware wallets and associated services.
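As an illustrative sketch only (this is not the specific workflow documented in the series), a conversion of the kind described above — 59.94i NTSC capture to 59.94p square-pixel video — can be approximated with a single ffmpeg command; the filter choices, file names, and codec settings here are assumptions:

```
# Hypothetical example: deinterlace 59.94i by doubling the field rate to
# 59.94p (preserving the original temporal resolution), rescale 720x480
# non-square pixels to 640x480 square pixels (keeping the 4:3 aspect
# ratio), and pass the audio through untouched.
ffmpeg -i vhs_capture.mov \
  -vf "bwdif=mode=send_field:parity=auto,scale=640:480:flags=lanczos,setsar=1" \
  -c:v prores_ks -profile:v 3 -c:a copy vhs_5994p_square.mov
```

The `bwdif` filter in `send_field` mode outputs one frame per field, which is what preserves the original movement; a frame-rate-halving deinterlacer would discard half the temporal information.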

In 2024, at TecnoTur we helped many more authors do wide distribution of their books, ebooks and audiobooks. We guided them, whether they wanted to use the author’s own voice, a professional voice or an AI voice. We produced audiobooks in Castilian, English and Italian. We also helped them deliver their audiobooks (with self-distribution from the book’s own website) in M4B format with end-user navigation of chapters. I expect this to expand even more in 2025.

Brian Hallett

This past year, we saw many innovations in cameras and lenses. For cameras, we are now witnessing the market begin to move toward making medium-format acquisition easier for “everyday filmmakers” rather than just the top tier. Arri announced its new ALEXA 265, a digital 65mm camera designed to be compact and lightweight. The new ALEXA 265 is one-third the size of the original ALEXA 65 while slightly larger than the still-new ALEXA 35. Yet the ALEXA 265 is only available as a rental.

Regarding accessibility for filmmakers, the ALEXA 265 will not be the easier camera to get one’s hands on; that distinction will belong to the Blackmagic URSA CINE 17K 65. The Blackmagic URSA CINE 17K 65 is exactly the kind of camera Blackmagic Design and CEO Grant Petty want to get into the hands of filmmakers worldwide. Blackmagic Design has a long history of bringing high-level features and tools to cameras at inexpensive prices. It is the company that bought DaVinci Resolve and then gave it away for free with camera purchases. They brought raw recording to inexpensive cameras early on in the camera revolution. Now, Blackmagic Design sees 65mm as the next feature, once reserved for the top-tier exclusive club of cinematographers, that it can deliver to everyone at a relatively decent price of $29,995.00, so expect to see the Blackmagic URSA CINE 17K 65 at rental houses sooner rather than later. I also wouldn’t let the 17K resolution bother you too much. Blackmagic RAW is a great codec that is a breeze to edit compared to processor-heavy compressed codecs.

We also saw Nikon purchase RED, but we have not yet seen cross-tech innovation between those companies. In years to come, we will see Nikon add RED tech to its cameras and vice versa.

Sony delivered the new Sony BURANO, and I’m seeing the camera at rental houses now. More notably, though, I see more owner/operators with the BURANO than with anything else. It appears Sony has a great camera that will serve owner/operators for a long time.

I feel like I saw a ton of new lenses in 2024, from ARRI’s new Ensō Primes to Viltrox’s anamorphic lenses. We see Chinese lenses coming in from every direction, which is good. More competition benefits all of us and keeps prices competitive. Sigma delivered their new 28-45mm f/1.8, the first full-frame f/1.8 maximum-aperture zoom lens. I tested this lens on a Sony mirrorless, and it felt like the kind of lens you can leave on all day and have everything covered. The depth of field was great in every shot. Sigma has delivered a series of lenses for mirrorless E-Mount and L-Mount cameras at an astounding pace, from 500mm down to 15mm.

Canon was miserly with its RF mount. To me, Canon is protecting its investment in its lens innovation by restricting who can make RF-mount lenses. I wish they wouldn’t do such a thing. It seems counter-intuitive to me to block others from making lenses that work on their new cameras. What has happened is that all those other lens makers are making lenses specific to E-Mount and L-Mount. In essence, if you are a BURANO shooter, you have more lenses available than a Canon C400 shooter. The story I tell myself is that if I had to buy a camera today, which lenses I could use would be part of that calculus.

On artificial intelligence (AI), we cannot discount how manufacturers use it to innovate more quickly, shortening the timeframe from concept to final product while saving money. As a creative, I use AI and think of it this way: there will be creatives who embrace AI and those who don’t or won’t, and that will be the differentiator in the long run. I already benefit from AI, from initial script generation (which is only a starting point) to caption generation and transcription to photo editing in Lightroom.

Damien Demolder

The production of high-quality video used to be restricted by the cost of the kit needed and the skills required to operate that equipment. Those two things helped to regulate the number of people in the market. The last year, though, has seen a remarkable acceleration in the downward pricing trend of items that used to cost a lot of money, as well as an increase in the simplification and convenience of their handling. I tend to review equipment at the lower end of the price scale, an area that has seen a number of surprising products in the last twelve months. These days, anamorphic lenses are almost commonplace, hovering around the $1,000 price point, and LED lights have simultaneously become cheaper and much more advanced.

Popular camera brands that until recently only dipped their toe into the video market now offer their own Log profiles, encourage their users to record raw footage to external devices and provide full 35mm-frame recording as though it is the expected norm. LUTs can be created in a mobile phone app and uploaded to cameras to be baked into the footage, and carefree 32-bit float audio can be recorded directly to the video soundtrack for a few hundred dollars and a decent mic. Modern image stabilisation systems, available in very reasonably priced mirrorless cameras, mean we can now walk and film without a Steadicam, and best-quality footage can be streamed to a tiny SSD for long shoots and fast editing. Earlier this year I reviewed a sub-$500 Hollyland wireless video transmitter system that, with no technical set-up, can send 4K video from an HDMI-connected camera to four monitors or recorders – or to your phone, where the footage can be recorded in FHD. I also reviewed the Zhiyun Molus B500 LED light that now provides 500W worth of bi-coloured illumination for less than $600, and the market is getting flooded with small, powerful bi- and full-colour LED lights that run on mini V-Lock batteries – or their own internal rechargeable batteries.

Now, a lot of these products aren’t perfect and have limitations, but no sooner have the early adopters complained about the faults in the short-run first batch than the manufacturers have altered the design and fixed the issue to make sure the second phase will meet, and often exceed, expectations. We can now have Lidar AF systems for manual lenses, autofocus anamorphics, cheap gimbals so good you’d think the footage was recorded with a drone – even lighting stands, heavy duty tripods and rigging gear are getting cheaper and better at the same time.

Of course, all this is great news for low-budget productions, students and those just starting out, but it also means anyone can believe they are a filmmaker. With the demand for video across social media platforms and websites ever increasing, you’d think that would be great news for the industry, but much of that demand is being eaten up by those with no formal training and some clever kit. Not all the content looks or sounds very good, but often that matters less than keeping to a tiny budget. Those who think they are filmmakers can easily convince those who can’t imagine what their film could look like that they are.

I expect 2025 will bring us more of this – better, more advanced and easier-to-use kit at lower prices, and more people using it. I didn’t need to consult my crystal ball for this prediction. Every year has brought the same gradual development since I joined the industry 28 years ago, but once again it has taken us to places I really hadn’t expected. I expect to be surprised again.

Nick Lear

I found 2024 to involve a lot of waiting. Waiting for the industry to right itself, waiting for the latest AI tool to come out of beta, waiting for companies to put stability before bells and whistles. That, I fear, may be rather a long wait. I also found I had very mixed feelings about AI – on the one hand I was excited to see what advances the technology could bring, and on the other hand saddened by the constant signs that profit is put before people – whether in plagiarising artists’ work or in the big Hollywood studios wanting the digital rights of actors.

Generative AI impresses me whenever I see it – and I think we have to acknowledge the rate of improvement in the last few years – but I also struggle to see where it can fit in my workflow. I am quite looking forward to using it in pre-production – testing out shots before a shoot or while making development sizzles. To that end, it was great to see OpenAI’s text-to-video tool Sora finally come into the public’s hands this month, albeit not in the UK or Europe. Recently, Google’s Veo 2 has been hyped as being much more realistic, but it’s still in beta and you have to live in the US to get on the waiting list. Adobe’s Firefly is also waitlist-only – so there’s more waiting to be done – yet it could well be that 2025 brings all of these tools into our hands and we get to see what people will really do with them outside of a select few.

On the PC hardware front, marketing teams went into overdrive this year to sell us on new “AI” chips. Intel tried to convince us that we needed an NPU (neural processing unit) to run machine-learning operations when there were only marginal gains over using the graphics card we already had. And Microsoft tried to push people in the same direction, requiring specific new hardware to qualify for Copilot+. Both companies are trying to catch up with Apple on battery life, which I’m all for, but I wish they could be more straightforward about how they present it.

I continued to get a lot out of machine-learning-based tools, whether using a well-trained voice model in Eleven Labs or upscaling photos and video with Topaz’s software. I also loved the improvements Adobe made to their online version of Enhance Speech, which rescued some bad audio I had to work with. Some of these tools are starting to mature – they can make my life easier and enable me to present better work to my clients, which is all I want at the end of the day.

Jeff Foster

For me, 2024 brought lots of personal life challenges, which precluded the kind of deep-dive involvement in the world of AI that I managed in the previous two years, but I did catch up on some of the more advanced generative AI video/animation tools to explore and demo for the CreativePro Design + AI Summit in early December. I created the entire 40-minute session with generative AI tools, including my walking intro chat and the faux TED Talk presentation, using tools like HeyGen, ElevenLabs, Midjourney and Adobe After Effects. As usual, I did a complete breakdown of my process as a reveal toward the end of my session, and will be sharing this process in a PVC-exclusive article/video in January 2025.

The rate of development of AI animation/video tools such as Runway, Hailuo, Leonardo and others working on text/image-to-video is astounding. I think we’re going to see a major shift in development in this area in 2025.

I’m also exploring generative AI music and audio tools, including hardware/software solutions, in the coming months, and expect to see some amazing growth in quality, production workflows and accessibility for a public of largely non-musicians.

As usual, I’m only exploring the tools and seeing how they can be utilized, but I am also concerned about the direction all of this is heading and how it affects content creators and producers in the end. I always take the position that these are tools we can utilize in our workflows (or at least understand how they work) or choose to ignore and hope they’ll just go away like other fads… which they won’t.

Michelle DeLateur

Christmas morning is a flurry of impatience: an explosion of expectation, surprise, and wrapping-paper scraps. I used to describe NAB as “camera Christmas” to newcomers. But with announcements coming by email and NAB becoming primarily about events, meetings, and conversations, that giddy elf feeling I used to get from seeing the new floor models has turned into excitement at seeing familiar faces.

So where has our impatience shifted? It seems we now find ourselves in a waiting game for presents in 2025.

That new Adobe Gen-AI video model? Hop on the waitlist. Hoping to see more content on the Vision Pro, perhaps with the Blackmagic URSA Cine Immersive? Not yet. Excited about Sigma making RF lenses? They started with APS-C.

Patience is not one of our virtues. With shorter camera shelf lives and expected upgrade times, we assume we will hold onto our gear for less time than ever and are always ready for a change. Apple’s releases make our months-old laptops seem slow. A new AI tool comes out nearly every day.

Video editors are scrambling for the few job openings, adding to their skill sets to be ready for positions, or transitioning to short-term jobs outside of video, all alongside the anxiety of AI threatening to take over this realm. We rejoiced when transcription was replaced by a robot. We winced when an AI program showed it could make viral-ready cuts.

Just because we are forced to wait does not mean we are forced to be behind. It is cheaper than ever to start a photography journey. Mastering the current tools can make you a faster editor. Teaching yourself and others can help create new stories. While I personally don’t fully believe in the iPhone’s filmmaking abilities, there ARE plenty of tools to turn the thing that’s always on you into a filmmaking device.

In 2024, we were forced to wait. But we are not good at waiting. That’s the same tenacity and ambition that makes us good at storytelling. It’s only a matter of time. It’s all a matter of time. So go forth and use your time in 2025.

Chris Zwar

For me, 2024 was a case of ‘the more things change, the more they stay the same’. I had a busy and productive year working for a range of clients, some of whom I hadn’t worked with for many years. It’s nice to reconnect with former teams, and I found it interesting that briefs, pitches and deliveries hadn’t changed a great deal with time.

The biggest change for me was finally investing in a new home workstation. Since Covid, I have been working from home 100%, but I was using an older computer that was never intended for daily projects. Going through the process of choosing components, ordering them and then assembling a dedicated work machine was very rewarding, and something I should have done sooner. Now that my home office has several machines connected to a NAS with 10-gig ethernet, I have more capacity at home than some of the studios I freelance for – something I would have found inconceivable only a few years ago!

Technically, it seems like the biggest impact AI has had so far has been providing a topic for people to write about. Although AI tools continue to improve, and I use AI-based tools like Topaz & Rotobrush regularly, I’m not aware of AI having had any impact on the creative side of the projects I’ve worked on.

From my perspective as an After Effects specialist, the spread of HDR & ACES has helped After Effects become increasingly accepted as a tool for high-end VFX. The vast majority of feature films and premium TV shows are composited in Nuke pipelines, but with ACES having been built into AE for over a year, I’m now doing regular cleanup, keying and compositing in After Effects on projects that wouldn’t have been available to me before.

Art of the Frame Podcast: Editors on Editing with “Nosferatu” editor Louise Ford
https://www.provideocoalition.com/art-of-the-frame-podcast-editors-on-editing-with-nosferatu-editor-louise-ford/
Tue, 31 Dec 2024 16:41:50 +0000

In the final episode of Editors on Editing for 2024, Glenn Garland interviews Louise Ford about her experience editing ‘Nosferatu,’ directed by Robert Eggers. Ford highlights the film’s lengthy development, explaining that Eggers’ obsession with the story began in childhood, leading to a ten-year journey to bring this remake of a seminal film to life. Their discussion also focuses on the film’s intricate editing process, revealing the collaboration among the crew as well as Ford’s approach to editing.

“It’s all just gut,” Ford said.

The Art of the Frame podcast is available on Apple Podcasts, Spotify, Google Podcasts, Anchor and many more platforms. If you like the podcast, make sure to subscribe so you don’t miss future episodes, and please leave a review so more people can find our show!

Support this podcast: https://podcasters.spotify.com/pod/show/artofthecut/support

Tool Tip Tuesday for Adobe Premiere Pro: What we’re most looking forward to in the next year in Premiere.
https://www.provideocoalition.com/tool-tip-tuesday-for-adobe-premiere-pro-what-were-most-looking-forward-to-in-the-next-year-in-premiere/
Tue, 31 Dec 2024 15:19:18 +0000


Welcome to Tooltip Tuesday for Adobe Premiere Pro on ProVideoCoalition.

Last week, Scott and Jeff showed what features they were most thankful for in the 2024 updates.

In this last post on New Year’s Eve, we are doubling up again on what we’re most excited about and looking forward to in the next year.

Much of the Adobe future is plotted out in their public betas, available through the Creative Cloud application. You can install these concurrently with your existing versions of Adobe apps. Both of us are updating the betas of Premiere Pro (and Media Encoder, Audition, Photoshop, and After Effects) on a daily basis.


You can find a list of the What’s New in Beta by clicking on the Laboratory Beaker icon in the top right of Premiere Pro.


We don’t feel these versions are safe for production environments but rather allow you to play with the new tools early.

What is Jeff looking forward to?

This was a very hard choice for me.

I’m a hundred percent going to cheat and tell you that my list of what I’m excited about includes key functional improvements (multi-threaded audio conforming, OpenTimelineIO (OTIO) import and export, and the new color management) along with the potential of AI features (the new search panel with Visual Search, Generative Clip Extension, and Smart Masking).

But I have to pick one – and my criteria is to pick the one that I think will make the greatest difference in everyday editing. (This is my rule – Scott may pick a feature for different reasons!)

If Adobe can nail it, it’s going to be Generative Extend.

There are tricks we’ve learned for when the footage is just too short. Generative Extend can add up to two seconds to the front and end of a clip. Any clip.

It could be a moment before an actor speaks. It could be extending that pan just a little longer. It could be taking that small snippet of room tone and extending it. Generative Extend will also handle audio, with some key restrictions around music and speech.

This is the one feature I see myself using in every edit to get the timing of moments right to give me that extra beat that the source footage doesn’t have. Whether it’s for narrative or documentary work, I can’t wait to make this part of my everyday editorial.

Left image: original end frame, right image: extended "perfect" end frame.

Nevertheless, every single one of the features I mentioned above is crazy useful. But this is the one that keeps making me want to go back to old edits and make adjustments.

The caveat will be how clients and audiences feel knowing that the material contains some AI-generated content, and what that means for the authenticity of editorial intention. But that’s a topic for a different discussion.

What is Scott looking forward to?

While I have to agree with Jeff that Generative Extend is a pretty amazing offering coming in the next year (Did you read about my Burning Questions about Generative Extend?), the particular item that I’m looking forward to is rather small in execution but large in functionality. I’m really looking forward to dynamic waveforms making their appearance in the shipping version of Premiere. Dynamic waveforms are now available in the Premiere beta, and they give you real-time feedback on the waveform scaling of your audio clips in the Premiere timeline.

This makes the most sense when you see them in action.

As you see in the GIF above, as you adjust audio keyframe levels, the waveforms scale up or down to reflect the volume change you’ve made. And this doesn’t just work with “rubber band” keyframing. Dynamic waveforms are reflected in most places you can change the volume in Premiere, such as the Essential Sound panel, the Effect Controls panel, the new Properties panel, or clip-level audio gain. It’s simple, useful, very responsive, and will be a much-welcomed addition to Adobe Premiere Pro when it ships in the full release in 2025.

This series is courtesy of Adobe. 

Art of the Frame Podcast: Editors on Editing with “Dune: Part 2” Editor Joe Walker
https://www.provideocoalition.com/art-of-the-frame-podcast-editors-on-editing-with-dune-part-2-editor-joe-walker/
Mon, 30 Dec 2024 17:36:48 +0000

In this episode of Editors on Editing, Glenn Garland interviews Joe Walker to discuss the intricate editing process behind “Dune: Part Two,” highlighting Walker’s collaboration with director Denis Villeneuve. They discuss the technical aspects of editing, but also emphasize the importance of storytelling, character depth, and visual and auditory dynamics in film. It’s a point Walker calls out as especially important when it comes to preserving emotional weight through sustained shots that allow audiences to meaningfully connect with the characters.

“You can have the best battle ever, but if you don’t care about the characters, you’ve got nothing.”

The Art of the Frame podcast is available on Apple Podcasts, Spotify, Google Podcasts, Anchor, and many more platforms. If you like the podcast, make sure to subscribe so you don’t miss future episodes, and please leave a review so more people can find our show!

Support this podcast: https://podcasters.spotify.com/pod/show/artofthecut/support

How Princess Cruises Found Success in the Remote Work Revolution
https://www.provideocoalition.com/how-princess-cruises-found-success-in-the-remote-work-revolution/
Mon, 30 Dec 2024 13:00:53 +0000

Taking a cruise isn’t necessarily synonymous with adventure travel. But the Princess Cruises trips to Alaska open the door to the kinds of adventures that only they can provide—adventures that allow so many people to create their own unique stories.

To create our latest customer story, we accompanied the video team that produced the “once in a decade” marketing content for their Alaska sea and land packages. Their small and intrepid team traveled to the remotest of locations with seven cameras, captured 15TB of footage, and worked with a globally distributed post-production crew and stakeholders to deliver what their CMO declared “a home run.”

In this installment of Made in Frame, we meet Senior Manager of Media Production Scott Martin and editor Kristin Rogers, who took us through their journey to Denali National Park and beyond, as they explored the far reaches of Alaska—and the features in the all-new Frame.io—to create memorable results.

Cruising altitude

With a fleet of 18 ships serving 330 cruise destinations across more than 100 countries on all seven continents, Princess Cruises, owned by Carnival Corporation (the largest cruise operator worldwide), is ranked #1 for Alaskan adventures, bringing the most people to the region. After 55 years of operating in the territory, Princess not only understands how to navigate the waters of Alaska, it also provides travel experiences on land by train, glacier trips by air, and five inland lodge destinations.

Princess Cruises provides travel experiences that include glacier trips by air in Alaska.

Still, Princess knows that they have to consistently deliver the kind of eye-popping content that ensures their continued success. And they have exactly the right team to do it. More specifically, they’ve been able to craft the perfect team, irrespective of location, to deliver their most ambitious shoot to date—with the help of Frame.io and their all-Adobe creative workflow.

A 15-year dream

Led by Scott Martin, this project represents an ambitious rebranding. The shoot was designed to humanize The Great Land for people who haven’t personally experienced it, by spotlighting authentic, emotionally resonant moments and highlighting the magnificent landscapes that only Princess provides access to; a shoot of this magnitude comes along but once in a decade. The deliverables, including brand-new video and photography assets, required them to capture extensive B-roll and deploy A and B photo teams, a drone team, and a team capturing product content.

The project is the culmination of a 15-year dream for Scott, who has specialized in making films in remote locations since the beginning of his career. What makes that pursuit so challenging for most people is what fuels his desire to take big creative swings. “When I was in my twenties and I was spending a lot of time in Alaska, I learned valuable filmmaking lessons from Alaskans because they just do more with less. There’s a tenacity. There’s a toughness that I think we, as filmmakers, can learn from,” he says.

One of the most important images for the team to capture was Denali at sunset and sunrise. Seeing it is the reason so many people travel to Alaska and, for Princess, including it in the campaign was a must. “The most challenging thing about Denali is she’s only visible 30 percent of the time. So 70 percent of the time, that gorgeous mountain is unfilmable,” Scott says. “What I heard the most was that this was going to be impossible, and for our team, that’s the strongest motivator. As soon as I heard the word ‘impossible’ I knew we were going to be standing at Denali, mile 62, sunrise and sunset with beautiful light. And we were going to nail it.”

Denali delivers the money shots.

Which, of course, they did. But not without intense preparation, the confidence that comes with having spent more than 300 days over the course of a career in that environment, and a lot of luck with the weather.

“We planned and practiced meticulously and had an A, B, C, D, and E option for everything,” Scott says. “The number one thing that we could really lean on was that we had so much experience in Alaska. We knew Mother Nature was going to be a challenge, and we accepted that challenge. I think that’s what’s beautiful about filming in Alaska. If you want control, don’t film in Alaska. If you want the idea of being ready at all times, or if you want a filmmaking experience that’s ripe with adversity and changes and with being on your toes all day, then Alaska is the place you want to go.”

Charting the course

But perhaps even more important than where they were going was why they were going. “We’ve accumulated a lot of trust over the years and have a track record of going into very remote environments with small teams, but it’s still important to make sure the stakeholders feel that it’s a strong investment,” Scott states.

For a shoot of this importance, the team needed to make sure they were as creatively prepared and aligned as they were logistically. In addition to stakeholders, they had four staff members along with approximately 35 contractors to support the production component. Not to mention the dispersed nature of the team—with members from Colorado and Texas to London and beyond. It almost goes without saying that Frame.io was vital to keeping everyone in the loop from the very beginning of pre-production.

With a roughly 100-day runway to prepare, the team needed to work with producers and stakeholders to make sure that everybody was on the same page. “Those 100 days are focused on aligning the purpose and objectives of the shoot. We spent a tremendous amount of time on that before we got into the fun, creative aspect of it,” Scott says.

Frame.io functions as a centralized platform for all assets and information from pre-production through delivery.

It’s why Frame.io is so critical to pre-production. “We have to align on our purpose in pre-production versus in post-production. That was something I learned too late as a filmmaker. The stories we’re telling today are largely leveraging a 10- or 15-year asset library. Frame.io really helps us understand at the beginning exactly what we have so we know exactly what we need to go get,” Scott adds.

It’s also why the team is fully invested in Adobe Creative Cloud, according to Scott. “I use Photoshop heavily in pre-production because I spend a lot of time putting screengrabs together or storyboards for our DPs, producers, and crews. Often, especially when I’m using an image from a very specific location, there might be something that will detract from what the creative will see. So I’ll spend time in Photoshop erasing things because I want them focused on something very granular,” he says.

Another Creative Cloud tool that the team relies on during pre-production is Adobe InDesign. Marketing Creative Director Dani Bartov uses it extensively to “build inspiration boards and visual reference for everything from action and framing for a given scene to wardrobe, hair, and makeup,” she states.

Momentum is fire

They set out in July, when the days were long and the opportunity to capture Denali at both sunrise and sunset was packed into an inconvenient window from midnight to 4:00 am. But that didn’t deter Scott.

The team was equipped with seven cameras, including an Alexa 35, a helicopter outfitted with a RED Raptor, and an 8mm film camera. “They ended up shooting about 15TB of footage, and we were actually able to deliver a select reel 24 hours after wrapping, which is pretty incredible,” Kristin says.

Most importantly, when Scott is out in the wild he doesn’t just need a way to collaborate closely with Kristin and stakeholders; he needs to know what he’s got in the can and what he still needs—while the clock is ticking.

Scott’s selects help Kristin work more efficiently when he’s out on location.

“Scott and I work normal daytime hours when we’re editing other projects for Princess,” Kristin says. “But here he’s out shooting for sometimes 18 hours a day. I know this because I’d get notifications from Frame.io at 3:00 am [when she was working from Austin during week one]. They have so much going on—drone teams that are driving almost 3000 miles—and he’s managing all of that. With Frame.io, we’re able to communicate much more effectively, because even though he’s shooting in remote places, he has his iPad or his phone. He can go on Frame while he’s at Denali and mark his selects and send them to me. We could then sit down at breakfast or coffee or dinner [when she was in Alaska for week two], and I would get everything that I needed from him based on a quick chat and on his notes in Frame.”

For Scott, being able to review dailies in Frame.io was not just about feeling confident about what he’d captured, or being able to pivot if he hadn’t yet captured it—it was more about helping his team collaborate faster and more effectively.

Scott’s able to access his dailies so he can start envisioning the edit while he’s still shooting.

“My perception of what happened while I was directing is often very different from what’s in the footage. What Frame.io does is it basically gives me access to the most up-to-date information, so as I’m directing, I’m building the edit in my head with the comments that are going into Frame. The transition to watching dailies in Frame.io and commenting, and all of that communication going out to our editors immediately, was life changing for our collaboration velocity,” Scott says. “Because I think in creativity, we have to remind ourselves that motivation is a spark and momentum is fire.”

Delivering the impossible

As it happened, that momentum helped the team pull off what seemed, at best, improbable, and at worst, impossible.

Kristin recounts the story. “I was doing a radio edit and Scott had just gotten back from Camp Denali and immediately received an urgent email—‘Can you build a select reel for Princess’s meeting?’” (Scheduled for a mere 36 hours after the shoot wrapped.)

An urgent, but polite, request.

The leaders making the request naturally understood the magnitude of the ask and put no undue pressure on the team, which made it such a pleasant surprise for everyone when the team came through.

“We had never delivered anything that quickly in our entire production history,” Scott says. “But it became possible because we had Frame.io. We were able to let Kristin see 14 days of footage very quickly and were able to communicate at speeds that we have never been able to before.”

In the pre-Frame.io days, Scott figures it might have taken a week or more to accommodate that request. We often talk about how much money or time Frame.io saves customers…but can you put a price on feedback like this?

And the delighted reaction of the Princess execs.

Not to mention that using the Adobe creative workflow enables other types of fast turnarounds.

“Sometimes things don’t always go as planned when you’re on a production,” Kristin says. “For example, the drone team was trying to capture this epic shot of a train passing with the Princess ship, but the timing was off between the ship and the train. They had a beautiful ship shot, but no train. And they had a great train shot with no ship. What we were able to do with Frame and Adobe was that I could send Charles, our in-house VFX artist [based in LA], shots from Premiere Pro, and then he was able to take them into After Effects and put them together seamlessly. Instead of having to wait for somebody at an external post house to put them together and send them back, he did it and was able to send it right back to me through Frame.”

There was also paint work required on those shots, and the anchored comments feature in Frame.io made it much easier for Scott to specifically show the VFX artist what needed to be fixed. “Scott was able to go to my edit, tag Charles and say, ‘Hey, can you replace this? I want this in the sky.’ And then Charles was able to understand what Scott was thinking at that moment, while he was in Alaska, and turn those shots around extremely quickly so I could drop them into my edit,” Kristin says.

The ability to send attachments with comments eliminates ambiguity.

The ability to send attachments with comments is something that Kristin immediately found value in. “I love it because sometimes, Scott, especially on this shoot, would upload a still frame or something that’s from a previous Alaska shoot that’s not in this particular footage, and say, ‘Maybe we should consider using this shot in addition to these shots,’ and then I know exactly where to pull it from,” she says.

An all-inclusive platform

In the days prior to Frame.io, Scott’s team had assets spread across multiple hard drives, RAIDs, NAS and cloud systems. Now, all their assets are centralized, along with their collaborators and stakeholders.

“A lot of internal and external folks use Frame.io to access our system,” Scott says. “So whether it’s an editor or a VFX artist or a producer or an outside agency, all of them are heading into Frame to get the information and the resources they need. We also have stakeholders based in the US, Canada, the UK, Australia, and Asia, and we’re constantly sending review links out all the time. The Enterprise features really allow us to collaborate with whoever we need to.”

For Scott and his team, when there is a unified place where everything you need for your project is available—with proper safeguards, of course—everyone involved can focus more fully on the creative process.

The Enterprise features really allow us to collaborate with whoever we need to.

“The entire post-production process for me has drastically improved since implementing Frame.io for multiple reasons. A big one is the way that I communicate with clients, and the way that clients actually communicate with each other,” Kristin says. “If I post an edit for clients and, let’s say there’s 15 stakeholders making comments on one particular edit, they can work it out in the comments. And I appreciate that because I don’t have to go to each one of them individually. I can instead reply to all of them and get them to arrive at a decision.”

For Scott, it’s about unfettered creativity. “One of the things that large organizations struggle with is trying to put creativity in a nine-to-five box. I am an adamant believer that creativity is at its best when it’s flowing. But I also understand that it’s incredibly disrespectful to your collaborators to expect them to be online all the time. Where Frame becomes so special to me is I can communicate at my speed. Prior to Frame, I didn’t have a tool or software or technology that could work at the speed of our team’s imagination,” he says.

Frame.io lets you access and share your assets in one place.

“When we send review links out, our stakeholders and our collaborators can comment on that link 24 hours a day, seven days a week, 365 days a year. That is what builds content velocity. If I’m commenting on something in Mountain Time in Colorado, by the time our producer in Vancouver has gotten up, that’s the first thing he’s going to see. It’s also eliminating several communication channels, because prior to Frame.io, whether it was Teams chats or WhatsApp, I believe that destroyed the creative review process because as soon as you decentralize that process you’re essentially decentralizing your leadership and your creativity, as well.”

It’s easy to infer that Scott doesn’t often stay still. Which is another reason why Frame.io is important to his way of life. “Let’s say Kristin is in an edit and she delivers that Frame.io link to me. It doesn’t matter if I’m at my desk, if I’m at my kid’s jujitsu class, if I’m having a picnic, if I’m on a walk,” he says. “Frame.io is just as good on my phone or my iPad, so it doesn’t matter what platform you’re on. Frame.io is always at your fingertips.”

Wherever Scott goes, Frame.io goes with him.

Creating bespoke workflows

Ask most creatives and they’ll tell you that no two projects are the same. Sure, you always have to shoot, edit, and deliver something, and you’ll need to share work-in-progress for feedback and approvals.

But especially when you’re traveling to far-flung locations, shooting documentary-style interviews, and your B-roll is heavily dependent on weather conditions and access, having a workflow that helps you sort through a massive amount of footage and organize it into something manageable is critical.

“One of the features in Frame.io that I absolutely love is metadata, and I use it in all of my Adobe editing products,” Kristin says. “Frame has 32 standard metadata fields that you can utilize, but you can actually do customized ones, as well. You’re able to label something as a wide shot, a medium shot, or a closeup. Or, like with this particular shoot, they were using the Alexa 35 and some fantastic lenses that gave us some beautiful lens flares, and being able to actually mark which shots they were doing very specific light tricks with was incredible.”

Metadata helps Kristin organize the huge amount of footage so she can easily find what she needs.

Adding customized metadata fields to your media enables another new feature called Collections that lets you use those tags to create smart folders containing assets grouped by whatever criteria you assign. “With a standard file structure, you’d have to go through every single file. But with metadata, you can actually go to specific keywords and you can sort by them. Or you can sort by duration or by frame rate or anything else,” she says.

For Scott, using metadata gives his team a couple of key advantages. The first is the ability to customize statuses within the workflow. “Frame.io previously didn’t really map to the way we think about our post-production process, whereas now Frame.io is a guide and a map that helps you understand who’s delivering what, and when they’re delivering it.”

And although, as Scott says, that increases your content velocity, it’s even more important when you’ve invested in an expensive shoot and want to ensure that you’ve found all the best bits for both the current project and future use.

It’s what Scott refers to as the cost per asset. “You start to look at it more like stocks and portfolios, because you can see that the cost of capturing those assets can depreciate over time. The first time I went into the new version of Frame and I started playing around with the Collections, I got incredibly excited. Because suddenly I was looking at this vast Alaska library that needed to be distributed to lots of different people.”

A creativity solution

As a visionary creator, Scott feels energized by the possibilities that Frame.io’s features unlock. “I think right now is the best time to be a creative person. Being a filmmaker in 2024 is way different than it was 50 years ago or even 20, 10, or 5 years ago,” Scott says. “I believe Adobe is developing the best creative tools, and they’re the ones I want to use. I know they’re going to work, and when we’re in very remote environments, dependability and consistency are key.”

There’s also the fact that their entire workflow is based in Adobe Creative Cloud, something that’s important to Scott. “One of the benefits of having these tools under one big brand is they start to work together a lot better than having a lot of bespoke tools,” he says. “One of the biggest challenges is that we do a lot of manual tasks and work in a lot of different systems. So the more that you can reduce those manual tasks and have most of your team hanging out in the same spot, collaboration velocity is just going to continue to grow.” Notably, the broader Princess team is adding Workfront and AEM to their Adobe ecosystem to further streamline their production workflow.

Whether at her office or in the field, Kristin can rely on Adobe Premiere Pro.

Kristin is likewise an Adobe devotee. “The way I look at editing platforms is that it’s like a palette of tools. I’ve used every editing platform, and I’ve been using Premiere for about 20 years. I do a lot of documentary work and for me it’s the most seamless platform. I actually do a lot of my own mixing in Premiere, even if I have 20 to 30 audio tracks,” she says. “Same thing with After Effects. I love that you can right click on your timeline and you can seamlessly go into After Effects. You can bring in Photoshop documents, and you can choose whether you want to merge them or not. Everything just works so well together.”

Bringing the photography and video workflows together makes it easier for the teams to align creatively.

What’s also exciting for Scott is that the Adobe environment is helping to bring video and photography workflows together. “I’d hang out with my video team in Frame.io, and I’d hang out with the photo team somewhere else, which naturally creates a division between those two teams, whether it’s a physical barrier or a software barrier. It’s like, ‘That’s the photo team, and that’s the video team.’ That is not how I operate as a director. I want those teams to be in the same places,” Scott says.

“Now, the photo workflow is as cool as the video workflow. So we’re starting to bring in some of the photographers and retouchers, and they’re starting to hang out in the same watering hole as our video people. Now when I think about Frame.io, I don’t think about it as a video solution. It’s a creativity solution.”

When I think about Frame.io, I don’t think about it as a video solution. It’s a creativity solution.

Beyond that, Scott appreciates that working in an integrated environment gives him better visibility into what his creative collaborators are doing and a better understanding of their process. “I can now create a space in Frame.io for every lens on the pipeline, whether it be an editor, a stakeholder, a producer, a director, or a DP. I now have a space that’s only for that person, that makes them truly feel seen and heard. What Frame.io buys me is different lenses into the different perspectives of people that need the information to make excellent films.”

The impact of change

It almost goes without saying that Scott is a creative with an insatiable drive to create. Filmmaking is what makes him get up every day with a smile on his face, and the possibilities that Frame.io opens for him are part of what puts it there.

“It’s the ultimate team creative discipline and the best sets are the ones where world-class teammates are really working together,” Scott says. “Frame.io was the first piece of software or technology that worked at the speed of our team’s imagination. If you want to become a power user on day one, you can. If you want to learn at your pace, you can. If this is your first time using Frame.io, it’s fast, it’s intuitive, and it’s built for creatives.”

When the team is working collaboratively they achieve great results.

But then Scott becomes more introspective. “The three things we all have in common is what’s on our gravestone. We have the year we’re born, the year we died, and we have the dash in between. The dash represents the choices we make about how we live our lives,” he says. “In this industry, we’re asked to constantly evolve and change as creatives. Frame.io is a software platform that has constantly changed as I’ve changed. Adobe tools are proving that creativity is going to continue to be democratized, and creativity is not for the few. It’s for everybody.”

Change is a constant, but the images that Scott and his team capture endure as a reminder of the majesty of our world. Whether those images serve to inspire people to fill their lives by visiting these places and creating their own stories, or to give those of us who may never visit them a chance to see them through Scott’s lens, we at Frame.io feel honored to play a supporting role.
