TIPS – ProVideo Coalition
https://www.provideocoalition.com

Lens good, crop factor bad
https://www.provideocoalition.com/lens-good-crop-factor-bad/ (Mon, 09 Sep 2024)

Closeup of the large sensor on a Fujifilm GFX 100 medium format camera
Fujifilm’s GFX 100. Crop factors under one are possible, but nobody seems to use them, for some reason.

 

Let’s imagine what would happen if you walked into this coffee shop* and asked for a latte that was one point six times larger than a mug of mint tea you bought yesterday from a competing establishment. Even if you were a daily regular, the person behind the counter would be forgiven for frowning slightly, not necessarily because the answer is unclear – you’re a regular, remember – but because it’s such an abstruse way to ask for a venti.

Unfortunately, that’s what we’re doing when we use crop factors in discussion of focal lengths.

Crop factors we have known

The whole thing became popular when Kodak, Agfa, Konica and Fujifilm launched the Advanced Photo System in 1996. It wasn’t a terrible idea – an easier-to-handle, cassette-based film system which even supported some metadata – but it was really too late, quickly being eclipsed by then-emergent digital photography, and discontinued by the mid-2000s. It also seriously confused stills people by altering the field of view rendered by a lens of a particular focal length.

Most people are aware that a system of multiplication factors became popular so that people could quickly estimate what lens to use on an APS camera to create the same results they’d have enjoyed on 35mm. This holds into the modern world, where an APS-C digital stills camera, based broadly on the APS layout, renders an image with a field of view equivalent to that rendered by a lens 1.6 times longer on 35mm. A 50mm lens on an APS-C camera gives us the same view as an 80mm lens would on a 35mm camera – 1.6 times longer.
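If you do want to check the arithmetic rather than memorise multipliers, the whole thing is just a ratio of sensor diagonals: the crop factor is the full-frame diagonal divided by the smaller sensor's diagonal, and the "equivalent" focal length is the real focal length multiplied by that factor. A minimal sketch in Python, using nominal Canon-style APS-C dimensions (an assumption; real sensors differ by fractions of a millimetre):

```python
import math

def crop_factor(width_mm, height_mm, ref=(36.0, 24.0)):
    """Ratio of the full-frame (36x24mm) diagonal to this sensor's diagonal."""
    return math.hypot(*ref) / math.hypot(width_mm, height_mm)

def full_frame_equivalent(focal_mm, width_mm, height_mm):
    """Focal length giving the same field of view on a full-frame camera."""
    return focal_mm * crop_factor(width_mm, height_mm)

# Nominal Canon-style APS-C dimensions, roughly 22.3 x 14.9 mm.
print(round(crop_factor(22.3, 14.9), 2))                 # ~1.61
print(round(full_frame_equivalent(50, 22.3, 14.9), 1))   # ~80.7, the familiar "80mm equivalent"
```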

The mathematics works, and not only for stills. APS-C is very roughly the same sort of size as a Super-35mm film frame (APS-C is 16.7mm tall versus 18.6mm for Super-35mm, though real-world cameras which claim those sizes vary somewhat), and thus the multiplication is very roughly similar. The problem is that the multiplication factor we’re using to discuss focal lengths is based on a still photography format that has little or nothing to do with most motion picture equipment, and isn’t nearly as familiar to most people as the behaviour of Super-35mm equipment in the first place.

Lens mount on the back of a Canon 24-105mm F4-7.1 IS STM lens.
An image of a fixed size comes out of this aperture. Bigger sensors see more of it. That’s all.

Cropped close

This doesn’t happen a lot on set. Anyone asking for the eighty-divided-by-one-point-six would rate a very confused look from the second assistant. Instead, this sort of terminology is a malady of internet forums and social media, and it leaks into filmmaking by way of recent graduates with more enthusiasm than experience. It’s not even very smart in the stills world, given that quality stills cameras are available in at least four broad categories of sensor size, and many modern photographers grow up on inexpensive APS-C cameras. In situations where experience on full-frame cameras in either the stills or moving-picture world (Alexa LF and the like notwithstanding) may be rare, using full-frame as a field of view reference is practically a cruelty to the beginner.

The final irony of this is that both the motion picture and stills worlds have described 50mm lenses as “normal,” despite the huge difference in framing between full-frame stills and typical motion picture cameras using such a lens. The idea here certainly isn’t that the 50mm lens matches human vision, which has a massively wide overall angle of view and a real world focal length in the single digit millimetres. Rather, on a monitor or cinema screen, the 50 doesn’t show any of the perspective compression of long focal lengths or the yawning perspective of very short focal lengths.

But if the same 50mm is called a normal lens on both APS-C and full frame, we’re being told it’s normal on two vastly different sensors at once. That’s balderdash, and it goes some way to suggest just how much imprecision there is in this sort of focal-length philosophising.

Canon EOS R camera with lens removed, revealing the sensor
Crop factor one.

In the end, the solution is for people – generally new entrants, to whom we probably all owe some sympathy at this slow time – to actually learn the equipment, so they can sensibly discuss focal length in the knowledge of what the camera is and how it will behave, without resorting to middle-school mathematics. Otherwise, what we’re doing is deciding what lens we’d use on a full-frame camera (as if we’d used one recently), remembering the crop factor as it applies to the camera in use, doing the multiplication, and picking the nearest number out of the lens case. That’s insane.

So, let’s ensure there’s at least a virtual swear box everywhere there can possibly be one, and consign the phrase “crop factor” to the pit of ignominy in which it belongs.

* Your correspondent outlined this article in Jimmy’s Coffee on McCaul Street in Toronto after having spent nearly forty-five minutes trying to find a coffee shop that wasn’t a Tim Hortons. No offence to Tim, but Jimmy’s oatmeal chocolate chip cookies were well worth the investment of shoe leather.

Bayer best?
https://www.provideocoalition.com/bayer-best/ (Mon, 26 Aug 2024)

Sensor of a Blackmagic Ursa Mini camera, showing spectral colour effects from its tiny features
This is a Blackmagic Ursa Mini’s sensor. The colour is not from the filter array, it’s from interference between the wavelength of the light and the fine pitch of features on the chip

Building colour images from red, green and blue is probably one of the most fundamental concepts of film and TV technology. Most people move quickly on to the slightly more awkward question of why there are three components, and are often told that it’s because we have eyes with red, green and blue-sensitive structures, and we’ve often built cameras which duplicate that approach.

The reason we’re discussing this is, in part, because of an interesting design from Image Algorithmics, which has proposed a sensor which it describes as “engineered like the retina.” That could refer to a lot of things, but here it refers to IA’s choice of filters it calls “long,” “medium” and “short,” diverging from Bryce Bayer’s RGB filters. The design is interesting, because that’s the terminology often used to describe how human eyes really work in practice. There isn’t really red, green and blue; there’s yellowish, greenish and blueish, and none of them are deep colours.

Human colour

A quick glance at the data makes it clear just how unsaturated those sensitivities really are in human eyes. It’d be easy to assume that humans might struggle to see saturated colours in general, and red in particular: the sensitivity curves are so wide that red light might just look like a pale green and an equally powdery, faded yellow, and the yellow and green curves overlap enormously. In practice, the human visual system detects red by (in effect) subtracting the green from the yellow, a biological implementation of the matrix operations we see in some electronic cameras.
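A toy example makes that opponent idea concrete. The numbers below are purely illustrative (they are not measured cone fundamentals), but they show how a "red" signal can be recovered by differencing two broad, heavily overlapping channels:

```python
# Illustrative figures only, not real cone fundamentals: relative responses of
# broad long-, medium- and short-wavelength channels to three test lights.
responses = {
    #              long  medium short
    "red":    (0.60, 0.25, 0.01),
    "green":  (0.55, 0.50, 0.05),
    "blue":   (0.10, 0.12, 0.70),
}

for light, (long_resp, med_resp, short_resp) in responses.items():
    redness = long_resp - med_resp   # opponent signal: "yellowish" minus "greenish"
    print(f"{light:>5}: long - medium = {redness:+.2f}")
```

Even though both broad channels respond strongly to the red light, the difference cleanly separates it from the green and blue stimuli, which is the same trick a camera's colour matrix performs electronically.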

When Bayer was doing his work in the 1970s, it might have been possible to build a sensor with long, medium and short-wavelength sensitive filters that match the human eye. What might have been trickier is the resulting need for compact, power-frugal electronics capable of turning the output of such a sensor into a usable image. So, Bayer took the direct route, with red, green and blue filters which nicely complemented the red, green and blue output of display devices. Modern Bayer cameras use complex processing, but early examples were often fairly straightforward and mostly worked reasonably well.

With modern processing it works even better, so the question might be what Image Algorithmics expects to gain from the new filter array. The simple answer is that less saturated filters pass more light, potentially improving noise, sensitivity, dynamic range, or some combination thereof. Image Algorithmics proposes a sensor with 50% yellow, 37.5% green and 12.5% blue subpixels, which approximates the scattering of long, medium and short-sensitive cone cells across the human retina.

Image compares the random scatter of colour-sensitive cells in the human retina to the diagonally ordered pattern in Image Algorithmics' filter array design
Image Algorithmics’ colour filter array, compared with a conventional Bayer filter and the configuration of the eye. Drawing courtesy of the company
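Those proportions divide neatly into an eight-pixel repeating unit of four yellow, three green and one blue photosite. The tile below is hypothetical (IA's drawing shows a diagonal arrangement and no exact mosaic is published here), but it shows how the percentages work out across a sensor:

```python
import numpy as np

# Hypothetical 2x4 repeating tile: Y = yellow, G = green, B = blue.
# Four Y, three G and one B per eight pixels = 50% / 37.5% / 12.5%.
TILE = np.array([
    ["Y", "G", "Y", "G"],
    ["G", "Y", "B", "Y"],
])

cfa = np.tile(TILE, (1080 // 2, 1920 // 4))             # a 1920x1080 mosaic
fractions = {c: float((cfa == c).mean()) for c in "YGB"}
print(fractions)   # {'Y': 0.5, 'G': 0.375, 'B': 0.125}
```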

Existing ideas

This is not entirely new; Sony used an emerald-sensitive pixel (which sort of looks cyan in most schematics) on the Cyber-shot DSC-F828 as early as 2003, while Kodak used cyan, magenta and yellow filters in the Nikon F5-based DCS 620x and 720x around the turn of the millennium. Huawei has made cameras in which the green element of a Bayer matrix is replaced with yellow. The Blackmagic Ursa Mini 12K uses a sensor with red, green, blue and unfiltered photosites, presumably yielding gains which are very relevant to such a densely-packed sensor.

Other approaches have also been explored. Kodak’s cyan, magenta and yellow sensor, using secondary colours, allows fully double the light through the filter layers, though the mathematical processing required often means turning up the saturation quite a bit, which can introduce noise of its own. The differing sensitivity of the sensor to cyan, magenta and yellow light can also offset some of the improvement. IA itself voices caution about Huawei’s red-blue-yellow design, which encounters some odd mathematical issues (which are a bit outside the scope of this article) around using red filters to approximate the human response to red light.
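A quick, and admittedly idealised, Monte Carlo shows why the extra light through secondary-colour filters doesn't translate directly into a cleaner image once the colours are reconstructed. The sketch below assumes perfect complementary filters, equal exposure and pure shot noise, none of which is true of a real sensor:

```python
import numpy as np

rng = np.random.default_rng(0)
r = g = b = 1000          # mean photons per primary for a neutral grey patch
n = 200_000               # number of simulated pixels

# Direct RGB capture: a red photosite sees only the red light, with shot noise.
red_rgb = rng.poisson(r, n)

# CMY capture: each photosite sees two primaries, so roughly double the light...
cyan    = rng.poisson(g + b, n)
magenta = rng.poisson(r + b, n)
yellow  = rng.poisson(r + g, n)
# ...but red must be reconstructed by differencing, which adds the noise back up.
red_cmy = (magenta + yellow - cyan) / 2.0

for name, channel in (("RGB", red_rgb), ("CMY", red_cmy)):
    print(f"{name} red SNR ~ {channel.mean() / channel.std():.1f}")
```

In this toy case the reconstructed red channel from the CMY capture actually comes out slightly noisier than the directly filtered one, because the differencing sums noise from three measurements; that is the sort of penalty the saturation boost and matrixing have to claw back.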

Scarves of various bright colours knotted around a rail, hanging in a row
Dyes are a common source of – er – interesting results in digital imaging, since the very deep colours, which don’t exist in nature, can end up reflecting a strange spectrum of light.

The inevitable compromise

Suffice it to say that, in general, no matter what combination of colours is used, there will be a compromise to strike between brightness noise, colour noise, sensitivity and dynamic range. For complicated reasons, colour noise is easier to fix than brightness noise, and it’s mainly that idea which has led IA to the green-blue-yellow layout it favours here.

The company suggests that the design should achieve a 4.25dB signal-to-noise ratio advantage over RGB “in bright light,” and perhaps a bit more than that in lower light. That may not sound astounding, although the company also promises a similar improvement in dynamic range, for a total improvement of more than a stop. Encouraging as that is, we should be clear that this is an idea, without even a demonstration sensor having been made, and it’s clearly some time from a film set near you.
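For anyone who thinks in stops rather than decibels: assuming the figure is quoted in the usual 20·log10 convention for sensor SNR, one stop (a factor of two) is about 6.02 dB, so 4.25 dB works out to roughly seven tenths of a stop, which is why adding a comparable dynamic range gain gets the company to "more than a stop" in total. A quick conversion:

```python
import math

def db_to_stops(db, convention=20):
    """Convert decibels to photographic stops (factors of two).
    convention=20 for amplitude/SNR figures, 10 for power ratios."""
    return db / (convention * math.log10(2))

print(round(db_to_stops(4.25), 2))   # ~0.71 stops
```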

What really matters is not this particular design; alternative filter arrays have been tried before. Given the overwhelming majority of cameras still use Bayer sensors, we might reasonably conclude that the results of previous experiments have not been overwhelmingly positive. Cinematographers are also a cautious bunch, and anyone proposing anything as (comparatively) outlandish as an LMS sensor might need a strategy to handle the caution as well as the technology itself – but if some sort of alternative to Bayer can be made to work, then it’s hard to object.

Useful Tools for Editors – Total Solar Eclipse Edition
https://www.provideocoalition.com/useful-tools-for-editors-total-solar-eclipse-edition/ (Mon, 01 Apr 2024)

Here in the United States we’re about to see another total solar eclipse, for those lucky enough to live in or near its path across much of the middle of the country. There will be a lot of travel to that part of the country, so the hope is for good weather along the eclipse’s path. I live in the path of totality from the 2017 eclipse, and it really was an amazing experience, watched from the top of a big hill in the city.


So what better reason to have a brand new set of Useful Tools for Editors than the upcoming eclipse? I’m sure you’ll get a total solar eclipse somewhere near you someday.

Date Up!

My favorite useful tools are simple little utilities by independent developers which might simplify a tedious task. Date Up! from Ulti.Media is a very affordable app that “renames the files you drag on it by adding to the name a suffix with the date and time (of creation or of dragging)” in an effort to help the file name challenged (✋) keep track of the media a little better.

Want to add the current date or the creation date to your files? Drag a batch into the proper well and instant change. How about that for useful?

And there are many parameters you can set up for how you would like the files to be renamed.


Yes, you can probably do this with a file renaming app you might already have, and yes, you have to be careful when changing the original file names of some media, but if you know what you’re doing, Date Up! is very, very useful.
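If you're curious what an app like this is doing conceptually, the core idea is easy to sketch. The snippet below isn't Date Up!, just a hypothetical Python equivalent that appends a timestamp to each file name, using modification time as a stand-in for creation time (which is operating-system specific):

```python
from datetime import datetime
from pathlib import Path

def add_date_suffix(path, fmt="%Y-%m-%d_%H-%M-%S", dry_run=True):
    """Rename 'clip.mov' to e.g. 'clip_2024-04-08_13-45-00.mov' using its
    modification time (creation time is OS-specific, so this is a stand-in)."""
    p = Path(path)
    stamp = datetime.fromtimestamp(p.stat().st_mtime).strftime(fmt)
    target = p.with_name(f"{p.stem}_{stamp}{p.suffix}")
    if not dry_run:
        p.rename(target)
    return target

# Hypothetical folder of camera clips; prints the proposed new names only.
for clip in sorted(Path("cards/A001").glob("*.mov")):
    print(clip.name, "->", add_date_suffix(clip).name)
```

Defaulting to a dry run is deliberate, for exactly the reason mentioned above: renaming original camera media can break references, so it pays to preview the result first.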

Ulti.Media Converter 2

And speaking of Ulti.Media, they also have a very useful file conversion utility called Converter 2. This drag/drop tool has been around for a while now and is “the Swiss army knife for transcoding, workflow and multimedia file management.” Converter can really hit a lot of your media conversion needs with support for video, audio and still images.


While the drag-and-drop interface makes it simple to use, the setup of the different operations can be very detailed, with options for how to convert, name and handle any file you throw at it.


There’s even a “Presets Bazar” where you can download and presumably share your own presets. There’s really a lot to dig into with Converter 2, including AI operations like subject extraction and upscaling, so spend some time reading over everything it can do, as well as the tutorials, to learn more. Converter 2 is around $16 and has a free 7-day trial available.

OWC Drive Speed

For years and years we’ve tested the speed of hard drives connected to the desktop. Now, thanks to Other World Computing, we can do the same for SSDs connected to an iPhone with OWC Drive Speed (App Store Link). This free iPhone app might come in handy since we now live in a world where you can plug an SSD into a new iPhone and shoot ProRes directly to it.

Above is the progression of screens you’ll go through as you test a connected SSD for speed. All except the “file browser” in iOS where you select the drive itself.

The app is a bit tricky to use, as you have to use the iOS file navigator to select a connected SSD manually. And you can choose a record test time from one minute all the way up to an hour. That’s some thorough testing! It would be nice if the app could automatically choose a connected SSD, since if you have one connected to your iPhone you most likely want to shoot to that disk. But overall, you can’t really complain too much about a free and useful tool, so thanks, OWC!
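The measurement itself is conceptually simple: write a known amount of data, time it, and divide. The rough desktop-side Python sketch below illustrates the idea (it is not OWC's app, the volume path is hypothetical, and a serious benchmark would also test sustained reads and avoid filesystem caching effects):

```python
import os
import time

def write_speed_mb_s(target_dir, size_mb=512, chunk_mb=8):
    """Very rough sequential write test: returns MB/s for size_mb of data."""
    path = os.path.join(target_dir, "_speedtest.bin")
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())   # make sure the data actually reaches the drive
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed

print(f"{write_speed_mb_s('/Volumes/MySSD'):.0f} MB/s")   # hypothetical mount point
```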

EasyFind

If you’re working on a Mac and your Spotlight search doesn’t seem to be searching a large attached volume, you might need another tool for a deep search. This free utility called EasyFind has been my go-to for a while now. You choose the volume you want to search, enter your search terms and let it go to work.

Here I searched a volume for all my files related to the recent Video Creators Virtual Summit.

EasyFind is lean, fast and very useful, all the things you want in a Useful Tools for Editors entry.

Sidus TC Sync public beta

I usually don’t post beta apps to my Useful Tools column, but the public beta of Deity Microphones’ new Sidus TC Sync tool might be of particular interest to those shooting multiple cameras with timecode, be it proper SMPTE timecode or audio timecode recorded to a camera’s audio channel. As someone who cuts a lot of multicam, I love tools that can help with sync, as the internal NLE syncing tools can’t always handle what you throw at them. (Did you see my audio waveform sync shootout?)

I threw my three-camera multicam test, with secondary sound and good timecode, at the beta, and this was the result without me doing any manual setup:

Okay, yes, there are conflicting timecodes since this is a three-camera shoot, so I’m not exactly sure what this warning means.
But here’s the result, and that’s not bad for me not doing anything except dragging a single folder into an app and clicking one button.

Try as I might, I could not get Sidus TC Sync to place all the audio in the image above on the same track, which it should be. By manually assigning the sync groups I was able to get the C-camera onto one track. A perfect sync app would be able to see how the files are set up in their own folders in the OS and be smart enough to assign camera angles/groups from that. Once you’ve got your sync, just export a list to your NLE.


If you think this app looks a lot like Tentacle Sync Studio then you’re correct. Almost a copy. Sidus TC Sync is available for free in this public beta for both Mac and Windows.
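For the curious, the heart of timecode-based syncing is just arithmetic: convert each clip's start timecode to a frame count and offset everything against the earliest start. The sketch below is a bare-bones illustration, assuming non-drop-frame timecode and clips that genuinely share a frame rate and a jam-synced clock (the things a real sync app has to verify for you):

```python
def tc_to_frames(tc, fps=24):
    """'HH:MM:SS:FF' -> absolute frame count (non-drop-frame)."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offsets(clips, fps=24):
    """Offset of each clip (in frames) from the earliest start timecode."""
    starts = {name: tc_to_frames(tc, fps) for name, tc in clips.items()}
    earliest = min(starts.values())
    return {name: start - earliest for name, start in starts.items()}

# Hypothetical three-camera shoot plus a sound recorder
clips = {
    "A001": "10:02:15:00",
    "B001": "10:02:17:12",
    "C001": "10:02:14:08",
    "SND":  "10:02:14:08",
}
print(sync_offsets(clips, fps=24))   # {'A001': 16, 'B001': 76, 'C001': 0, 'SND': 0}
```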

QCTools

This one is presented without comment. I saw a thread somewhere recently asking about “free QC (quality control) tools for broadcast,” and while many/most of them can be quite expensive, a link was provided to the free QCTools. It is open-source and comes from the folks behind MediaInfo and is described as “a software tool that helps users analyze and understand their digitized video files through use of audiovisual analytics and filtering.” The website lists a whole litany of features and filters that are too numerous to mention here, so click over and give it a read if something like this is of interest to you. The graph it gives you after analyzing a file sure does look impressive.

That looks impressive if you know what you’re looking at.

MediaArea has quite a few video-focused open-source projects available. Their website says they worked on QCTools “in the past” so I’m not sure how recently it has been updated.

Gal Toolkit for Premiere Pro

My last Useful Tools for Editors was dedicated to Premiere Pro, and in it I talked about the new toolkit that Premiere Gal has created. Here we are, not even a year later, and that toolkit is already at version 3. There’s currently a one-year anniversary sale that runs until April 5th, so you can get the toolkit for $87. There’s also a version for After Effects.


Another reason I wanted to highlight this updated package is that I think it’s a great alternative for editors doing a lot of work in the social media space who don’t have the time or the willpower to constantly dig through the endless time suck that template sites like Envato Elements or Motion Array can be. While those kinds of sites are amazing resources, you can spend hours upon hours trying to find the perfect asset, and then even more hours second-guessing yourself and looking some more. Packages like the Premiere Gal Toolkit can provide a great selection of overlays, lower thirds, graphics and many other options without the insanity of literally thousands upon thousands of what look like variations of the same thing.

Adobe Acrobat tools

I was sent a bunch of multipage PDFs the other day to use in an edit. While looking for fast ways to convert every page of every PDF to JPEGs, I discovered that Adobe Acrobat has a ton of tools for many different PDF-related purposes. What made this even better is that these Acrobat tools were already included in my Adobe subscription.


I have a feeling there are other things that are part of my Adobe subscription that I’m unaware of, so this was a nice discovery. Head to the Adobe Acrobat section of Adobe’s website and you can browse all the Acrobat tools, including combining multiple PDFs, splitting them, deleting pages, and AI-assisted PDF tools as well.


What to do with all these photos and videos — Part 1
https://www.provideocoalition.com/what-to-do-with-all-these-photos-and-videos-part-1/ (Mon, 30 Jan 2023)

Introduction

The ubiquity of smartphones means we are all documenting our time on Earth more thoroughly than ever before. If you’re in your 40s or older, you’ll probably remember when each photo cost actual money, and you had about 24 or 36 shots on a roll. Photos were rare, taken on special occasions, and presented in photo albums or on slide projectors. And that’s just not what happens any more.

On the Eiffel Tower, camcorder in hand, dodgy facial hair, shot on film — yes, this was a long time ago

With the move to digital, we’re all taking far more photos and videos than ever before, and there are huge consequences both professionally and personally. As videographers, photographers and editors, we’re aware of how to manage, back up and archive our clients’ work, but that’s not the whole story. We all document our own lives as well, and that’s what I want to focus on here. How do we handle the absolutely monstrous cache of content we’re probably creating of and for our families in the long term? 

Personal content vs professional content

Before digging in, I’d like to draw a line between content we make for our clients and that which we make for ourselves. Professional content, by its nature, likely has a limited lifespan. As processes and products change, content becomes outdated, and a video made a few years prior has limited use. Of course, companies can and should keep a historical archive of their photo and video material, but it’s not likely to be your job. You may choose to archive a few months or years of client work to keep the client happy, but in general, clients want the finished product, not the source footage.

Venice was busy 20 years ago too

Personal content, capturing our families as they grow, change, and pass, is a different beast entirely. It’s an ever-growing cache of irreplaceable content which most of us are loath to delete, because we can’t go back in time and recapture those moments. Some of it might end up being useful in a professional context, too; I’ve shot vacation video in Venice, Italy that ended up in both my book on Final Cut Pro and in questions on the official Apple Certified Final Cut Pro Exams. But the most precious video I have is of my father before he passed away, many years ago now. We forget how rare video once was, and this is the only record I have of his voice, smile and presence. It’s worth more than gold.

Finding the balance

We have to remember that even though we like shooting video and/or stills (and we’re probably pretty good at it), there’s a limit to how much video we can reasonably shoot on a family holiday. If you’re in an amazing place to hang out with your family, the focus should probably be on your family, not on setting up a fantastic timelapse or waiting for the perfect sunset snap. Take a few nice shots, but grab memories first. Snapshots, not masterpieces.

None of these shots are spectacular, but they’re a fantastic record of that one time my wife and I took a private speedboat and landed on a beach at a tiny resort and OMG THAT WAS AMAZING

Of course, you’re not going to be the only one taking photos. Everyone has a phone, and most people will reach reflexively for theirs to capture snapshots. As most phones shoot at only 12MP, the results will look good on a phone-sized screen, or at a smaller size on a bigger display, but they’re not going to look amazing filling a large screen or as a large printout. That’s not to say they’re not valuable — everyone’s perspective has value, and not every selfie is to be despised — but if you want to capture some part of some amazing place, don’t be afraid to spend a little time with better equipment to capture it well.

No, it was not worth lugging a heavy camera on a mountain walk for a few pretty shots nobody’s ever seen

But remember balance. The capture process usually means you end up looking at screens rather than the place, food or thing you’re actually there to experience, and I think most of us are taking too many photos we barely look back at. We forget to look at the mountain once we’ve taken a picture of the mountain, and then we barely look at the picture. That goes double for video, which takes more effort to shoot and to share — I can say for certain that it wasn’t worth lugging an original Blackmagic Cinema Camera up and down mountains in Japan.

So how do we do it right?

Capture the moment without ruining the moment

Some of the best things you’ll see on holiday can’t be captured in a standard photo, so mix it up. Sure, grab a few snaps with your phone. The latest iPhone 14 Pro lets you capture 48MP shots, which does help to bridge the gap between phones and “real” cameras, and it’s always going to be the quick, convenient option. Shoot in ProRaw if you’re happy to spend 100MB per shot, or use a third party app like Halide to capture compressed HEIC shots at a much smaller size.

10-20MB of space for a 48MP shot means I can shoot at will

Remember that only the main lens is high-res, so sacrifice resolution and mix in snapshots on the 0.5x and 3x lenses. Switch to the regular Camera app for some panoramas, and to capture Live Photos (if you like the little videos it makes). Grab regular videos too — no need for them to be long, or perfect. Videos tend to be a better trigger for memories, so be sure to grab a few even if they’re not perfect and you like the stills better.

Holidays in spectacular places are also a great place to bring a 360° camera, to capture both stills and videos. Of the two, the stills will look better and will be easier to share, so don’t go crazy with the 360° video unless you plan to reframe it later. The obvious joy of a 360° image is that it captures the entire world around you, but the less obvious joy is that you don’t get lost in the screen while you’re using it. Turn it on, push the button, extend the selfie stick a bit, wait for the 3 second timer to trigger, smile while you look where you are, you’re done. No looking at a screen, no time spent framing.

This 360° equirectangular shot shows the photographer hiding behind a tree

So should you take a real camera? Sure, if it’s worth spending a little more time capturing a better image or a better video. If you bring multiple lenses you’ll have wider and telephoto options, but always stay conscious of how much your gear impacts your time away. There’s no travel tripod light enough that I want to carry on a three hour hike, but find your own balance. From bitter experience, please make sure that your camera body and at least one of your lenses is weather sealed. Cheap, light travel zooms are tempting, but if they’re allergic to drizzle, you could end up carrying them more than using them.

So you’ve captured the world around you and your family within it. What next?

Importing, then culling

One of the most important jobs a photographer or video editor has is to present the best of the source material, no matter how many or how few shots you capture. A client who insists on receiving every photograph taken (and they do exist!) isn’t getting a full service, because editing is an important part of the process. Some clients need the best 1%, some need more, but most creators shoot far more than they ever deliver.

You’ve got plenty of choices when it comes to digital asset managers, though they tend to be far more focused on photos than videos. If you like Apple’s Live Photos feature, you’re more or less stuck with Apple’s own Photos app — more capable than it looks, but fairly limited compared to some others. Lightroom Classic  is excellent, but subscription-based, which could be a problem in the long term.

There are many other potential choices in this space, including Luminar Neo, RAW Power, ACDSee  and others, and it’s worth exploring your options. Some apps like to import photos and manage them, others prefer to reference files sitting loose in folders, and some apps let you choose.

RAW Power is a little-known but quite powerful photo management app — worth a look

Whichever way you import, remember to back up — it’s easiest to simply not erase your memory cards until you’ve been able to duplicate your image library once you’re home. If you don’t want to take a laptop on vacation, remember that an iPad Pro can connect to a USB-C hub, allowing you to copy photos direct to an external SSD while you’re away.

After importing, culling the inputs makes the outputs far more palatable to anyone you plan to share your photos and videos with, and it’s critical. You’ll need to import, then tag the good, hide the bad, and delete the awful. Be brutal to make the process successful, and only pick photos that stand alone, not those that remind you of a private story.

Remember, culling means you’re presenting your best work, and making sure you don’t bore people who can only spare 2, 5 or 10 minutes. If, years from now, a family member says “remember that day when we…” you’ve got the originals and you can reminisce together as much as you like.

Right, right, right, period, right, period, right, right, right…

My usual culling method is to cycle through photos with the arrow keys, marking the best photos as Favorites in Photos (by pressing the period key) or four or five stars (with the number keys) in Lightroom, working quickly. With the best ones picked, make an album or smart album containing just those great photos — there’s a Favorites collection in Photos, and you can search for 4+ star ratings in Lightroom.

If there are any total duds (out of focus, shaky, duplicates) then feel free to delete them, but don’t spend too long on this. Trust your gut, pick the good stuff. If you want to take this to the next level, take that first collection of “good” photos and cull them further to a smaller collection of the “very best” photos.

What about videos? Many photo management apps do allow you to store and show videos as well as photos, but the options are pretty limited. If you’re like me, you’ll probably prefer to use a dedicated, more capable app both to pick the best parts of clips, and later to edit them together. Final Cut Pro does a great job of letting you mark the best parts of any clip — I to mark In, O to mark Out, F to Favorite. When you’re done, view Favorites at the top of the Browser. Premiere Pro lets you mark just one range on a clip, but you can create subclips or a stringout timeline to record the best parts.

Conclusion

In photos and videos, then: shoot as you wish, but picking the best bits is the most important part. You can color correct them if they need it, or you really want to, but friends probably won’t care — it’s not a professional shoot. Feel free to fix any glaring issues, but you’re dealing in bulk, and only the very best hero shots deserve a full color pass.

Next? You’ll want to be able to share some of your photos and videos with friends, and also be able to look back on everything yourself in years to come. All of that is coming in Part 2, where we’ll look at how to present your freshly culled images to people who care, and how to find a specific image years in the future. Happy culling, and see you back here soon. 

Revisit 2022’s top articles, reviews, tips, opinions and more on ProVideo Coalition
https://www.provideocoalition.com/provideo-coalition-top-content-2022/ (Fri, 30 Dec 2022)


2022 was a busy year on ProVideo Coalition. We published hundreds of pieces of content that covered the range of production, cameras, post-production and supporting gear. This includes news, reviews, podcasts, tricks, tips and opinion pieces.

Please enjoy the year that was on ProVideo Coalition with a selection of content from the past year, broken down by category and listed from oldest to newest within each category. Starting with what is consistently one of our most popular categories:

Reviews

Nowhere on the internet will you find a more comprehensive and in-depth set of reviews of gear and services in the media creation space. See all of our reviews here.

The New SIGMA 18-50mm F2.8 DC DN | Contemporary Review

Small, lightweight, and swapped focus and zoom ring placement did not keep me from thoroughly enjoying the new Sigma 18-50 F2.8 DC DN | Contemporary zoom lens. The Sigma 18-50mm has a lot going for it. Budget-minded Mirrorless APS-C format shooters might find an all-around lens gem with the $549 Sigma 18-50mm. NOW, this is where I start showing example photos and footage, but for the life of me, I cannot find the photos I took with the Sigma 18-50mm anywhere in my catalog. For this, I am sorry. Am I the only one whose brain has turned to mush during this…read more

 

What is a “Cinema Camera”? A Canon C70 Review and Discussion

I’m not going to lie to you: I’ve been sitting on this review pretty much all year. If I had to distill the “why” down to a simple answer, it’s that the C70 is great. It’s just a great camera. You want one? Get one. It’s worth the money, and you won’t be disappointed. Even in 5 years I don’t see this being a camera left by the wayside. I certainly want one, even as a C500mkII owner. I also know plenty of people who traded their Komodos in for a C70, if that means anything, plus the C70 is actually available *cough* and is currently LensRentals.com’s most-rented new camera, apparently winning by a mile. Chris Ray and I were chatting about it at the recent Filmtools Cineshred event, and he was saying it’s become his workhorse. He talked about it recently with Filmtools in regards to shooting his new documentary. In any case, we both agree that the C70 is a top-tier camera worthy of any filmmakers…read more

 

A five-word review of the DJI Mini 3 Pro


Part review, part tip-sheet, this five-word review (Tiny, Image, Reliable, Extras, Price) is going to take you through the best drone shots that you need to capture in a live shoot — but it’s also a review of the DJI Mini 3 Pro, which came out just a few months ago. Is it good enough for professional needs? And aren’t there more than five types of drone shots you can capture? All will be…read more

 

Review: HP DreamColor 4K Z27xs G3 “junior” monitor for video grading & editing

Here is my review of the DreamColor Z27xs G3 “junior” (affordable) (MSRP US$674) based upon the review unit I received from HP. The term G3 means it’s the third generation of the affordable series. HP calls it: “a 4K color calibrated display ideal for photo and video work”. I clarify that this model is 4K UHD, the television version of 4K, with a 16:9 aspect ratio. In a nutshell, it’s much better than the Z24 G2 I reviewed in 2017, despite the removal of one particular feature which I appreciated (details later). Ahead I will tell you all the great things that I discovered, together with the single gotcha that is still the norm at this price point, from any manufacturer. I also clarify its proper integration for proper framerate/cadence and audio/video lip sync when monitoring, especially considering the removed…read more

 

Review: Mac Studio and the M1 Ultra for video editors – part 1

For diehard Mac users, the Mac Studio might be the ultimate machine for a lot of what we do in editing and post-production. It’s small, stylish, quiet, loaded with connectivity and very powerful. You can pull the Mac Studio out of the over-engineered box, plug it in, plug things into it and then go to work crunching through post-production tasks like never before. That is, of course, if your software properly supports the M1 Max or new M1 Ultra chips inside the Mac Studio. That’s when you’ll be crunching through tasks like never…read more

 

Initial Review: Camo Pro, the software CCU for your phone camera

If you have been involved in the engineering end of traditional TV studios and trucks, you already know what a CCU is. In case you haven’t, a CCU is a Camera Control Unit. Whether it’s connected to a camera via a multicore cable, triax or USB, a CCU provides instant remote control of various camera parameters much more conveniently than on the camera head itself. These parameters often include aperture (iris), master black level, master gain level (ISO), gain trim for RGB (red, green, blue), gain trim for RGB black level, shutter speed and more. Most CCUs are physical devices. Instead, Camo Pro is a powerful CCU app for Android and iOS with a matching Camo Studio software for macOS and Windows. Although Camo’s creator hasn’t yet called it a CCU, after using CCUs myself in TV studios since the 1980s, I can tell you: Camo Pro is indeed a CCU, albeit a unique one. Here is my…read more

 

Reviewing The New Canon XF605: A 4K Single-Lens Camcorder

Aimed at solo shooters and reality documentary-style shooting situations, the Canon XF605 is a feature-rich camera ready to be on the go with any production armed with a passport to far-flung sets. The Canon XF605 fits right in between Canon’s XF405 and XF705. Highlighted features found in the Canon XF605 include a built-in 15x optically-stabilized zoom, a 1-inch sensor with Canon’s wonderful Dual Pixel CMOS AF, and unlimited 4K video recording up to 60p. Add in a semi-truck load of in/out connectivity, and you have a very flexible 4K camera…read more

 

Tips and Tricks and Education

Tips and tricks are always a popular subject, ranging from the quick and easy to the much more comprehensive.

Using NDI Tools with Premiere Pro for Zoom review meetings

NDI Tools

It has been estimated that virtual production and remote post-production have advanced 5-10 years more than expected due to the pandemic. Many of us around the world continue to work remotely, and for some it has become a new way of life. Some people have left the major editing cities like LA or NY or London for a quieter life and taken their clients with them. So there’s been a lot of discussion online about presenting your edits to…read more

 

Let’s Edit with Media Composer – Lesson 13 – Bin View Modes

Bins, as we know, are the centerpiece of your workflow(s). They are what will make for a smooth edit, or a potentially rocky one. In this Let’s Edit lesson, we’re talking Bin View Modes. Now, you might be thinking “I have no idea what that even means!”. Well, Bin View Modes are simply the Text, Frame and Script views that we have been using since the inception of Media Composer. However, there have been some updates to them that you might not know about, and they will definitely help keep you organized, no matter which one is your…read more

 

How to measure audio latency? 

In many workflow and review articles, I have discussed the benefits of “latency-free” monitoring when it comes to audio latency, so that the human operator/host can perceive no noticeable delay when monitoring her/himself for QC (quality control) and/or to hear other participants too. Most of the audio interfaces and some of the USB mics I have reviewed have a headphone jack for monitoring. Often these are called “latency-free monitoring” or “direct monitoring” by the manufacturer. This is possible thanks to a local circuit in the device which typically provides the signal without having to depend on software-based monitoring on the host computer or device, which usually has annoying latency (delay). Some of the early USB mics I reviewed had no headphone output at all. Later, there was one which had an output, but it was not direct and only offered what came from the host computer or device. Recently, I had a situation where an output was called “direct” but had too much latency for comfort—and there was no host computer: it was in standalone mode. For the first time, I felt the need to measure and document how much latency there was in milliseconds. Ahead I’ll share my methodology to test and measure the delay, as well as the results of other researchers’ tests about how much latency is tolerable by humans…read more

 

Practical AI: Spleeter – Music decomposition aka Unbaking a cake

This article about Spleeter is the first in a series here at PVC that’s going to focus on Artificial Intelligence / Machine Learning tools. There won’t be that high-level discussion of how AI works (or doesn’t). No discussion of corpus training. No mentions of Tensorflow or ML on a chip…read more

 

Creating futuristic UIs, HUDs, and languages for Netflix’s “The Mitchells vs. The Machines” with Adobe Creative Cloud


From Iron Man’s futuristic HUD to holographic maps in a galaxy far, far away, Jayse Hansen is the designer behind some of the most recognizable displays and interfaces in films today. Created with Adobe Creative Cloud apps, his iconic interfaces combine realism with futuristic flair. Hansen is best known for his work on live-action blockbusters, but in recent years has also started to find a niche in animated…read more

 

Opinion

Opinion pieces can generate a lot of discussion, especially when the word hate is in the title.

How to answer when someone asks you to move a project from Avid to Premiere Pro (or vice versa)

One question that I saw pop up on a lot of Internet forums and Facebook groups last year (and some already this year) had to do with moving “projects” from one non-linear editing system to another. Usually it was some version of moving from Adobe Premiere Pro to Avid Media Composer or vice versa, but sometimes Final Cut Pro was involved in the conversation. To a lesser extent, DaVinci Resolve. And believe it or not, once I saw VEGAS Pro in the conversation…read more

 

Why ProRes?

I remember seeing QuickTime 1.0 when I was a teenage nerd with a new Mac LC. That first version was simple enough — tiny postage-stamp videos of the birth of a child and the launch of an Apollo rocket could be copied and pasted to create something new. In the era of VHS-to-VHS dubs, this was revelatory. Fast forward a few years to DV, the iMac DV, then HDV, and the last twenty or so years have seen an explosion of codecs, cameras, formats and ever-increasing resolutions. So where does ProRes fit into the story, and what’s its place in today’s video world? With the help of Steve Bayes, Product Manager of ProRes at its launch and for a decade afterwards, let’s…read more

 

Low-resolution sensors are best!

Closeup of the sensor in a Blackmagic Ursa Mini 4.6K camera. The sensor surface appears as a rainbow-coloured rectangle.
Let’s talk about some fundamentals, because, well, sometimes people don’t so much grab the wrong end of the stick as superglue the stick to their hands and march around proudly displaying it as if they’re being followed by a military band.

The idea that higher-resolution sensors have poorer noise, sensitivity and dynamic range performance is embedded in the brains of film and TV camera specialists like a carelessly-thrown tomahawk. Like all good logic bombs, it’s true in part, but there’s a deeper story about how real cameras actually work, and how much information about the scene we can wring out of the photons we…read more

 

Is the Apple Mac Studio really a PC killer?

If you watched the Apple launch event for the Mac Studio or saw some of the presentation slides, you’d certainly be forgiven for thinking that yes, the tiny little silver box is a PC killer, and quieter, smaller and more energy efficient to boot. I thought I’d take a bit of time to look into this claim, focusing mainly on Adobe Premiere Pro as that is what I use most in the day to day…read more

 

VideoPress vs Vimeo Pro/Vimeo Plus: a practical comparison for embedded video

Many video producers I know use either Vimeo Pro or Vimeo Plus as the source for the videos they embed on their own websites or on their clients’ websites. So far, nobody I know uses VideoPress —other than myself, for my own testing for this review— even though it is lower in price and offers similar (but not identical) features. Unlike YouTube, fortunately VideoPress, Vimeo Pro and Vimeo Plus offer an advertising-free place to upload videos and later embed them into your websites (or your clients’ websites). For those unfamiliar with VideoPress, it is a paid service from Automattic (with an intentional double t) which is also the creator and owner of both WordPress.com and WordPress.org. The latter is the source of the core open-source code for many self-hosted WordPress websites, which include ProVideoCoalition.com (where you’re reading this article) and most of the sites hosted at TecnoTur.us and CombinedHosting.com (which are my services). This review article compares many aspects of the two services: VideoPress versus Vimeo Pro/Vimeo Plus…read more

 

Common Mistakes in Online Video


While I’m not yet old, today I am grumpy, and YouTube’s starting to annoy me: there are so many common mistakes in online video. Facebook’s algorithm pushes me a stream of unwanted garbage after every video I actually choose to see; TikTok is a colossal waste of everyone’s time, and even Meta knows that Instagram is bad for your mental health. While YouTube has its dark pits and poisoned rabbit holes, it’s large enough to offer thousands of solid videos worth your time, and even pay creators for…read more

 

Filmmakers don’t want film, they want what people think film is

Photo of a rocky, alpine shoreline taken at the beginning of a roll of film, where half the frame has been erased by exposure to light.
If you put the film the other way around, there’s even less grain!
So we’re twenty years into the digital acquisition revolution, and somehow, despite being expensive, noisy, unstable, flickery, dim and covered in black dots representing the projectionist’s recently-shed dandruff, film is still the holy grail. That probably explains the gold-plated existence of things like VideoVillage’s FilmBox, a piece of film emulation software in which the company has enough confidence to charge up to $5,000 for a licence aimed at high-end users. Published images look very good, with faithful simulation of blooming, variable grain density with respect to image luminance, gate weave, and scanned particles of…read more

 

An open letter to Tim Cook about Final Cut Pro, signed by editors and post-production pros around the world

There is no doubt that Final Cut Pro has come a long way since its introduction many years ago. We’ve seen an architecture change with Libraries, multicam added, a new XML format, an interface redesign, machine learning features, in-app tracking, workflow extensions and even the dropping of the “X” from the name. Just looking over the release notes shows a long list of features, updates and bug fixes that goes back years. But the flip side of this is the argument that the Apple team working on Final Cut Pro is moving too slowly and not keeping up with competitors. It took over a decade to get the very basic feature of dupe detection. Rumor has it there is a Roles-based audio mixer somewhere in the FCP code but it hasn’t been turned on yet (who knows if that is even true or if it will be…read more

 

Podcasts

The PVC podcast network continues to expand, and our most popular series can be found on Anchor, Spotify, Apple Podcasts or in your favorite podcasting app.

Ron Dawson wrapped up his Crossing the 180 series ⬇

Crossing the 180 Season Finale: Solving Gender Disparity in Hollywood

In the season finale of “Crossing the 180,” Ron brings out of “mothballs” clips from his previous critically acclaimed podcast, “Radio Film School.” In these clips, Ron speaks with seasoned film and television producer Yolanda T. Cochran as well as “The Other 50%: Herstory” podcast host and Hollywood veteran Julie Harris Walker. You’ll also hear soundbites from other luminaries in the worlds of Hollywood and tech. Altogether, this is a funny, provocative, and inspiring look at addressing the issue of gender disparity in…read more

 

Editors on Editing w/“Glass Onion: A Knives Out Mystery” Editor Bob Ducsay

On this week’s episode of Editors on Editing, Glenn talks with Bob Ducsay about editing the highly anticipated film Glass Onion: A Knives Out Mystery. Bob has edited such spectacles as The Mummy, G.I. Joe: The Rise of Cobra, Looper, Star Wars: The Last Jedi and Knives Out, for which he was nominated for the…read more

 

Eric Koretz, DP of “Ozark” // Frame & Reference

Welcome back to the Frame & Reference podcast. This week Kenny talks with cinematographer Eric Koretz about shooting four episodes of the last season of “Ozark.” Eric is a graduate of AFI, and in this episode goes into some of the nitty gritty of lighting on “Ozark” and…read more

A brand new roundtable podcast was launched ⬇

The Alan Smithee Round Table Podcast

Welcome to episode one of our monthly round table podcast, where hosts Scott Simmons, Katie Hinsen & Michael Kammes talk about the latest news in production, post-production, entertainment tech and beyond. In our inaugural episode the gang talks about the Blue Collar Post Collective rate survey, updates to the Metaverse and how it will affect the world of production, AI technology, and a fun segment called One Cool Thing where each host talks about one cool thing they learned or found over the…read more

 

News

There was too much news in 2022 to summarize it all here, but here are a few news stories worth mentioning.

An update on Kyno and what might be in store for the future of this fantastic piece of post-production software

For years now I have professed my love for Kyno as an integral part of a post-production workflow. It doesn’t matter which NLE you use, Kyno can make the whole post-production process easier, as it’s a workflow tool that can do too many things to list here. I first used (and covered) Kyno back in a 2016 Useful Tools for Editors and liked it so much I prompted PVC to do a Q and A with the developers. An update in 2019 put Kyno into another Useful Tools for Editors article and made me type this…read more

 

Lightworks 2022: there’s a Lightworks for everyone!

Lightworks 2022 is announced as the Lightworks for all content creators, the biggest release ever of the classic NLE, with new editing tools and features. In case you missed it, check out the new version! With its first official introduction last November, Lightworks 2022 is now available in three flavors: Free, Creator and Pro. The pandemic and the end of the year may have kept many of those wanting to check out the software from doing so, but you still have time. Packed with new features, Lightworks 2022 reflects the drastic changes we’ve seen in content production in recent years and months…read more

 

LUMIX GH6: the tech behind the new Panasonic camera

Panasonic has developed a new 25.2-megapixel Live MOS Sensor without an LPF (low-pass filter) that boasts high resolution and high-speed signal readout that reduces rolling shutter problems and achieves wide dynamic range. The new Venus Engine delivers approximately twice the processing power (in comparison with the DC-S1H), enabling high-speed processing of the new sensor’s higher pixel counts, higher-resolution and higher-bit-rate video. It has evolved with three key technologies: New Intelligent Detail Processing, New 2D Noise Reduction and High Precision 3D Noise Reduction for video…read more

 

Galaxy S22 Ultra: the most powerful cameras in a smartphone

The new Samsung Galaxy S22 family includes the Galaxy S22 and S22+, but if you’re serious about photography and video, nothing beats the Galaxy S22 Ultra, a new standard for mobile imaging. Confirming the rumors from recent weeks, the new Samsung Galaxy S22 Ultra merges the best Galaxy features of the Note and S series. The device now has the unrivaled power of the Note series and the pro-grade camera and performance of the S series – to set a new standard for premium smartphones. Featuring a built-in S Pen, Advanced Nightography and video capabilities, and battery life that lasts over a day, Galaxy S22 Ultra is the most powerful Ultra device Samsung has ever…read more

 

Sony IMX989, a 1-inch type image camera sensor for smartphones

Xiaomi's new 12S Ultra smartphone, which the company will reveal on July 4, features a Sony IMX989, a 1-inch type image camera sensor that may well change the future of smartphone imaging. We've seen imaging companies, in recent months, promise that the future of photography is just round the corner. Xiaomi and Leica Camera announced, last May, that a new era of mobile imaging is coming, because of their cooperation, of which we may see some results in the upcoming Xiaomi smartphones. In June, Leica and Panasonic announced a partnership that will bring a new imaging world… probably to conventional cameras. This week Samsung confirmed its second 200MP camera sensor for smartphones, named ISOCELL HP3, stating that it offers "Epic Resolution Beyond…read more

 

Other interesting stuff worth noting

There’s a lot more content that didn’t fall under the categories above so please enjoy and share.

EditMentor and Editing Entrepreneurship

Editing Entrepreneurship may not be something we often think about but the launch of EditMentor got me thinking a lot about it. Online video editor training is a dime-a-dozen these days. Everybody and their mom seem to be offering a course and there are a thousand YouTube videos that threaten to teach you to be a professional editor in 15 minutes. So many of them are focused on the software being used because that's often the easy thing to do. And it's the thing that seems to attract potential students. I'm as guilty as the next as I've been part of a lot of these training courses as well…read more

 

Is the graphics card shortage finally over?

Until recent years, the Windows crowd have had many years of bragging rights over Apple users – namely that you had to pay 2-3 times the amount to get the equivalent performance out of a Mac compared to a PC, whether off the shelf or custom built. Two things put an end to that – the release of Apple's impressive M1 chip in November 2020 (with the M1 Ultra now coming to the desktop Mac Studio) and the GPU shortage that started in the same year, which pushed prices up for PCs and sometimes meant you couldn't get your hands on your preferred graphics card at…read more

 

Apple Motion — every editor’s secret weapon

In the Final Cut Pro ecosystem, Motion is the app that lets you create custom titles, effects, generators and transitions, and then use them live on your timeline in FCP. On the Adobe side of things, MOGRTs have only copied this feature in the last couple of years, and that's truly surprising, given how much all the NLEs borrow features from one another. While the uses of Motion are more obvious to Final Cut Pro editors, if you cut with a different NLE, there are still many reasons why you might want to keep this app in your back…read more

 

The Magic of Motion

Even if you don't need to create motion graphics like animated title sequences and transitions, and even if you have no time to learn a new application, in a matter of minutes you can use Motion to create useful tools for Final Cut Pro. Like an adjustment layer for adding color corrections or transformations to multiple clips at once. Or a trackable object remover for cloning out unwanted content in a video clip. Need to get 3D objects into Final Cut Pro? Motion is…read more

 

CommandPost adds support for the DaVinci Resolve Speed Editor

This will be welcome news to some: the venerable post-production tool CommandPost has added support for the DaVinci Resolve Speed Editor hardware console. This is a big deal as many didn't think this was possible. The Speed Editor had seemed pretty locked down, at least as far as the software interface goes, and accessible only when running the Resolve software. But leave it to smart engineers like CommandPost's creator Chris Hocking to get it…read more

 

Old lens bad

Front view of an anamorphic lens with the internal elements visibly distorted by the cylindrical front element.

Being an optical designer has been a thankless task for the last few years. You get to spend half a decade bent over a five-figure piece of optical design software, carefully compromising all the ways in which lenses fail in order to crank out a set of focal lengths that’s consistent and beautiful and works in the sort of light where it’s possible to count the photons as they wander…read more

 

Series

There are a number of series running on ProVideo Coalition. Some in 2022, others for many years.

This series from Chris Zwar is deep and ongoing. ⬇

Color Management Part 1: The Honeymoon is over

Welcome to the first part of my new series on color management. Although it's exciting to launch a new series, there's not that much to get excited by when it comes to color management. It's a topic that I feel I need to cover, rather than something I want to cover. Honestly, color management is a royal pain in the rear. But at the same time, we're at a point where color management can't be ignored completely. Hence the title of the introduction: The honeymoon is…read more

 

Color Management Part 14: Combining OCIO and After Effects

OpenColorIO is the industry standard color management tool for animation and visual FX. After Effects also has its own, different color management system based on ICC profiles. Generally, these are often thought to be mutually exclusive – you either use OpenColorIO or After Effects. We've spent the last few videos going over this – in Part 10 we looked at the native color management system that's built into After Effects. In Parts 11, 12 & 13 we looked at why OpenColorIO and ACES were developed, and how to use them in…read more

 

Rich Young’s famous After Effects Roundups land monthly. ⬇

After Effects Roundup January 2022

We're in between releases for After Effects, although NAB 2022 is only a few months away. The year begins with a small sample of what's happening: some troubleshooting, Photoshop chops, maps in After Effects, keyboard shortcuts, various plug-in demos and tips, and more. The After Effects User Guide is now our main reference and contains links to New features, Keyboard shortcuts, and various FAQs, including How to fix common After Effects…read more

 

After Effects Roundup of November 2022

Appearing this week, not November, were somewhat meager Adobe Video updates: 23.1 and Frame.io RED Camera to Cloud. [After Effects 23.1 was released a few days later]. Piling more on top of the October MAX 2022 release, below is a November Adobe Meet Up video with some of the After Effects team and others. Mentioned in that video (though the MAX 2022 store is closed): you could get the official AE and PR pillows at the main Adobe Store. They're paired, and not cheap when shipping cost is added. If inventory runs out, there are similar offerings elsewhere. Maybe a sticker is…read more

The Useful Tools for Editors, from Scott Simmons, only had one edition in 2022 but the series has run for over a decade on multiple websites.

Useful Tools for Editors: NAB Happened This Year Edition

NAB happened this year. There were many different things to see. I wrote a bit about it in addition to PVC's video coverage. Since there were many useful tools on the NAB show floor it seems appropriate for a fresh Useful Tools for Editors as I've got some things I've had bookmarked for…read more

 

]]>
https://www.provideocoalition.com/provideo-coalition-top-content-2022/feed/ 0
AI Photo Enhancement Software – A Time Machine for Documentaries? https://www.provideocoalition.com/ai-photo-enhancement-software/ https://www.provideocoalition.com/ai-photo-enhancement-software/#respond Fri, 19 Aug 2022 19:27:39 +0000 https://www.provideocoalition.com/?p=255989 Read More... from AI Photo Enhancement Software – A Time Machine for Documentaries?

]]>
I’ve been watching a lot of docs and docuseries the past couple years while working on a high-profile feature doc myself, producing VFX, roto and archival image retouching and enhancements. What I’m seeing is not only tons of scanned photos that never get cleaned-up/retouched, but also zooms into old grainy images that barely resemble the people photographed. In today’s 4K streaming world, you have to do better. MUCH BETTER! Enter the age of AI Photo-Image enhancement software. And we’re only just beginning to see the incredible tools that can help us dig into the past and bring people back to life!

File Photo of Marilyn Monroe and Jane Russell upscaled in Photoshop and AI Enhanced with Remini Web

I've been restoring and retouching photos commercially since the mid-80s; originally using an airbrush on enlarged photographic prints that were then copy-photographed to reduce and tighten the details for print. While the tools definitely changed over the years – starting with Letraset ColorStudio in 1990, with Adobe catching up in Photoshop 2.5 shortly thereafter – the practice stayed pretty much the same for decades. Now that Adobe has added significant advancements to its retouching tools, like Content Aware Fill and the indispensable Healing Brush, we can digitally restore and repair old photos in mere minutes. It truly is magical compared to how we did things in the past.

But these historic and archival photos that have been rumbling around in a shoebox or stuck to the pages of a family album aren't just damaged over time from mishandling – that all still needs to be addressed, of course – they also suffer from the limitations of film cameras, which inhibit clarity by nature. Most issues I've seen while working on thousands of images are out-of-focus or mis-focused subjects, subjects too far from the camera to capture as much detail as you'd hope, inferior consumer-grade cameras, or not enough light, which leads to motion blur. These issues hold true even if you're scanning directly from the original negatives or slide transparencies. Even most archival stock images are too low resolution and lack real facial clarity when zooming in for documentary and journalistic purposes.

Enter the brave new world of AI generated image enhancement and recovery software. But is it really amazing software coding or is it witchcraft?

Let’s take a look at just a couple of the top options available today and how they stack up… for now!


Topaz Gigapixel

Topaz Gigapixel ($100 Desktop Application) has been around for a few years and I’ve used this application, as well as their entire suite of tools, many times with great success. It does offer a lot of control and options for refining your images and lets you preview before/after scaling in the app before committing to it.

However, it often distorted faces if there wasn't a great amount of detail. Until recent updates added AI Face Recovery, you just got what you got and had to do further retouching to make the subject pleasing at all.

But now it does a pretty decent job of reconstructing facial details and retaining the lighting/shadows on your subjects without it looking like you copy/pasted a new head on them.

Here I first retouched the original image of my paternal grandparents (captured with a DSLR from a family photo on a wall) and converted it to B&W. I then upscaled it only 200% and applied Facial Recovery at about 50% to get the result on the right.

The true test of any enhancement software is that the subjects look exactly how you recall them, not like an artificially-created face slapped on top.


A real test is taking a small web archive image off a public domain stock site and making it usable.

You may ask "why would you ever do that?", but most documentary filmmakers can attest to having to source some materials from a certain time period in their subject's life that just aren't obtainable any other way, or the originals have been lost. As long as you can obtain the legal rights to your image sources, you have to make it work if you want to include the image to support your story.

This example of singer/actress Joyce Bryant is a worst-case scenario just for testing purposes. She was a gorgeous woman and I wanted to see just how well the software would hold up, starting with an image this low-res at 293×420 px.

400% upscaling of small web archive photo of singer/actress Joyce Bryant using Topaz Gigapixel

As you can see, it did a pretty amazing job overall with the Facial Recovery strength boosted up to 90%, but it makes her face too smooth, creates some strange artifacts in her hair and gives her crazy eyes. Granted, looking at the before/after, there just aren't a lot of pixels to work with from the original!

But scaling is still what Gigapixel does best, even with incredibly small and pixelated images. This image off a web-based public domain archive was originally only 332×272 px and was upscaled 400% to a more usable 1328×1088 px. While not really great for extreme zooming in on facial features, the entire image would still look a lot better in your 4K doc than the original.


Facial Recovery works quite well in most cases but still doesn't always get the details right, especially with the eyes. I'm sure this will get better as the technology and neural training improve, but for now there would still be a bit of Photoshop work to correct the images further.


___________________________________________________________________________________

Remini Web

Remini Web ($4.99 per week, Web Browser portal only) is a no-frills, no controls/options drag-and-drop facial image enhancer. But OMG it’s almost scary how well this software works!

This web-based portal lets you just drag-and-drop your JPG or PNG images into your web browser, and in a few seconds (literally less than a minute in most cases) it will produce an enhanced image for you to download.


The resulting details are so pronounced that they can actually look fake and plastic – like an overly-retouched portrait – on images that already have a lot of facial detail and structure and just need a bit of refinement.


So I typically overlay the Remini AI Enhanced image in a Photoshop layer and simply set the opacity between 50-70% until it blends in with the original in texture and grain but reveals much more detail. This provides a very pleasing and usable image to zoom into.

In this portrait image, 50% seems to work well, just providing enough clarity and detail to make the image pop.
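
If you'd rather script that blend step than click through it in Photoshop, here's a minimal sketch using the Pillow imaging library; the filenames and the 50% mix are hypothetical stand-ins for your own retouched original and the Remini Web output.

from PIL import Image

# Hypothetical filenames: your retouched scan and the AI-enhanced version of it
original = Image.open("retouched_original.png").convert("RGB")
enhanced = Image.open("remini_enhanced.png").convert("RGB")

# The two layers must match in size before blending
enhanced = enhanced.resize(original.size)

# alpha=0.5 is the 50% opacity mix; nudge it toward 0.7 to reveal more detail
blended = Image.blend(original, enhanced, alpha=0.5)
blended.save("blended_50_percent.png")

The same dial works in both directions: drop the alpha if the enhanced layer starts to look plastic.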



Here's another before/after image, from a DSLR photo of a picture of my parents' wedding hanging on the wall.

Before/After with Remini Web enhancement blended with the original image in Photoshop

While Remini Web is probably the best facial reconstruction and enhancement tool I've seen yet, it doesn't allow you much control. Based on the image you drop into it, it will produce an image that has a set maximum resolution of 2048×2048 px, and not all faces in an image may be enhanced, depending on the clarity of the subjects visible in the photo.


I've experienced very few failures so far, and even with a crowd in a concert or bar scene, I've seen Remini Web enhance faces in the audience as well as the foreground subjects. You can always crop out sections of an image and only enhance the areas you wish, or simply mask them out in Photoshop afterwards.
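
If you only want one region enhanced, the crop-and-patch step is easy to script too. Here's a rough Pillow sketch with hypothetical filenames and a made-up crop box; the middle step – running the cropped face through the enhancer – still happens in the browser.

from PIL import Image

original = Image.open("bar_scene.png")

# Hypothetical crop box (left, upper, right, lower) around the face you care about
box = (400, 120, 900, 620)
original.crop(box).save("face_only.png")  # enhance this file on its own

# Paste the enhanced crop back into a copy of the full frame
enhanced_crop = Image.open("face_only_enhanced.png")
patch_size = (box[2] - box[0], box[3] - box[1])
patched = original.copy()
patched.paste(enhanced_crop.resize(patch_size), (box[0], box[1]))
patched.save("bar_scene_patched.png")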

Let's look at a previous image test that I also ran through Photoshop and Remini Web, which gave spectacular results. I simply used Photoshop to upscale the image using Bicubic Smoother (Enlargement), ran it through Remini Web for processing, and then set the overlaid AI Enhanced layer's opacity to 70%. If I wanted to use an uncropped version of the finished image, I might actually combine the Topaz Gigapixel and the Photoshop/Remini Web images to get the most clarity across the entire scene.


Notice, though, what a spectacular job Remini Web does with facial details – especially the eyes!

But First – Retouch Your Damn Photos!

Ok people – you know who you are! I’ve seen so many dreadful scans in your documentaries that my OCD gets triggered and I can’t pay attention to the story anymore! I mean, C’mon – you MUST at least hire an intern to Photoshop out the basic things, like scratches, hairs, dirt, creases and tears with the healing brush, right? There’s literally NO EXCUSE for zooming in on a scanned photo that’s absolute crap! Ok… I guess I’ll need to write up a photo retouching tutorial and post it here soon.

But seriously, even if you've done minor retouching – or you don't think your image needs it – running it through AI enhancement software may hide some noise, but it will also accentuate any dirt, scratches and hairs in your original image. I've even found stuff that needed a second retouching pass – things I'd missed in the first pass before enhancement.

Click on this image and enlarge it to see how Remini Web actually magnified some of the dirt from the original unretouched scan. Just a minute or two of clean up can save your enhanced image and make it ready to pull into your project.


Let’s look at a much worse example, with this historic photo of Honest Abe. I only used the Spot Healing Brush Tool in Photoshop to remove all the dirt, stains and hairs from the public domain archive scan. I spent about 15 minutes doing a quick and dirty pass.


Then, when I ran it through Remini Web, the details in his skin and hair are what stand out – not the dirt.


You’ll notice that Remini Web does add a slight bit of color in most cases – even on B&W images, but you can pull out the color saturation easily in Photoshop. The final result is a natural and believable image when done correctly, and your finished project is going to look great! Click the image and enlarge to see more details.

Original image, Photoshop Retouched, Remini Web Enhanced and blended final

Stay tuned for more tips on retouching your images and updates to this amazing software technology!

]]>
https://www.provideocoalition.com/ai-photo-enhancement-software/feed/ 0
Common Mistakes in Online Video https://www.provideocoalition.com/common-mistakes-in-online-video/ https://www.provideocoalition.com/common-mistakes-in-online-video/#comments Mon, 24 Jan 2022 12:18:47 +0000 https://www.provideocoalition.com/?p=249499 Read More... from Common Mistakes in Online Video

]]>
While I'm not yet old, today I am grumpy, and YouTube's starting to annoy me: there are so many common mistakes in online video. Facebook's algorithm pushes me a stream of unwanted garbage after every video I actually choose to see; TikTok is a colossal waste of everyone's time, and even Meta knows that Instagram is bad for your mental health. While YouTube has its dark pits and poisoned rabbit holes, it's large enough to offer thousands of solid videos worth your time, and even pay creators for theirs.

With the rush of fresh creators, there’s a flow of new ideas, new techniques, and new ways to communicate, and these have already changed the way we make videos online. But not every idea is a good one, and every new video creator discovers this in their own way, by making mistakes and bad videos. But you don’t have to blunder blindly, because I made a list of things to avoid.

While these things all annoy me, they may not annoy you. But be careful: you’re making videos for an audience, and I can say pretty safely that I’m not alone here. After all, the reason you should never use Comic Sans is not (just) because it’s a bad font, but because a large chunk of the audience can’t stand it, and will think less of you for using it.

All that said, every rule has its exception, and I'm sure you can find a counterpoint to every one. Let's get into it.

Not getting to the point immediately

If you dig through YouTube’s analytics, you’ll see that the vast majority of your audience has drifted away well before the end of the video. That’s OK — many of those people simply found your video wasn’t for them. But the number of videos that open with a casual introduction, then an animated logo, then ask me to like and subscribe before giving me a reason to do so… boggles the mind.

These sad numbers are “typical”? Oh dear!

Open with a one-line summary, tell people quickly what the video covers, and then, if you have to, add some branding. Please.

For a fantastic example of how to get to the point quickly, check this out ⬇

Lazy editing

Jump cuts have become endemic on YouTube, but it's a classic case of "just because you can, doesn't mean you should". If you can avoid jump cuts, do — and you should be able to. Plan and shoot some b-roll to cover your edits. Record a second take if there's a mistake mid-sentence, or simply don't include so many talking-head shots. An edit isn't a jump cut if we can't see it.

Cover any jump cuts with b-roll

A video that doesn’t have jump cuts (or at least not many) will appeal to a wider audience because it looks smoother and more professional. Of course, if you want to make jump cuts part of your style, go ahead, but you’re jettisoning some portion of your viewers on the way there.

Only using handheld cameras

Yes, you do need a tripod or some kind of stabilizer. Handheld footage is fine in small doses or for quick action, but not as the main camera, and especially not if you’re constantly moving.

Handheld footage (especially moving footage with everything in focus) compresses poorly because every pixel changes in every frame. See the full video from that still image above.

Quality will go way down (handheld footage falls apart with YouTube compression) and you’ll even make some portion of your audience sick. Take a little longer, lock off your main camera, experiment with b-roll, and you’ll make more people happy.

Lowest common denominator thinking

You can't please everyone, but if you only aim for "looks OK on a phone", you'll miss the mark. Instead, aim higher: use a better camera than average, with a separate microphone and a tripod, and spend more time editing.

MKBHD shoots RED, maintains stellar production values, and you can see the difference, even on a phone

Note that aiming high isn’t just about image quality; it’s important to everything, including script, audio, graphics, editing; the lot. We are all fussy about our particular specialties, and — unless your content is pretty special — failing at any one area can be enough to lose part of your audience.

Using burned-in subtitles rather than closed captions

Captions are great, and very important for accessibility, but not everyone wants to see them all the time. For anyone who can’t hear or chooses not to, they’re very important; for another chunk of your viewers, they’re an annoying distraction. You can make everyone happy, though!

If you hard-code your captions, they’ll often be unreadable

If you burn the captions into the video (called Open Captions) then everyone sees them all the time, they’ll sit underneath the playback bar, and they are not searchable. Boo.

But if you provide captions in SRT format as a separate sidecar file (called Closed Captions) the user can choose to view them or not, they can be shown automatically on Facebook or Twitter while muted, and they’ll automatically move out of the way when the playback bar is shown. Talk your clients out of Open Captions and firmly into Closed Captions, and you will please more people.
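
If you've never looked inside one, an SRT sidecar is just numbered cues with start and end timecodes. Here's a minimal sketch that writes one from Python; the cue text and the my_video.srt filename are only placeholders.

# Each SRT cue is: an index, "start --> end" timecodes (HH:MM:SS,mmm),
# one or more lines of caption text, then a blank line.
cues = [
    ("00:00:00,000", "00:00:03,500", "Open with a one-line summary."),
    ("00:00:03,500", "00:00:07,000", "Then tell people what the video covers."),
]

with open("my_video.srt", "w", encoding="utf-8") as srt:
    for index, (start, end, text) in enumerate(cues, start=1):
        srt.write(f"{index}\n{start} --> {end}\n{text}\n\n")

Upload that sidecar alongside the video (or hand it to your client) and viewers can switch the captions on or off as they please.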

One exception — not every platform supports Closed Captions. If (for your sins) you’re making videos for TikTok, burn away.

Not considering your audience

Always remember that you’re making a video for them, not for you, and you should talk about what they’re interested in, from their perspective, and tell them the information they came to find.

⬆ If you want to (for some reason) compare Tesla floor mats, this is straight to the point, with all the information you need and little you don't — search around and you'll find many videos that don't nail it like this one did.

If you’re an expert, it’s easy to lose perspective because you already know a lot about your subject, so remember to try to think like one of your viewers. Find the balance between too much information and not enough; be informative but don’t go off on too many tangents. Focus.

Telling rather than showing

While it’s easier to make money from YouTube ads than ads next to a written article, a talking head simply reading a script, or speaking their mind, makes for a lousy video. We have podcasts for voice, and we have articles for the written word.

This 38-minute video was on-topic, but I’d much rather read the same content in 5 minutes

To use the medium properly, demonstrate concepts whenever you can and show the things you’re talking about. Talking heads are boring, and you can do better.

Expecting a viral hit every time

If there were a magic formula for viral success, it would cease to work, because there aren't enough eyeballs to watch all the videos out there. But that's OK; most people just need to find the right audience, not a huge one. The YouTube algorithm is capricious, and the tactics that worked a few years back for today's YouTube stars probably won't work again today. A few thousand views is actually pretty good for most people.

⬆ Veritasium did a deep dive on what makes a video viral

Even if your thumbnails get plenty of clicks, the biggest YouTubers will continue to rack up millions of views and most channels never will: make sure your clients understand that. Set expectations low, keep quality high, and be patiently consistent if you want to build an audience.

Boring titles and graphics

While you don’t need a long, flashy opening sequence, if you want to stand out and be memorable, avoid the default graphics and default fonts in whatever editing software you use. They’ve been seen many times, and you won’t have a unique voice if you don’t get creative.


To build consistent branding and an audience, put in a little effort up front. Pick a good, non-standard font and use it consistently. Consider creating a logo. Design or buy a set of titles just a little different from what everyone else is using, then change them to use your font and colors. Keep it smooth and simple.

Crazy titles and graphics

The opposite of boring, default titles and clichéd default transitions is a wild overuse of lens flares, fancy color correction and the craziest graphics. A few years ago, it was LUTs and slow-mo. Then light leaks. Then crazy in-camera transition moves.

I confess, I love a good digital glitch, but this video is about gaming, and it’s not for everyone

Some of this stuff can be great, but if you’re distracting from the message, dial it back. You don’t have to chase every trend, and they’ll all eventually play themselves out.

Poor audio

If I can’t hear you, I’m out. If I have to ride the volume levels to hear you, I’m out. If I’m deafened by your soundtrack, I’m out — and while I don’t mind continuous background music myself, a chunk of your viewers absolutely hate it.

A lapel mic like the RØDE Wireless Go II is the simplest way to get clean audio on any camera

Consistent, clean-ish audio is a baseline you have to meet. Get a lapel mic, keep it close to your mouth, re-record if a truck goes past, and you’re 95% there.

Figuring it out live on camera rather than planning

A video which distills the experience of a product is far more valuable than one which just shows first impressions. Don’t just shoot it all live and cut out the bad stuff — instead, first write a script or even a rough outline.

A script doesn’t need to be complicated, but putting shots next to words will help you focus

With a plan, you’ll be able to decouple your words from the vision, recording better b-roll (like close-ups) and then placing them at the right points of your video. I don’t need to watch your camera refocus, and I don’t need to see you make mistakes or figure anything out live on camera.

Using cheap clickbait techniques

It might work, but it doesn’t bring respect. No matter how good your information is, if you present your work as “10 secrets you MUST know” or “You won’t BELIEVE what happened next” or “Camera X gets DESTROYED” then you’re a less credible source of information. By all means, tantalize your audience, but don’t push it too far.

This stuff has been trendy for some time, but I suspect that resentment is growing

If it didn’t work then people wouldn’t do it, but bad clickbaity thumbnails mislead the audience instead of informing them. You can do this well, or you can do this cheaply, and in the long run, it’s not sustainable.

Watch another Veritasium video on clickbait for more. ⬇

Conclusion

Enough! I could go on, but I already feel ten years older, so I’ll leave it for now. (Actually, if you’re still stuck, I’ve got a book for you to read.)

If you don’t agree with some of these points, that’s fine: please tell me in the comments, like and subscribe, then make a long talking-head video about it, and — you won’t believe what happens next.

]]>
https://www.provideocoalition.com/common-mistakes-in-online-video/feed/ 3
Matching Microphones in Final Cut Pro https://www.provideocoalition.com/matching-microphones-in-final-cut-pro/ https://www.provideocoalition.com/matching-microphones-in-final-cut-pro/#comments Thu, 09 Sep 2021 15:05:36 +0000 https://www.provideocoalition.com/?p=244039 Read More... from Matching Microphones in Final Cut Pro

]]>

When I shoot MacBreak Studio, I use two microphones: one for the on-camera portion, and one for the user interface demo. The on-camera mic is a Sennheiser shotgun mic connected via an XLR cable directly into my Sony A7SIII through the K3M adapter. It’s convenient because I don’t have to sync dual-system audio in post, which I had been doing for years previously by recording on a lavalier mic into a Zoom H5. I love the fact that I don’t have to sync in post, and I don’t have to mess around with the lav mic every time I shoot. The shotgun mic is on a boom pole mounted to a C-stand and remains ready to go at any time.

However, when I’m recording the screen for a tutorial with Screenflow, I use a different mic – a Blue Designs Yeti from Logitech, mounted on a boom arm. It’s connected via USB directly to my computer. Technically I could use that same Sennheiser shotgun mic by plugging the XLR cable into my Focusrite Scarlett 2i2, but that would necessitate constantly reconfiguring or using a splitter – and besides, I actually prefer the sound of the Yeti mic. And since I haven’t found a way to connect that Yeti mic directly to my camera, I’m left using two different mics.

The problem is, these mics sound very different from each other – jarringly so. Therefore, I needed to match them to each other, ideally making the Sennheiser sound more like the Yeti mic that I prefer. Thanks to Final Cut Pro's Match Audio feature, it's not difficult to do. What you may not realize, however, is that you can use this feature in both directions. In other words, after matching the shotgun mic to the boom mic, which gets them in the neighborhood, you can then match the boom mic to the modified shotgun mic to get them even closer.

Then, by analyzing the generated waveform mapped to the frequency spectrum of each mic, you can tweak the EQ to get them even closer. Check it all out in the video above – see if you can hear a difference when it switches from the on-camera intro to the demo portion. And if you want to learn more about how to use Final Cut Pro's excellent audio tools, check out Steve's Sound Editing in Final Cut Pro.
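
If you like to see what you're hearing, you can also plot the two mics' spectra outside Final Cut Pro. This is only a rough visual aid, not the Match Audio feature itself, and the WAV filenames are hypothetical clips of the same line recorded on each mic.

import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile

def spectrum(path):
    # Load the clip, keep one channel, and return its magnitude spectrum in dB
    rate, data = wavfile.read(path)
    if data.ndim > 1:
        data = data[:, 0]
    data = data.astype(np.float64)
    freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)
    magnitude = np.abs(np.fft.rfft(data))
    return freqs, 20 * np.log10(magnitude + 1e-9)

for label, path in [("Sennheiser shotgun", "shotgun_clip.wav"), ("Yeti", "yeti_clip.wav")]:
    freqs, db = spectrum(path)
    plt.semilogx(freqs, db, label=label)

plt.xlabel("Frequency (Hz)")
plt.ylabel("Level (dB)")
plt.legend()
plt.show()

Where the two curves diverge is roughly where you'd reach for the EQ.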

]]>
https://www.provideocoalition.com/matching-microphones-in-final-cut-pro/feed/ 1
5 Things https://www.provideocoalition.com/5-things/ https://www.provideocoalition.com/5-things/#comments Tue, 10 Aug 2021 18:03:25 +0000 https://www.provideocoalition.com/?p=242548 Read More... from 5 Things

]]>
5 Things

Distractions, Lists, and Creativity

I crashed my motorcycle on this day, August 10th, 21 years ago. It was both a visceral lesson in the risks of distraction and the beginning of my video career.

It was a warm and cloudless summer day when I headed out with a group of friends for a planned five-day ride through Northern California and Nevada. It was probably my fourth time joining these guys for a multiple-day ride over the past several years.

After a glorious morning start heading north from San Francisco, we stopped in the small town of Willits for lunch before continuing on into the afternoon, taking a smooth quiet road that twisted and carved through hills and valleys for mile upon mile. The endless yellow ribbon stretched in front of us. I was enjoying the steady thrumming of my engine and the floating, flying feeling that only motorcyclists know, coming up out of one curve and throwing my bike into the next one. I had learned by now that my companions were better riders than I, so I didn’t make any attempt to keep up with them.

Besides, I was a little distracted.

Canon Elura
My first video camera

I’d strapped a Canon Elura mini-DV video camera to my leg with about a dozen rubber bands. After experimenting with mounting it to my helmet (on a tiny flexible tripod with lots of duct tape, where the wind would immediately tilt it back to shoot nothing but sky and telephone wires), and to the bike itself (on a metal pole jammed through the rear frame where it would twist to shoot nothing but the side of the road), I found this mounting method reduced vibrations the most, and had an added benefit: by turning my leg I could point the lens in different directions, catching the scenery or my riding companions as we passed each other. I could even turn it off and on while in motion by reaching down with my left hand and pressing the record button, keeping my right hand steady on the throttle. The only hitch? After doing that a few times, I couldn’t remember if the camera was on or off. 

 

Moments before the fall.

With my focus split between the road and my desire to capture our adventure, I came into the next turn – a tight left – too fast. Quickly, it tightened up more: a dreaded decreasing radius turn, nemesis of riders everywhere, made worse by the flat, unbanked surface.

You turn a motorcycle at speed by counter-steering: to turn left, for example, you push on the left handlebar. Intuitively you might think this action would make the bike steer to the right, but gyroscopic and centripetal forces make the bike lean left and therefore turn left. The harder you push, the more the bike leans and the tighter it turns. Push hard enough, and you start scraping parts of the bike on the ground – usually starting with the foot pegs, which have metal bolts attached for this purpose: a kind of early warning signal. To avoid scraping, you can shift your whole body off to the same side, which reduces the amount of lean required.

Leaning and body-shifting to turn. Racers drag plastic knee pucks (stock photo)

I had been riding for about 10 years before this moment, including 3 track school classes at Laguna Seca Raceway, so I had a grasp of what to do in this situation.

The author at Reg Pridmore’s track school at Laguna Seca, November 1992

It was too late to shift my body, so I pushed harder into the left handlebar, forcing myself to look up through the turn and not at the front wheel which was dangerously close to the edge of the pavement. I kept steady on the throttle, fighting the instinct to slow down: if I let up, the bike would pitch forward. So far so good. 

Then my left foot-peg dug into the asphalt.

Now, I’ve only leaned my bike over far enough to scrape a peg one other time: at track school, in the back half of Turn Two. That was also a tight left-hand turn, but the track was very wide and I was tight to the inside. At that time, the sudden scraping noise shocked me: I let up pressure on the left handlebar, and the bike skittered to the right a good 5 feet before I recovered and continued through the turn.

On this beautiful California afternoon in the bright sunshine, I didn't have the luxury of those 5 feet: I was already up against the white line at the edge of the pavement. Once again, the sudden loud scraping noise spooked me and I reacted by relaxing pressure on the left handlebar – just a bit and just for a fraction of a second – but that was all it took for the front wheel to slide off the edge of the road into gravel. It immediately gave way.

I loved riding motorcycles. But I also had a healthy fear of them. In the past decade of riding I had crashed once before. So my first thought as the front tire gave way was this: if I make it through, this will be my last ride.

The rear wheel was still on the road with solid traction, so the back of the bike swung forward around the front.  Like a kid flicking a spitball off of a popsicle stick, it shot me off the seat and I lost my grip.

My second thought, as I slid face-down across the gravel, arms out-stretched to break my fall, was the realization that if I had been turning right instead of left, I’d now be sailing off a cliff. 

Behind me, my bike righted itself and barreled into the side of the mountain, smashing the plastic cowling, twisting the forks, and contorting the frame. 

I came to a rest flat on my belly in a cloud of dirt. It was eerily quiet. My third thought, as I picked myself up and checked all my limbs, was: did my camera survive? And did I get the shot?

Thanks to my helmet, gloves and full leathers, I was unscathed. The rest of our group was well ahead of me, except for John, who almost crashed trying to avoid running into me. He confirmed that I was ok. And that my bike was totaled. Still dazed, I wrestled the camera out from the rubber bands around my leg. Dented, scratched and covered in dirt, I was amazed to find that it turned on. Excited, John and I both bent over the tiny flip out screen as I rewound the tape and pressed play. 

Nothing. I had turned it off just before crashing.

That accident taught me to focus. Back home, I used the insurance money to replace my camera instead of my bike, upgrading to a “professional” (at the time) Sony PD-150.  

The sturdy PD-150

I turned our downstairs storage room into an edit suite, bought Lisa Brenneis’ Final Cut Pro book, Brian Maffit’s After Effects video tapes, and Trish and Chris Meyer’s Creating Motion Graphics tome. I locked myself in that edit suite for the next few months, learning everything I could. I shot every chance I got, and joined the very first Final Cut Pro user’s group in San Francisco.

The connections I made there led to my first book, which led to my second book, to NAB and meeting my creative partner Steve, and the start of our 15 years working together at Ripple Training.

 

The New Distractions

I'm telling you this story because, in the twenty-one years since that day, I've noticed a new kind of distraction taking hold of me. Not as deadly as playing with video cameras while riding motorcycles, and not one that might lead to a new career. It's stealthy, it's pernicious, and it's slowly sapping my productivity and creativity. Maybe yours too.

Video professionals are storytellers at heart. And storytelling is a creative act. As such, it requires both exceptional receptivity and focus: receptivity to new connections and ideas that often appear during those in-between times when your mind is resting, or engaged in a completely different activity; and focus on the execution of our craft, be it writing a script, shooting a scene, or editing a complex sequence.

Now I find that focusing during a shoot is easy: I’m working with other people on a common goal under immediate pressure. But my ability to focus while alone in front of a computer screen has decreased steadily as the channels of communication and the amount of online content have exploded.

Back when I was first learning After Effects, I might have had my email application open, and I’d get perhaps a dozen messages a day. And I had a browser window open to the Creative Cow website, whose forum contributors helped me learn so much (thank you, Ralph Fairweather, RIP).

Today, as I write this, just for text communications I actively use Mail, Messages, Facebook Messenger, Slack, and Telegram. I probably get 200 messages a day. To say nothing of all the compelling and often high-quality educational content on YouTube, and Facebook, and Twitter.

Even with notifications turned off, when my current task gets the slightest bit difficult, I'm just a click away from all of those "urgent but not important" communications. Look, a review of a new camera! Oh, here's a chat with the editor of Mare of Easttown – I loved that show, maybe I'll learn something! Oh, and I'd better check the comments on our latest tutorial video, I may need to respond. Hey, I wonder what folks are saying about Final Cut Pro in all those Facebook groups I'm in. Ah, there's a new message in that Resolve Telegram group!

It might be just 10 seconds here to skim and delete emails or 2 minutes there to read a short article or watch a "how-to" video, but the cumulative effect of these micro-distractions adds up. Not only do they steal precious minutes of my day, every time I return to the task at hand I have to refocus and get myself back in the flow. In fact, according to one study, it takes over 20 minutes to get back "on task." It's a slow death by a thousand cuts – or as a friend long ago put it with a less gruesome metaphor: I'm getting bitten to death by ducks.

Are you?

The Five Things

So that brings me to lists. You know, online articles or videos titled "the 5 things you really need to know about." They are everywhere. Why? Because they work. People click on them. I love them. Hey, I'm not getting distracted, this is something I need to know! Just while writing this article, I've "learned":

  • 10 Tips for Shooting Cinematic Smartphone Videos
  • 7 Lighting Tips
  • The Top 8 Seamless Transitions
  • 5 Great Weekend Reads
  • The 7 Worst Habits for your Brain
  • The 6 Concepts You Should Know to Be Financially Literate
  • My Top 7 Features of DaVinci Resolve 17 for Craft Editors (by our very own Scott Simmons here on PVC)

(I didn’t provide links for them all since I’m distracting you enough already!)

Did you click the link for this article planning to skim through it, quickly locate the headings for each of the "5 things", and then mentally check them off with a satisfying "oh yes, I already knew that"? Or maybe you'll discover something new that you promptly forget because you didn't immediately put it into practice? I know that's what I do. So, am I really learning something – or am I just distracting myself?

For me, I think it boils down to two kinds of distraction: the kind that I use to avoid hard work, and the kind that can refresh me, really improve my skills, give me new ideas and renew my focus.

I don’t know how to help you with the first kind – as you can see, I’m struggling myself. But here are five practices that I believe fall in the second category: “distractions” that help me with my creativity. I try to do them every day. I usually fail. But I believe they can open us up to more sources of inspiration, greater creativity, and a better ability to focus on doing excellent work. 

 

Thing 1: Read a book. 

I mean read a physical, tree-killing book that sits on your nightstand and reminds you to read it. Make it something you know nothing about. Read it in a comfy chair – away from the computer. Even for 5 minutes at lunch instead of scrolling. You'll enjoy it, and you'll get inspired in a way you didn't expect. Visiting family back east a few weeks ago, I came across Stephen King's On Writing in a local bookshop, bought it, and devoured it. However, in my daily work-from-home routine, I find it much more difficult to do this. I have a stack of books by my bed, but if I'm lucky I manage a couple of pages before turning in.

 

Thing 2: Write. Or meditate. Or both.

Some folks don’t like to write; some can’t sit still. I’m betting you can do one or the other. A daily writing routine – even simply journaling 10 minutes of stream-of-consciousness – can clear out mental debris and help clarify your thinking. Check out The Artist’s Way by Julia Cameron. I try to write for a least a few minutes every morning. I don’t always succeed.

A regular meditation practice can have the same results. I've been practicing for over 30 years. These days it's just a few minutes on the cushion right after I first wake up, but it's enough to make me feel more present, less lost in thought – and I often get some really good ideas in those few minutes. Check out Sam Harris' Waking Up app.

 

Thing 3: Exercise.

Some people love it; some hate it. It works for me. Get outside every day, at least for a short walk. I've been doing yoga for almost 20 years now and it has changed my life. But some days my morning dog walk is all I get in. The rest of the day flies by and suddenly it's getting dark and I didn't leave my office – but I'm glad I saw the sun and the hills for a few moments in the morning.

 

Thing 4: Play an instrument. Or draw. Or paint.

Develop a creative outlet apart from your work. I started playing guitar a little over 10 years ago. I practice almost every day, even if it’s just a few minutes before going to bed. I’ll never be great. But I love it, and it’s an excellent mental workout.

 

Thing 5: Practice your craft.

If you shoot for a living, do a personal project. If you are an editor, cut something new – a comedy, or a fight scene. Do something small every day.

And here’s the thing: don’t just read or watch tutorials to improve your craft. Put what you learn into practice as soon as possible. I know that’s easier with a quick editing tip than learning to shoot green screen, but you get the idea. I watch plenty of YouTube videos to improve my skills, but then I promptly forget what I learned because I don’t practice it right away.

If you aren’t going to practice it soon after watching, just skip the tutorial and do something creative instead. Note this advice is coming from someone who produces tutorials for a living! At Ripple Training, we provide projects and media for almost all our tutorials, but I’ll bet dollars to donuts that most people sit back and watch them without practicing as they go. I get it.

I realize that reading, writing, meditating, exercising, jamming and practicing a work skill every day may seem absurd in the face of a mountain of work and looming deadlines. I sure don’t do them all every day, but I like having the goal – and I don’t kick myself when I fall short. Often I’ll slip them in at the very beginning and very end of the day, because once I’m working, forget about it.

You'll notice I didn't include advice about "limiting social media" or turning off distracting apps. Those are probably good ideas but I just don't do them myself. It's a problem. I get a little dopamine hit with every new email, Facebook notification, or YouTube comment. Oh wait, here's a video about Wes Anderson's style – I need to go watch that (it's great, by the way).

Are you doing any better than I am with all these distractions? Leave a comment. Oh, and if you have any favorite lists, link please!

]]>
https://www.provideocoalition.com/5-things/feed/ 5
Poor NAS AFP Performance? https://www.provideocoalition.com/poor-nas-afp-performance/ https://www.provideocoalition.com/poor-nas-afp-performance/#respond Tue, 08 Jun 2021 23:09:39 +0000 https://www.provideocoalition.com/?p=239669 Read More... from Poor NAS AFP Performance?

]]>
If you’re using a Mac to access files on a NAS using AFP, and it’s taking a long time to list folder contents and open files — or you’re seeing timeout errors — your CNID DB may be unhappy.

Say what?

Macs talk to networked storage using any of four different protocols: Apple Filing Protocol (AFP), Server Message Block (SMB, a.k.a. CIFS), Network File System (NFS), or the snappily-named Web Distributed Authoring and Versioning (WebDAV). Indeed, many NAS boxes offer three or more of these simultaneously.

Macs historically used AFP, and while Apple has deprecated AFP in favor of SMB/CIFS, AFP still has some advantages. On my NetGear ReadyNAS boxes, for example, AFP properly preserves file-creation times, while SMB does not. AFP connections have historically been faster than SMB connections, too, at least before SMB v3 became widespread. Even using MacOS 11, I’m finding that AFP connections behave more reliably and consistently than SMB connections, especially on NAS volumes with restricted access credentials. 

AFP originated with the classic Mac OS, and it handles a rich selection of Mac-specific metadata: resource forks, type and creator codes, and the like. These metadata don’t map neatly into the native filesystems of NASes — or, really, anything other than Macs. 

The critical difference at the heart of this story is that AFP, like the Mac’s file systems dating back to HFS, refers to files not by name, but by CNID: a Content Node ID. The open-source Netatalk fileserver that provides AFP services in most common NAS boxes stores the mapping between CNIDs and filenames in a database.

CNID Conniptions

As long as the CNID database is in sync with reality, AFP file services run smoothly. It’s when the database no longer matches the files on disk that slowdowns occur, as Netatalk has to resolve discrepancies on the fly. The database can fall out of sync when you access the same volume with multiple services: files created or deleted via SMB or NFS aren’t tracked in the CNID database. Drastic database de-optimization may also happen during software updates, though that is rare.

As differences accumulate, AFP services bog down, sometimes to the point where the Mac Finder times out trying to fetch directory listings. It takes longer to open files; Time Machine, SuperDuper!, and Carbon Copy Cloner (CCC) backups take more time to execute; you might not even be able to browse a network volume.

The Fix for AFP

Netatalk provides the dbd tool to update and resync the CNID database. 

dbd [-cfFstuvV] volumepath

As far as I know, no NAS vendor makes dbd available in their management GUI, so you’ll have to open a terminal session to invoke it (how to do so is left as an exercise for the reader, as it varies by vendor and by firmware version). dbd can be run in an incremental-update, non-destructive mode, and I recommend trying that first. In my experience, you can still perform normal file ops while dbd is running, with little impact. Here’s a run with the -t “show statistics” and -v “verbose” flags on a test volume holding exactly two files:

dbd run in update mode to improve AFP performance

(Note the “file without meta EA” warning; the “white shading.jpg” file was copied to the NAS using SMB, so Mac-specific metadata, a.k.a. Extended Attributes, did not transfer.)

If the database is so far gone that you see timeout and disconnect messages on the console as dbd itself runs, try re-running with the -f “nuke it from orbit and start fresh” flag. 

dbd run in nuke-from-orbit mode

In this case, on a tiny test volume, the -f option took longer to run; but against a badly out-of-sync database it runs quite a bit faster than dbd without -f.

With my Ultra 6 boxes and their weedy little Atom CPUs, AFP access is at a standstill while nuke-it-from-orbit is running, but at least Netatalk won’t be tripping over its own damaged database and dbd will have the chance to create a new, clean version.

In my case, I had one data volume with 10+ TB of data that had been accessed from Macs via AFP, SMB, and NFS (the latter two for edit-in-place sessions with FCPX); I also had Windows 7 and Windows 10 machines writing to it using SMB. After several months of this abuse AFP access became noticeably slower. Despite running the server’s defrag and balance processes, AFP remained poky, and finally it started throwing timeouts both in Finder and during nightly CCC backups. dbd run incrementally ran very slowly, and spewed multiple timeout errors; after several hours I killed it and ran dbd -f, which ran much faster… though it still took about a day to make everything neat and clean again. But once it was done, AFP worked perfectly again. And when I run dbd in update mode on this rebuilt database, it processes 1.15 million files in a mere 638 seconds.

dbd run in update mode on 1.15 million files

I now run dbd every three months as part of regular maintenance, just like defrag and balance.
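
If typing the command each quarter feels like a chore, a small script on the NAS can walk every shared volume for you. This is only a sketch: the volume paths are hypothetical, it assumes Python and dbd are both available on the box, and it runs dbd in the same -t/-v update mode described above – schedule it with cron or your NAS's own task scheduler.

import subprocess

# Hypothetical shared-volume paths; adjust to match your NAS layout
VOLUMES = ["/data/media", "/data/projects"]

for volume in VOLUMES:
    # -t prints statistics, -v is verbose; add -f only when the CNID DB is badly damaged
    subprocess.run(["dbd", "-t", "-v", volume], check=True)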

Why not avoid the problem in the first place?

A practical rule is, “if it hurts, don’t do it”, and that certainly holds in this case: pick one file-sharing protocol, and stick to it. Indeed, most NAS vendors warn you away from using multiple protocols on the same volume, though sometimes that warning is buried in a knowledge-base article instead of being presented in the user manual or management UI.*

But, warnings or not, you may still have a need or a desire to use AFP with some Macs on a filesystem that is (or was) also accessed using other protocols. Running dbd can clear up any lingering problems; adding it to your maintenance schedule can keep those problems from recurring in the future.


* “It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying ‘Beware of the Leopard’.” —Douglas Adams, The Hitchhiker’s Guide to the Galaxy

 

]]>
https://www.provideocoalition.com/poor-nas-afp-performance/feed/ 0