David Hover – ProVideo Coalition

Solutions to Resolve Part 6: Clipped Exports – the YUV enigma (12 Oct 2020)

Are you inadvertently clipping your exports from Resolve? I wrote an article about this a while ago, Solutions to Resolve 4: Full data and video levels in DaVinci Resolve – don’t clip your proxies and transcodes. It turns out that this is not the end of the story.

Jean-Pierre Demuynck, an engineer who helped develop the ITU-R BT specifications, drew my attention to a further problem I hadn’t been aware of. That’s what I’ll be sharing with you here. (If you’re not familiar with using full and video levels in Resolve, please first read the Solutions to Resolve 4 article mentioned above.)

Here’s the problem:

The flower on the left is the original image, imported into Resolve. On the right, the image has been exported from Resolve, then re-imported to see what it looks like on the scopes.

Left: Original image (H264). Right: Exported image (DNxHD HQ).

Identical? Well, not quite.

Here they are again with the gain reduced and the blacks lifted. The same correction has been applied to both images.

Left: Original image – graded. Right: Exported image – graded.

In the export, the red channel has been severely clipped.

You might think at first (if you’ve read Solutions to Resolve 4…) that the solution would be to activate the option “Retain sub-black and super-white data” in the advanced settings of the Delivery page. But no, this was activated.

You can see it was activated by looking at the black and white patches on each side of the flower. The sub-blacks (black patches with values from 0 to 15) and super-whites (white patches with values 236 to 255) weren’t clipped on export and are clearly present in the graded image.

It turns out that the “Retain sub-black and super-white data” option works for luminance, but doesn’t stop the colour channels from clipping.

Is there a solution for this? Unfortunately no, not really. You could choose one of the float codecs for delivery (DPX or EXR). They won’t clip, but they aren’t practical for standard exports. All the usual post-production codecs like ProRes and DNx will be clipped.

Before I get into the reasons for why this is happening, it’s worth taking a minute to look at the implications of this. How serious a problem is it for editing and post-production?

If you’re color grading, it’s not a problem at all. It’s the colorist’s job to manage clipping, so anything clipped on delivery is the colorist’s fault, not Resolve’s. It’s not a problem either if your rushes are RGB. This only affects YUV media.

However, since almost all the media we work with from day to day is YUV, and we don’t always grade it before exporting, the clipping remains a serious problem. If you use Resolve to transcode your rushes, it may not retain all the original signal. The same is true if you export an ungraded Master of your edit – something that is required practice for many broadcasters and production houses.

You won’t see the problem when you export; the copy will appear identical, as demonstrated at the beginning of this article. It’s when you decide to use the export later for grading that the clipping will show up. Here’s a close-up of the graded images shown above.

Left: Original image – graded. Right: Exported image – graded.

You don’t have to be an expert colorist to see that the export has lost the original clip’s deep saturation and fine color details. Grading won’t get them back.

So what exactly is happening here? When I first discovered the problem, it was a real enigma. How could the red channel possibly be so much higher than maximum luminance in the first place? There are no values above 255 in an 8-bit file…

It turns out that this is true and not true, at least where YUV is concerned. YUV can, in fact, pack a higher range of colors into an 8-bit file than RGB. It can do this because of the way the conversions between RGB and YUV are calculated.

Here are the equations: (Note: these equations convert from RGB to Full YUV levels, not to Video YUV levels which would require another step, and which I’ve left out because it makes no difference to the issue we’re discussing.)

To convert from RGB to YUV:

Y = 0.2126R + 0.7152G + 0.0722B

U = 0.551213(B-Y) + 128

V = 0.649499(R-Y) + 128

To convert from YUV to RGB:

R = Y + 1.539651(V-128)

G = Y – 0.4576759(V-128) – 0.18314607(U-128)

B = Y + 1.8142115(U-128)

You don’t need to understand these equations, but you can see they’re pretty simple, no more than basic school maths. The important point here is that they have very surprising consequences.

Take, for example, a YUV signal where Y=234, U=148 and V=195. That’s a perfectly legitimate 8-bit signal because the values are between 0 and 255. Convert that to RGB and you get R=337, G=200 and B=269 which you can’t encode in 8 bits because some of the values go beyond 255.
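The two conversions are easy to sketch in a few lines of Python, using the coefficients quoted above (a toy illustration with full-range 8-bit levels, nothing more):

```python
# Toy sketch of the conversion equations above (full-range 8-bit levels).

def rgb_to_yuv(r, g, b):
    """RGB to full-range YUV, per the coefficients quoted above."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    u = 0.551213 * (b - y) + 128
    v = 0.649499 * (r - y) + 128
    return y, u, v

def yuv_to_rgb(y, u, v):
    """Full-range YUV back to RGB."""
    r = y + 1.539651 * (v - 128)
    g = y - 0.4576759 * (v - 128) - 0.18314607 * (u - 128)
    b = y + 1.8142115 * (u - 128)
    return r, g, b

# A perfectly legal 8-bit YUV triplet...
r, g, b = yuv_to_rgb(234, 148, 195)
print(round(r), round(g), round(b))  # R and B land well above 255
```

Running this gives R at about 337 and B at about 270 – values an 8-bit RGB file simply can’t hold.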

The equations allow YUV to record a higher dynamic color range than RGB in an 8-bit file. (The same is true for 10 bits, 12 bits, etc. – I refer to 8 bits in this article just for simplicity, the principles are the same for the others.)

Camera manufacturers take advantage of this to squeeze the most they can out of their sensors[1]. They oversample the RGB signals coming from the sensor to cover the extra range above 255 before this is converted to YUV. For example, if you use 9 bits instead of 8 bits to sample the RGB, the extra bit would cover anything from 256 to 511.

So in the example above, the signals from the sensor would be R=337 G=200 B=269, subsequently converted to a standard 8-bit YUV file with the values Y=234, U=148 and V=195.

Moving on down the production workflow, what happens when this file is imported into DaVinci Resolve? Resolve works in RGB (YRGB color science) so you’d expect the signal to clip when it’s converted from YUV.

It doesn’t because Resolve uses 32-bit float for its working space. Float systems differ from the usual 8, 10, 16 or 32-bit sampling in that they don’t have a ceiling or a floor. Once you get to the top or bottom of the standard signal range, you can keep going, there’s no limit. That’s why Resolve doesn’t clip the signal when you’re grading, no matter how far you push it. That’s also why EXR and DPX float codecs don’t clip on export.
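The difference is easy to illustrate with a trivial sketch (conceptual only – it just contrasts integer clipping with float headroom):

```python
# Integer storage has a hard ceiling at 255; a float working space doesn't.
over_range = [337.0, 200.0, 269.0]  # the out-of-range RGB triplet from above

as_float = list(over_range)                                 # nothing lost
as_8bit = [min(255, max(0, round(c))) for c in over_range]  # R and B clipped

print(as_float)  # [337.0, 200.0, 269.0]
print(as_8bit)   # [255, 200, 255]
```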

So your YUV rushes are fine while you’re working in Resolve, you always have the full, original signal. You can actually see this with the color picker.

Color picker – Resolve 15

(This doesn’t always work, by the way. In some versions of Resolve 16, the picker is clamped at a maximum of 255, whatever the real values are. If you have the Studio version and an external monitor, you should be able to get this working correctly in Project Settings > Master Settings > Video Monitoring by activating the “Retain sub-black and super-white data” option. In the Color Management settings, Broadcast Safe should be at its default, i.e. IRE -20 -120, with “Make Broadcast safe” turned off.)

The problem with clipping in Resolve only happens on export. It looks very much as if the software converts its 32-bit float YRGB working space to standard RGB (which would clip the color channels) before it converts to YUV for codecs like ProRes and DNx. Instead of going straight from 32-bit float YRGB to 8, 10 or 12-bit YUV, there’s a standard-RGB bottleneck in the middle.
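The suspected export path can be mimicked in a few lines – a hypothesis sketched in code, not Blackmagic’s actual pipeline: convert YUV to RGB, clamp the RGB intermediate to 0–255, then convert back to YUV.

```python
# Hypothetical model of the export bottleneck - NOT Resolve's actual code.
# Coefficients are the conversion equations quoted earlier in the article.

def yuv_to_rgb(y, u, v):
    return (y + 1.539651 * (v - 128),
            y - 0.4576759 * (v - 128) - 0.18314607 * (u - 128),
            y + 1.8142115 * (u - 128))

def rgb_to_yuv(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return (y, 0.551213 * (b - y) + 128, 0.649499 * (r - y) + 128)

def export_with_rgb_bottleneck(y, u, v):
    r, g, b = yuv_to_rgb(y, u, v)
    # the suspected clamp: a standard 8-bit RGB intermediate can't hold > 255
    r, g, b = (max(0.0, min(255.0, c)) for c in (r, g, b))
    return rgb_to_yuv(r, g, b)

y2, u2, v2 = export_with_rgb_bottleneck(234, 148, 195)
print(round(y2), round(u2), round(v2))  # V collapses from 195 to about 154
```

The chroma that depended on R=337 and B=269 is gone for good – which matches the clipped red channel seen in the graded export.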

Why this should be so, I have no idea. There are other NLEs that don’t clip the color channels, so this is not inevitable and there should be ways to solve it. Now that Resolve has become a major player in editing as well as color grading, it’s an important issue to fix. Maybe in version 17?

One final note for colorists: This problem also affects the video feed to an external monitor. The colors will clip in the same way as a YUV export, whatever the range you’re working with (16-235 or 16-255). You won’t be able to see the bright, saturated colors you would if you connected the camera directly to the monitor, and there will be a hue shift.

Note: Most of the facts and figures mentioned here were provided by Jean-Pierre Demuynck, including the sample file used for the illustrations. This is as much his article as it is mine!

If you’d like to experiment yourself, you can download the video file used in this article here. The image was assembled with no grading.

[1] Sony and Panasonic cameras use an extended xvYCC color space where the channels can go over 350.

 

Other articles in the Solutions to Resolve series:

-Solutions to Resolve Part 1: Precise sharpening and noise reduction

-Solutions to Resolve Part 2: Accurate shot match in ColorTrace

-Solutions to Resolve Part 3: Unexpected clipping when grading

-Solutions to Resolve Part 4: Don’t clip your proxies and transcodes!

-Solutions to Resolve 5: Taming Color Management – Part 1

-Solutions to Resolve 5: Taming Color Management – Part 2

 

 

 

Solutions to Resolve 5: Taming Color Management – Part 2 (4 Feb 2019)

To pick things up from where we left them at the end of the previous article (Taming Color Management – Part 1) here’s the problem we need to resolve: even with Color Management correctly set up, the results “out of the box” rarely look as good as they do with the equivalent camera LUT. Here’s an example of Log footage from an Arri Alexa camera.

Log footage / Arri LUT applied / Color Managed (basic)

The blown out whites and lifted blacks in the color-managed image are not in fact a mistake. Color Management is doing its job. It’s designed to do what’s called a Color Space Transform. “Transforming” doesn’t mean reducing or adapting a large color space to fit inside a small one, it means matching up whatever’s equivalent. Anything that’s not equivalent, anything that falls outside the limits of the destination color space, is out of range.

In the example above, the highlights captured by Arri Alexa camera go well beyond the limits of the display color space, Rec709 2.4, so they get blown out. They’re not clipped – color management doesn’t clip anything – but you have to adjust them yourself.

Color Space Transform: The image filmed in camera is restored from the Log image. The highlights are out of range of the display color space.

Camera LUTs like the one supplied by Arri don’t just transform the color space, they also reduce and adapt it. It’s called mapping. The camera color space is remapped to fit inside the display color space. Anything that’s out of range is compressed down and some of the in-range signal is compressed as well to make space for it. (Not all camera LUTs do this, by the way. The standard Blackmagic camera LUTs transform but they don’t map.)

Mapping after the Color Space Transform compresses the out-of-range signal into the display color space.

For Color Management to produce similar results, it needs to be able to map as well. Starting with version 14, DaVinci Resolve has provided several tools to do exactly that.

Method 1 – Gamut Mapping OFX

The first method uses an OFX called Gamut Mapping. Despite its name, the OFX provides controls not only for out-of-range saturation (“Gamut Mapping”) but also for luminance (“Tone Mapping”). The “Gamma” option lets you choose the color space you want to map the luminance to.

Once you’ve applied the OFX to a node, under Tone Mapping Method choose Simple, and… that’s it! No more blown-out highlights. That’s all you have to do. Well, almost. The blacks are still too high, but that can be cured by changing the Gamma option to Use Timeline. And while you’re here, it’s a good idea to activate Saturation Compression in Gamut Mapping Method to control any out-of-range saturation as well.

Tone mapped with “Simple” only / Gamma set to “Use Timeline” / Final settings

So what’s happening here? The Simple option in Tone Mapping is essentially a contrast curve with an S shape, so it compresses both highlights and shadows. It’s designed, like the Arri LUT, to remap a large color space into a small one.
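As a rough illustration – a generic smoothstep curve, not Resolve’s actual math – an S-shaped remap that squeezes an over-range signal into 0–1 looks something like this:

```python
# Generic S-curve sketch - illustrative only, not Resolve's "Simple" curve.

def tone_map(x, max_input=1.3):
    """Remap a signal with over-range highlights (up to max_input) into 0..1."""
    t = max(0.0, min(1.0, x / max_input))  # normalize; past max_input clips
    return t * t * (3.0 - 2.0 * t)         # smoothstep: flat at both ends,
                                           # steepest through the mid-tones

for x in (0.0, 0.5, 1.0, 1.3):
    print(x, "->", round(tone_map(x), 3))
```

Notice that mid-grey (0.5) comes out darker than it went in – which is why mapped footage that doesn’t use the whole tonal range can look a bit dark.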

This contrast curve means that working with the OFX is similar to working with a LUT. Changes to the way the highlights and shadows are compressed should happen before the OFX, normal grading can proceed after it. But unlike LUTs, which are applied as the last operation inside a node, OFX are applied as the first operation, so you have to add a node upstream of the OFX node if you want to tweak the signal feeding into it.

Once you’ve set up the OFX, you can save the grade and copy it to all Log clips in your sequence, whatever the camera (Sony, Blackmagic, Arri, Canon, RED, Panasonic, etc.). You don’t need to readjust the settings each time: one size fits all. That’s because the Color Space Transform for each type of camera is managed by the Input color space you’ve assigned to the footage, as explained in the previous article (Taming Color Management – Part 1). All you’re doing with the Gamut Mapping OFX is to add mapping for anything out of range. You may find footage that doesn’t use the whole tonal range a bit dark, but in that case, just correct the Gain.

If you want more customized control of mapping for certain clips, you can switch the Tone Mapping Method to Luminance Mapping and adjust the Max Input Luminance level. Be aware that you’re setting a maximum input level here, so it will clip anything you leave out of range.

Method 2 – Color Space Transform OFX

This method uses another OFX that also contains mapping controls: the Color Space Transform OFX. You can use it just to map the signal, but in that case the results are not quite the same or as efficient as the Gamut Mapping OFX. Where the Color Space Transform OFX shines is when you use it as a stand-alone to do everything, not just mapping.

Using it as a stand-alone means that you don’t assign an Input Color Space to footage with the right-click contextual menu for clips in the Media Pool or in the Color page. Leave all your footage at the default Rec709 2.4.

The Input Color Space is assigned inside the Color Space Transform OFX itself. You have to assign both “Color Space” (this is in fact gamut – see the footnote in Part 1 for an explanation) and Gamma. Leave Output Color Space and Gamma on Use Timeline, the default setting.

Change the Tone Mapping Method to Simple and activate Saturation Mapping, and you’re good to go. In terms of workflow, everything is exactly the same as for Method 1, just remember to change the Input Color Space and Input Gamma when you copy the node to clips from a different camera.

Final settings / Footage unassigned in the Media Pool / Color Space Transform OFX applied

With this stand-alone approach, you don’t even have to activate Color Management in the Project Settings. The OFX will work fine by itself. You only need to activate Color Management if you’re not working in Resolve’s default Rec709 2.4 color space or want to deliver to a different color space. If, for whatever reason, you decide to activate Color Management after you’ve started grading, just remember not to change the Timeline Color Space from its default Rec709 2.4 (See Taming Color Management – Part 1 for an explanation of all this).

Tone and Gamut mapping in the Project Settings

The Project Settings for Color Management also offer mapping tools. They don’t produce exactly the same results as the two methods above, but they’re interesting for an editor because they affect all footage in the project. Clips in the Media Pool will be acceptably mapped, even if they’re not perfectly mapped, and you won’t have to add an OFX every time you add a clip to the timeline.


If you want to use Tone Mapping in the Project Settings, you’ll need to experiment with the various options. They won’t work in quite the same way as they do with the OFX. For example, the Simple option depends on the Max Timeline Luminance level. A good starting point for Log footage would be to set this somewhere between 300 and 500 nits and then switch back to Simple. You should also be aware that Tone Mapping in the Project Settings will affect all your footage, including clips that are natively Rec709 2.4.

Avery Peck has made a good video tutorial on YouTube about a Project-based approach to Color Management.

Legalizing saturation

Here’s a final tip given to me by colorist Donovan Bush.

The Saturation Mapping function in the Gamut Mapping OFX is invaluable for any project, not just ones that are color managed. It takes saturation that exceeds the range of your color space and compresses it down until it’s in range. In other words, it “legalizes” the saturation, compressing the peaks so that they fit into broadcast Rec709 2.4’s limits (or whatever your Output color space is).

This is essential for any grading, not just for Log footage, and I systematically add it as a Timeline node so that it acts on all the clips in the sequence I’m grading. Just add and activate it, that’s all you need to do. The OFX has controls for a maximum level as well as a knee (the point at which the compression begins), but in my experience the default levels don’t need to be touched.

This doesn’t mean you no longer have to watch out for illegal saturation – you can still oversaturate, it’s a compressor not a limiter – but the OFX does 95% of the work for you, with smoother and cleaner compression than you get with, say, the sat-sat curves.

And that wraps it up for these two articles on Color Management! It’s a powerful tool and I hope all this will help to make it efficient and easy to use in your grading.

_______________________________

A footnote to clarify Rec709 gamma

Rec709 has a confusing number of different gammas. The gamma for camera sensors is around 1.9, a fairly light gamma. This is referred to in Resolve’s Color Management as Rec709 (Scene). For displays there are two possible gammas: 2.2, intended for viewing in a well-lit environment like an office (the same gamma as sRGB); and 2.4, intended for viewing in a darker environment like a living-room in the evening (but not in the dark, like a cinema). The standard for grading broadcast TV is Rec709 2.4, i.e. Rec709 with a 2.4 gamma.

When Resolve’s interface mentions just Rec709 in relation to gamma, it means Rec709 (Scene). If you want Rec709 2.4, make sure you choose the option that says 2.4.

That’s why the Gamma setting in Method 1 had to be changed to Use Timeline (or to 2.4): With the default Rec709 the OFX assumes you want to map the luminance to Rec709 (Scene), so the blacks are too light.

_______________________________

A French version of this article can be found on the author’s website, www.pixelupdate.com at the following link: http://www.pixelupdate.com/color_management_2

Solutions to Resolve 5: Taming Color Management – Part 1 (28 Jan 2019)

Color management in Resolve can be frustrating. In theory it should be a great alternative to using LUTs, but in practice it can produce surprising results with the blacks lifted and the whites going through the roof. Here’s an example of a Log encoded Arri Alexa clip:

Log footage / Arri LUT applied / Color Managed

You can manually correct this, of course, but it’s complicated and takes time, and time is a luxury for a colorist. When you’re grading several hundred clips a day, you want a decent image “out of the box” so you can get on with the creative work.

These two articles are about getting Resolve’s Color Management to work as efficiently as LUTs. In the second article, I’ll be proposing two methods to do this, each with slightly different results. But for either method to work you need to get Color Management set up correctly, so that’s what this first article is about.

Don’t be put off by the length of the article, by the way. There’s nothing complicated about using Color Management. On the contrary, it’s dead simple, as you’ll see if you skip to the “To sum up” section at the end. But to get the most out of it, there’s a lot to explain!

The basics of color space, LUTs and color management

There’s neither the time nor the space in this article to explain color spaces in any depth, but here’s a very brief summary of what we’re trying to deal with.

Color Management is all about converting from one color space to another. A camera’s color space is the range of colors and contrasts its sensor can capture (see the footnote at the end of this article for a fuller explanation). Consumer cameras have a small color space, high-end cameras have a much larger one. Camera sensors differ from one manufacturer to another and the technology improves all the time, so there are a bewildering number of camera color spaces around.

The color space of a display (TV screen, monitor, cinema projector, etc.) is the range of colors and contrasts the display can reproduce. Unlike cameras, where the unique capabilities of a sensor are a selling point, display color spaces are defined by international standards. The standard color space for broadcast television is known as Rec709 2.4, and for cinema projection it’s P3-DCI.

When the camera color space and the display color space are the same, nothing needs to be converted. That’s the case for most cameras, including smartphones, DSLRs and even some professional models. They all record in Rec709 2.4 color space, so they don’t need to be color managed.

The problem is that the color spaces of high-end cameras are larger than both Rec709 2.4 and P3-DCI. Recording in a small color space would limit their performance. The solution to that is Log encoding, essentially a method of conserving the full range of a camera’s color space for both contrasts and color. The Log footage then needs to be converted in one way or another to the display color space, either by Look Up Tables (LUTs) or by Color Management.

LUTs are versatile and easy to use, but using a table with a limited number of input and output values to convert a large color space to a small one is like pushing the signal through a grid. A lot of information in the original signal gets rounded off or clipped.

Color Management uses math equations rather than tables to do the conversion, so the full, original signal remains available throughout the grade.
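The rounding problem can be sketched with a deliberately coarse toy LUT – real LUTs have far more entries and interpolate between them, but the principle is the same:

```python
# Toy 1D LUT for a gamma curve, sampled at only 17 points,
# versus the exact formula it approximates.

GAMMA = 1.0 / 2.4
lut = [(i / 16) ** GAMMA for i in range(17)]  # 17-entry 1D LUT

def apply_lut(x):
    """Nearest-entry lookup - a crude stand-in for real LUT interpolation."""
    i = round(x * 16)
    return lut[max(0, min(16, i))]  # anything past 1.0 clips to the last entry

def exact(x):
    """The math the LUT approximates - defined for any input, no ceiling."""
    return x ** GAMMA

print(apply_lut(0.40), exact(0.40))  # the LUT snaps 0.40 to a nearby sample
print(apply_lut(1.50), exact(1.50))  # the LUT clips over-range; the math doesn't
```

The in-range value gets rounded to the nearest table entry, and the over-range value is pinned to the table’s last entry – exactly the “grid” effect described above.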

Color Management in Resolve

Color Management is applied to the entire project, so it needs to be activated in the Project Settings.

To do this, go to Project Settings > Color Management, and change the Color Science drop-down menu to DaVinci YRGB Color Managed. The Input, Timeline and Output settings all then become active. Each one has a drop-down menu with a list of color spaces to choose from.


Setting the Input color space

The Input color space is the camera color space, the color space your footage was filmed in. In previous versions of Resolve, you would usually change the setting here to the color space of your source clips. All clips in the Media Pool were assigned by default to the Project input color space, so whatever you chose here would automatically be applied to them.


In Resolve 15, there’s been a major change: clips imported to the Media Pool are assigned by default to Rec709 2.4, so the Project color space doesn’t automatically apply to them. You have to manually assign the color space for each clip. That means there’s not much point in changing the Input color space in Project Settings, you can leave it at the default Rec709 2.4.

To manually assign a color space to a clip, right-click on the clip either in the Media Pool or in the Color page timeline and choose the appropriate setting in Input Color Space (this option only appears in the contextual menu when Color Management has been activated in the Project Settings).

Assigning Input Color Space to footage in the Media Pool

The process may sound laborious, but in practice it’s not. You can select any number of clips and change the Input Color Space for them all at the same time. If you have sources from different cameras, you can probably separate them out using Smart Bins based on the codecs. For any clips that are natively Rec709 2.4 (usually any clips that aren’t Log) you don’t need to do anything, they’re already assigned to the correct color space.

Setting the Output color space

While you’re grading, this should correspond to the color space of your display. That’s true whatever kind of project you’re working on and whatever display you’re using, be it a monitor calibrated for Rec709 2.4, a cinema projector working in P3-DCI, or an iMac (usually sRGB color space). That way, you can actually see what you’re doing.


If you decide to switch to a different display in the middle of your grade – you may have done some initial work on an iMac, moved to a professional Rec709 2.4 suite, then want to see what the grade looks like on a P3-DCI projector – change the Output color space each time to match the new display. Resolve’s Color Management will take care of the necessary conversions, and the grade should look the same on each of the displays. That’s the beauty of Color Management, and one of the main advantages of using it.

Similarly, once you’ve finished your grade you can change the Output to deliver to different standards. For example, if you’ve graded in Rec709 2.4 for broadcast TV and want to export a version for the web you can switch the Output color space to Rec709 2.2 or to sRGB.

The Output color space is also the color space of your scopes, so the signal on the scopes will change when you select a different output. This is normal behavior and nothing to worry about!

Setting the Timeline color space

This is Resolve’s working color space, and I’ve left it till last because it’s a strange beast, very different from the other two.


If you haven’t done any grading, it’s transparent. You can change the Timeline color space as much as you like, nothing will change in the picture. It only kicks in when you start grading.

While you’re grading, the Timeline color space affects everything you do. It defines the range of colors and contrasts you work with, and it’s the reference color space for all your controls. Any changes you make to gain, contrast, saturation, etc. are made in relation to that color space.

If the idea of a reference color space sounds obscure, think of it this way: When you adjust a control like Lift from 0.00 to 0.05, those numbers need to refer to something if they’re to make any sense. To draw an analogy, imagine opening a spreadsheet with numbers on it but no mention of whether these numbers are yen, dollars or euros. You need to know which currency the numbers refer to before you know how much they’re worth. And switching from one reference to another – dollars to yen, for example – would change the amount considerably!

Similarly in Resolve, when you adjust a control, the reference for the numbers is the Timeline color space. If you change the color space the same numbers will give a different result.

Here’s an example with some Log footage. The two images below have exactly the same correction applied to them: Gain reduced to 0.40. On the left, the Timeline color space is Rec709 2.4. On the right, it’s been changed to Rec709 (Scene), which has a lighter gamma of about 1.9. The grade is the same but the result is very different.

Left: Timeline color space Rec709 2.4. Right: Timeline color space Rec709 (Scene). Same grade applied to both.
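A back-of-envelope calculation shows why the same Gain number lands so differently in the two working spaces (a pure power-law approximation – real Rec709 transfer functions also have a linear toe):

```python
# Multiplying a gamma-encoded value by a gain changes the underlying
# linear light by gain ** gamma - a bigger drop the heavier the gamma.
# (Power-law approximation only; not Resolve's exact internal math.)

def linear_effect_of_gain(gain, gamma):
    """Linear-light multiplier implied by a gain applied in gamma-encoded space."""
    return gain ** gamma

print(round(linear_effect_of_gain(0.40, 2.4), 3))  # ~0.111 in a 2.4 space
print(round(linear_effect_of_gain(0.40, 1.9), 3))  # ~0.175 in a ~1.9 space
```

The same 0.40 gain darkens the picture considerably more in the heavier 2.4 working space.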

The crucial point to take away from this is that you must never change the Timeline color space once you’ve started grading. If you do, it will alter every single grade you’ve made.

It also means that the Timeline color space will influence the way your grading controls behave and feel. You may have to push a contrast or adjust a color balance differently to achieve the same result in another color space.

So which Timeline color space should you choose? The obvious answer is to set it to the color space you’re delivering to: Rec709 2.4 for broadcast TV or P3-DCI for cinema.

Those two color spaces are interesting to compare as choices for a Timeline color space because they don’t have the same white point. The white point is the starting point for any changes you make to color, so if you grade in one of the two color spaces and then switch to the other, you’ll see a shift in the hues. This will be more or less pronounced depending on the amount and type of changes you’ve made to the grade.

Left: Timeline color space Rec709 2.4. Right: Timeline color space P3-DCI. Same grade applied to both.

Resolve’s default color space

Resolve needs a color space to function, even when Color Management is turned off. All media has a color space of some sort, the Timeline controls need a reference, and the Output color space has to be something. Resolve’s default color space when Color Management is off is Rec709 2.4 for everything: Input, Timeline and Output. That’s also what’s assigned by default to all three when you turn Color Management on. It’s also the default for all imported media in version 15.

That means that Color Management is “transparent” when you first turn it on. It won’t alter anything you’ve done so far. You can activate it at any time to use the Output color space for a new display or to export to different versions. You can also activate it and assign an input color space to just a few clips. Mix and match as you wish, with some clips color managed and others using LUTs.

To sum up

After the detailed explanations, here’s a summary of the standard set-up:

  • In Project Settings > Color Management, switch Color Science to DaVinci YRGB Color Managed
  • Leave Input color space set to Rec709 2.4, set the Timeline to Rec709 2.4 for broadcast TV or to P3-DCI for cinema, and the Output to the color space of your calibrated display. Save settings.
  • In the Media Pool, select all clips with the same color space, right-click and assign the correct color space in Input Color Space. Repeat for clips with a different color space. Do nothing for clips with a native Rec709 2.4 color space (the usual standard for any clips that aren’t Log).

And that’s it. Well, almost. The result you get is the one shown at the beginning of this article. It doesn’t always look good “out of the box”, so that’s what we need to fix next, and it’s the subject of the second article. It’s surprisingly easy, by the way!

 

A footnote to clarify technical terms for color spaces

My explanation of color spaces in this article uses the terms “colors” and “contrasts” to simplify things. By “range of colors”, what I mean is the range of hues and saturations, also known as gamut. By “range of contrasts” I mean the range of light from shadows to highlights that a camera can capture without over or underexposing (a.k.a. Dynamic Range) as well as the type of contrast applied to it. In color management, this is generally referred to as gamma.

Color space is both gamut and gamma, but whenever Resolve separates the two with individual controls for each one, it labels them “color space” and “gamma” respectively. So in these cases, “color space” means gamut only. This is fairly common practice, by the way. A lot of articles and tutorials use the term “color space” to mean gamut only and treat gamma as something different.

Other articles in the series Solutions to Resolve:


A French version of this article can be found on the author’s website, www.pixelupdate.com at the following link: http://www.pixelupdate.com/color_management_1/


Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! https://www.provideocoalition.com/solutions-to-resolve-part-4-dont-clip-your-proxies-and-transcodes/ Mon, 29 Oct 2018 12:30:16 +0000

Articles in this series:

Full data and video levels in DaVinci Resolve – don’t clip your proxies and transcodes!

This article is about the difference between Full data and Video levels and how Resolve’s video pipeline deals with them. Most of the time this is an automatic process and there’s nothing to do, but there can be problems with clipping when you’re not grading with Resolve but just using it to transcode your rushes or to make proxies.

I’ll start with a quick explanation of Full and Video levels for those who find them confusing. If you’re familiar with this, skip down to the section on Resolve’s video pipeline.

Full and Video levels

There are two standards for digital images. One is for photos and graphics, the other is for video. Properly displayed, a picture will look identical in both standards. The difference is the way each of them defines what’s black and what’s white.

Photos and graphics (RGB files) use Full data levels. If you look at this on a waveform monitor, you’ll see the luminance takes up the full digital data range available. Black is 0 and white is 255 for an 8-bit file (0 to 1023 for a 10-bit file). (Technically, black is in fact 4 in both cases, since the values from 0 to 3 are reserved for other data.)

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 42

Video (usually YCbCr files, aka YUV) uses the same data range, but sets the level for black at 16 and white at 235 (64 and 940 respectively for a 10-bit file). This leaves some headroom at the top and the bottom in case the lighting changes during a shot and the signal spills over the limits. Anything below 16 (64 in 10-bit) is known as “sub-black” and anything above 235 (940 in 10-bit) is known as “super-white”.

(Note: Video levels are sometimes referred to as “Legal” or “Rec709” levels in other applications)

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 43

Dealing with these two standards is usually straightforward. The software will interpret files with a codec for still images (jpg, png, dpx, etc) as Full levels and files with a video codec (H264, ProRes422, DNxHD etc.) as Video levels, and it will display the blacks and whites accordingly. That’s why an image will appear identical in each of the two standards.
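Under the hood, the mapping between the two standards is a simple linear rescale. Here’s a quick Python sketch of the 8-bit case (the helper names are mine):

```python
# 8-bit levels: full range spans 0-255, video range spans 16-235 (219 steps).
def full_to_video(code):
    """Map an 8-bit full-range code value into video (legal) range."""
    return round(16 + code * 219 / 255)

def video_to_full(code):
    """Map an 8-bit video-range code value into full range. Sub-blacks
    (below 16) and super-whites (above 235) land outside 0-255."""
    return round((code - 16) * 255 / 219)

print(full_to_video(0), full_to_video(255))   # 16 235
print(video_to_full(16), video_to_full(235))  # 0 255
print(video_to_full(245))                     # a super-white maps past 255
```

Notice that a video-range super-white has nowhere legal to go in full range — that’s the seed of the clipping problem described below in Resolve’s pipeline.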

Resolve’s video pipeline

Most video editing applications use Video levels as their working space. The signal of a video file remains the same from import to timeline to export. All the information, including the super-whites and sub-blacks are conserved throughout the process. If you import a file with Full data levels, the NLE will automatically convert these to Video levels, lowering the whites from 255 to 235 and raising the blacks from 0 to 16. Nothing is clipped.

DaVinci Resolve, on the other hand, uses Full data levels as its working space. It converts media with Video levels to Full data levels in the timeline, and then back to Video levels when exporting on the Delivery page.

As this is an unfamiliar way of working, it’s worth going through the process in detail. Think of it as a pipeline with three stages: the input (clips in the Media Pool), the working space (Edit page, Color page, etc.) and output (Delivery).

In the Media Pool, input is controlled in “Clip Attributes”: “Auto” will assign Full or Video levels depending on the clip file’s codec, as explained above. “Full” will force the clip to Full levels and “Video” to Video levels.

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 44
Media Pool, Clip Attributes

 

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 45
Delivery page, Advanced Settings

There are similar settings for output on the Delivery page in the Advanced Settings of the Video tab. Again, “Auto” will choose the appropriate standard for the video codec you’ve chosen for export, “Full” and “Video” will force Resolve to use one or the other, whatever the codec.

The working space can’t be changed; it always uses Full data levels.

Using default “Auto” in Clip Attributes and Delivery

These are the various combinations Resolve applies automatically with the default “Auto” setting in both Clip Attributes and the Delivery page:

Note: The waveform illustrations below are essentially diagrams to show what’s happening. You won’t actually see the waveform changing between pages in Resolve. Resolve’s scopes will always display the working space, whatever page you’re on.

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 46
For example, a jpg sequence to a dpx sequence (both are Full codecs)
Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 47
For example, an H264 video to a ProRes422 video (both are Video codecs)
Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 48
For example, a jpg sequence to a ProRes422 video
Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 49
For example, an H264 video to a dpx sequence

Usually this works fine, but there’s a catch with this system when video files have super-whites and sub-blacks (shown here in red).

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 50

The super-whites and sub-blacks are pushed outside the limits of Resolve’s working space, so they’ll be clipped on Delivery if you don’t do anything about them.

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 51
Video levels sub-blacks and super-whites clipped by Resolve’s working space
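To put numbers on the clipping, here’s a sketch of the Video → Full → Video round trip (my own illustrative maths, not Resolve’s exact rounding): a legal level survives intact, while a super-white is flattened to plain white.

```python
def full_to_video(code):
    return round(16 + code * 219 / 255)

def video_to_full_in_working_space(code):
    # Video -> Full conversion, clipped to the 0-255 limits of a
    # full-range working space (illustrative maths only).
    return min(255, max(0, round((code - 16) * 255 / 219)))

legal = full_to_video(video_to_full_in_working_space(200))
superwhite = full_to_video(video_to_full_in_working_space(245))
print(legal)       # 200 - a legal level round-trips intact
print(superwhite)  # 235 - the super-white detail is flattened to white
```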

When you’re grading, that’s not a problem. You’ll notice the whites are clipping on the scopes and lower the gain to bring them back; the same goes for lifting the blacks.

But if you’re not grading and simply using Resolve to transcode or to generate proxies, it is a problem. The media you deliver will be clipped if you leave the settings on Auto.

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 52

The work-around for this before Resolve 15 was to select all your clips in the Media Pool and manually set them to Full in Clip Attributes and then force the Advanced Settings in Delivery to Full as well. That way, the signal goes through the pipeline with no changes (Full to Full to Full) and the rendered clips have the same levels as the original media with no clipping.

Resolve 15 has mostly fixed the issue with a new check box in the Video > Advanced Settings of the Delivery page: “Retain sub-black and super-white data”.

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 53
Video > Advanced Settings

When this is active, the whole process works as it should: Clip Attributes and Delivery can both be left on Auto, and the super-whites and sub-blacks will be preserved.

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 54
Video levels export with sub-blacks and super-whites preserved

But this option depends on whether the codec you choose for export uses Full or Video levels. It’s available for delivering to Video level codecs but grays out with Full level codecs. So if you want to export a video file with super-whites to a dpx sequence for the vfx team, for example, the super-whites will be clipped.

Solutions to Resolve Part 4: Don’t clip your proxies and transcodes! 55
“Retain sub-black and super-white data” unavailable on export to a Full data codec

In fact, it’s right that the option should be grayed out for this. There is no headroom for super-whites and sub-blacks in Full data, so they’ll be clipped by definition. The work-around is, again, to set the Clip Attributes for the media to Full, then nothing will be lost in the pipeline. However, the images in the dpx sequence will lose contrast since the white levels have been pushed down and the black levels have been pushed up. You have a Video levels signal in a file that will be read as Full levels, and you’ll need to warn the vfx team about this. Alternatively you can leave the Clip Attributes on Auto and grade the clip to correct the highlights and shadows before exporting.
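The contrast loss is easy to quantify (quick illustrative arithmetic):

```python
# A Video-levels signal written untouched into a file that readers
# interpret as Full range: black sits at code 16 and white at code 235
# of a 0-255 scale.
black, white = 16 / 255, 235 / 255
print(round(black, 3), round(white, 3))
# ~0.063 and ~0.922 instead of 0.0 and 1.0 - blacks lifted, whites
# dimmed, until the file is reinterpreted as Video levels downstream.
```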

So whatever strategy you choose, the fact that the option is grayed out warns you that the codec you’ve selected isn’t designed for super-whites and sub-blacks; you have to manage them yourself.

In Resolve 15, Media Management has also been updated in this respect. The options Move and Copy don’t clip super-whites but Transcode used to. There’s now the same check box for “Retain sub-black and super-white data” and it also grays out with Full data level codecs.

The essentials in working practice

This has been a long article, so for those who want simple, practical conclusions, they are:

  • If you’re grading, you manage clipping yourself, so carry on as usual. Don’t activate the option to “Retain sub-black and super-white data” because sub-blacks and super-whites are not broadcast legal and should be clipped if you’re delivering a final Master.
  • If you’re delivering proxies or transcoding rushes without grading and the option “Retain sub-black and super-white data” is available, activate it. If it’s grayed out, either change the delivery codec or manage the sub-blacks and super-whites yourself, as suggested above.
  • Whatever application you use, not just in Resolve, you have to manage sub-blacks and super-whites yourself if you’re transcoding from a Video levels file to a Full levels codec.

A final observation: In the ever-expanding jungle of video codecs, it’s not easy for software to automatically detect whether the file has Full or Video data levels. Not all RGB (444) files have Full levels, not all YUV (422, 420) files have Video levels. NLEs do a remarkable job, but sometimes make mistakes. A picture that lacks contrast with the blacks unexpectedly high and the whites unexpectedly low could be a Video levels file being interpreted as Full. A shot with crushed blacks and clipped whites could be a Full levels file interpreted as Video. Both errors are easy to correct in Resolve’s Clip Attributes. It can get more complicated in other applications.

 

A French version of this article can be found on the author’s website, www.pixelupdate.com at the following link: http://www.pixelupdate.com/data-levels/

 

Solutions to Resolve Part 3: Unexpected clipping when grading https://www.provideocoalition.com/solutions-to-resolve-part-3-unexpected-clipping-when-grading/ Mon, 22 Oct 2018 07:01:39 +0000

Articles in this series:

Unexpected clipping in DaVinci Resolve and how to avoid it!

One of the great features of Resolve is that it works in 32-bit float. This means it can cope with a virtually limitless dynamic range and, most importantly, it will conserve that range however much you raise or lower the video levels from one node to another.

Here’s a shot where a correction on node 2 has burned out the highlights and crushed the blacks. But nothing has actually been lost. If you bring down the whites and lift the blacks on a later node, all the details return. Nothing has been clipped.

 

Solutions to Resolve Part 3: Unexpected clipping when grading 58
Raised whites and lowered blacks in node 2

 

Solutions to Resolve Part 3: Unexpected clipping when grading 59
Restored whites and blacks in node 3

This works so well in Resolve that it’s easy to get comfortable and assume that however much you torture your signal from one node to the next, you’ll never lose anything. But there are times when Resolve does clip, and you may not notice when it happens.

The first situation is caching. You can activate caching in Resolve for anything that doesn’t run in real time on your machine, effects that require a lot of processing power like noise reduction or RAW video formats. When you activate caching, Resolve creates a new video file with the processing already done and stores it in the CacheClip folder on your main hard drive. From then on, it reads that file rather than the original media when you play the clip back or add further corrections downstream.

The problem is that, by default, when Resolve makes that new video file, it discards any part of the signal outside of what you can see on your Waveform and RGB Parade scopes. It clips. In the example above, if I cache node 2 with the burnt highlights (a node’s name turns blue when it’s cached), the whites and blacks will be clipped when I try to correct the levels on the following node.

Solutions to Resolve Part 3: Unexpected clipping when grading 60
After caching node 2 – clipped whites and blacks in node 3
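Here’s a toy model of what an integer cache does to the float pipeline (my own sketch; real cache codecs are more sophisticated, but the quantize-and-clip step between nodes is the culprit):

```python
def grade_float(x, gain_up, gain_down):
    # 32-bit float pipeline: out-of-range values survive between nodes
    return x * gain_up * gain_down

def grade_cached_8bit(x, gain_up, gain_down):
    # Node cached to an 8-bit file: anything pushed past 1.0 is clipped
    stored = min(255, round(x * gain_up * 255))
    return (stored / 255) * gain_down

highlight = 0.9
print(grade_float(highlight, 2.0, 0.5))        # 0.9 - detail restored
print(grade_cached_8bit(highlight, 2.0, 0.5))  # 0.5 - clipped at the cache
```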

There is a solution. If you go into your Project Settings > Master Settings > Render Cache Format and change the codec to one of the codecs marked HDR (High Dynamic Range), the cached video files no longer clip. You recover your details with no problem at all.

Solutions to Resolve Part 3: Unexpected clipping when grading 61

Solutions to Resolve Part 3: Unexpected clipping when grading 62
After HDR caching node 2 – restored whites and blacks in node 3

Only codecs marked HDR allow this; all of the other codecs in the Render Cache Format will clip. This is something you have to take care of yourself because, by default, the cache codec in a new project is ProRes422 HQ on a Mac and DNxHR HQX on a PC. Neither is labeled HDR, so both of them clip. (This is also true for the Optimized Media codecs.)

The downside of this is of course that the HDR codecs are heavy codecs and will take up a lot more space on your disk than the lighter, non-HDR codecs. The CacheClip folder will grow quickly, so make sure it’s on a large, fast drive. Also this is not something you can manage by switching between HDR and non-HDR codecs as needed during your grade. Every time you change the codec in Project Settings, Resolve will re-cache every instance of caching in your timeline. Personally I set this to an HDR codec whatever the project so I can forget about it – there are too many other things to concentrate on when you’re grading!

The second situation when Resolve clips unexpectedly is with Soft Clip. And with a name like that, this shouldn’t really be unexpected! These controls are at the bottom right of the Custom Curves palette. The L.S. (low soft) and H.S. (high soft) are popular tools to compress dark shadows and bright highlights respectively. The Low and High controls just above them set the level at which the compression starts.

Solutions to Resolve Part 3: Unexpected clipping when grading 63

Activating any of those four controls clips the signal. It doesn’t matter how strongly you apply them: any value other than the default will clip. And it’s not a question of clipping just the blacks or just the whites; both will be clipped.

A typical situation: in the following example, I’ve used High Soft (H.S.) to squeeze a few more details out of the highlights. With my attention focused on the highlights, I haven’t noticed that my contrast adjustment in node 1 was too extreme and has pushed the blacks below 0.

Solutions to Resolve Part 3: Unexpected clipping when grading 64
Crushed blacks in node 1, High Soft (H.S.) only in node 2

When I try to restore the blacks on a later node, they’ve been clipped.

Solutions to Resolve Part 3: Unexpected clipping when grading 65
Lifted blacks in node 3 clipped by High Soft in node 2

So Soft Clip is really the last thing to do in your grade, a final touch for the last node. It can also be useful as a Timeline rather than a Clip grade, to give a softer fall-off to the extreme highlights and shadows in your film at the very end of the grade pipeline.

One final word: none of this matters if there’s nothing to clip. If your entire signal fits nicely into the limits of your Waveform and RGB Parade scopes (don’t forget the latter; the Parade channels clip more often than the Waveform luma), there’s no information to lose when you cache.

 

A French version of this article can be found on the author’s website, www.pixelupdate.com at the following link: http://www.pixelupdate.com/clipping/

Solutions to Resolve Part 2: Accurate shot match in ColorTrace https://www.provideocoalition.com/solutions-to-resolve-part-2-accurate-shot-match-in-colortracetm/ Mon, 15 Oct 2018 15:00:11 +0000

Articles in this series:

The key to accurate shot match in DaVinci Resolve’s Color Trace

ColorTrace™ is a tool in DaVinci Resolve to copy color grades from one timeline to another. It’s often used when a sequence has been graded and the grades then need to be transferred to different versions, perhaps a trailer or an updated cut of the sequence itself.

There are two tabs in the ColorTrace interface, an Automatic mode where ColorTrace tries to match up clips that correspond between the two timelines, and a Manual mode where you do the matching yourself. The problem is that the Automatic mode won’t match clips well if your settings are wrong. This article is about getting it to work properly.

Solutions to Resolve Part 2: Accurate shot match in ColorTrace 72

To access ColorTrace in Resolve 15, load your target timeline (the one you want grades copied to) in the Edit page. Then right-click the icon for the target timeline in the Media Pool and choose Timelines > Color Trace > Color Trace from Timeline. Choose the source timeline (the one with the grades you want to copy from) in the windows that follow. (Note: Save your project before you do this, ColorTrace can have problems accessing timelines in a project with changes that haven’t been saved.)

ColorTrace’s Automatic mode comes up by default. Clips with a match are outlined in green, those with no match in orange, and clips with several possible matches are outlined in blue. When you select a blue clip, the matching source clips appear above and you can choose the correct one with a double-click.

The problem is that ColorTrace doesn’t always match clips correctly. In this example of a blue clip, ColorTrace’s default choice (outlined in purple) is not the right clip.

Solutions to Resolve Part 2: Accurate shot match in ColorTrace 73

The mystery thickens when you look at green clip #3 – green because it’s supposed to have a single, correct match – and see that ColorTrace has matched it up with a completely different shot. What’s going on?

Solutions to Resolve Part 2: Accurate shot match in ColorTrace 74

Just below the Target Timeline, there’s a box where you can compare information about the source and destination clips. The two clips here have completely different names. So clearly, ColorTrace takes no account of the file name when it makes a match.

Solutions to Resolve Part 2: Accurate shot match in ColorTrace 75

In fact, the only criterion it’s using here is timecode, and even then, not exact timecode. The simple fact that the two timecodes partially overlap is enough for ColorTrace to decide it has a match (which is logical, as edit points and durations for clips may well differ between versions of a sequence).

There’s one set of data at the top of the information box that’s completely missing, and that’s the Reel name. This is the key to the problem. ColorTrace was designed for film and tape workflows where a clip is identified by its reel and timecode. It has not (yet?) adapted to a file-based workflow where the main identifier for a clip is the file name.
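The matching rule can be sketched like this (my own reconstruction of the behaviour described above, with made-up clip data, not ColorTrace’s actual code):

```python
def overlaps(a, b):
    # half-open (start, end) timecode ranges in frames
    return a[0] < b[1] and b[0] < a[1]

def is_match(src, dst, use_reel_names):
    # with reel names missing, any timecode overlap counts as a match;
    # with reel names populated, both criteria must agree
    if use_reel_names and src["reel"] != dst["reel"]:
        return False
    return overlaps(src["tc"], dst["tc"])

a = {"reel": "A001_C003", "tc": (1000, 1100)}
b = {"reel": "B002_C007", "tc": (1050, 1150)}   # different shot, overlapping TC

print(is_match(a, b, use_reel_names=False))  # True  - the false match
print(is_match(a, b, use_reel_names=True))   # False - caught by reel names
```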

To solve the problem, you need to get those file names copied to the reel names. You can do that in Project Settings > General Options. The setting to activate here is Conform Options > Assist using reel names from the > Source clip filename. As soon as you do that, the reel metadata for the clip populates with the file name. You can check this by looking at the reel column in the Media Pool set to list view.

Solutions to Resolve Part 2: Accurate shot match in ColorTrace 76

You can change these settings at any time, by the way, not just when you begin a project. Simply exit ColorTrace, correct the settings, save the project and then go back into ColorTrace.

Now the errors have gone. ColorTrace is matching both reel names and timecode. What previously appeared as a blue clip (#4) has now been correctly identified, and the wrongly matched green clip (#3) is now orange, showing correctly that there is no corresponding clip in the source timeline.

Solutions to Resolve Part 2: Accurate shot match in ColorTrace 77

With the Automatic mode now working properly, green clips should be reliable and you should only have to deal with the blue clips (these will be two or more clips taken from the same media file) before pasting the grades to your target timeline.

One last tip: sometimes the thumbnails in ColorTrace show up grey instead of showing an image of the shot. To correct this, go to the Color page, right-click any thumbnail and choose “Update all thumbnails”. When you go back to ColorTrace, they should display correctly.

 

A French version of this article can be found on the author’s website, www.pixelupdate.com at the following link: http://www.pixelupdate.com/color-trace/

 

Solutions to Resolve Part 1: Precise sharpening and noise reduction https://www.provideocoalition.com/solutions-to-resolve-part-1-precise-sharpening-and-noise-reduction/ Mon, 08 Oct 2018 15:00:58 +0000

This short series of articles is about various aspects of Resolve that are important but not as well known as they should be. They don’t often come up in tutorials, and I’ve discovered in the course of my teaching that even some experienced editors and colorists aren’t aware of them.

The four articles are a bit of a mixed bag: some of them are working tips, others are more fundamental to how Resolve works. Each is independent, with no relation to the others. I’ve tried to provide enough background to make them clear for beginners as well as professionals, so pick, skip and choose whatever you find useful!

I work for Blackmagic as a Master Trainer, but Blackmagic hasn’t commissioned these articles, they’re my own initiative.

The four articles are:

The key to precise sharpening and noise reduction in DaVinci Resolve

Sharpening can be tricky to get right. You rarely want to sharpen all the details in a picture, just some of them – the eyes, for example, but not the skin.

Solutions to Resolve Part 1: Precise sharpening and noise reduction 82

Resolve has tools to do this, the “Level” and “Coring Softness” controls at the bottom of the sharpening palette. “Level” is a threshold for details. Raise it to exclude finer details and keep only the larger ones. “Coring Softness” will create a soft transition around the threshold rather than an abrupt cut-off.

Solutions to Resolve Part 1: Precise sharpening and noise reduction 83
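The thresholding idea can be sketched as a toy 1-D unsharp mask (the parameter names mirror the controls, but the maths is my own, not Resolve’s implementation):

```python
def sharpen(signal, amount, level, softness):
    """Boost local detail, skipping details smaller than `level` and
    ramping in smoothly over `softness` above it."""
    out = [signal[0]]
    for i in range(1, len(signal) - 1):
        detail = signal[i] - (signal[i - 1] + signal[i + 1]) / 2
        m = abs(detail)
        if m <= level:
            weight = 0.0                      # fine detail (skin): excluded
        elif m >= level + softness:
            weight = 1.0                      # strong detail (eyes): sharpened
        else:
            weight = (m - level) / softness   # soft transition (coring)
        out.append(signal[i] + amount * weight * detail)
    out.append(signal[-1])
    return out

res = sharpen([0.5, 0.52, 0.5, 0.9, 0.5], amount=0.5, level=0.05, softness=0.1)
print(res)  # the tiny 0.52 bump is untouched; the big 0.9 edge is boosted
```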

The problem is that it’s never easy to see what you’re doing with these controls and to get the balance between Level and Coring right. That’s where the Highlight A/B mode makes all the difference.

Add some sharpness by bringing the Radius control down a little, then activate the Highlight button at the top left of the viewer and switch to A/B mode at the top right. Now you can actually see what you’re doing: anything that’s neutral grey is not being affected; anything that stands out with contrast is.

Solutions to Resolve Part 1: Precise sharpening and noise reduction 84
Coring Softness: 0.00                    Level: 0.00
Solutions to Resolve Part 1: Precise sharpening and noise reduction 85
Coring Softness: 30.00                    Level: 43.00

Once you’ve found the right balance between Level and Coring, leave the Highlight mode and watch the image to decide how much Radius and Scaling to use. Highlight A/B will show you what you’re affecting and where to look, but it’s no substitute for your eyes when deciding how much sharpening to apply!

So what exactly does the A/B mode do? It shows you the difference the corrections in a node are making to the picture. If there are no corrections, the screen will be a uniform grey, but as soon as you add a correction, this will stand out against the grey background.
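In effect, the viewer becomes a difference image centred on mid-grey. A minimal sketch of the idea (illustrative only, not Resolve’s internals):

```python
def ab_view(original, graded):
    # mid-grey plus the per-pixel difference a node introduces,
    # clamped to display range; untouched pixels sit at 0.5
    return [max(0.0, min(1.0, 0.5 + g - o)) for o, g in zip(original, graded)]

orig   = [0.25, 0.5, 0.75]
graded = [0.25, 0.625, 0.75]   # only the midtone pixel was corrected
print(ab_view(orig, graded))   # [0.5, 0.625, 0.5]
```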

A/B is particularly useful whenever you have to set a threshold for an effect. Take spatial noise reduction, for example. Typically a colorist will raise the luma and chroma thresholds until the noise she or he wants to remove from the picture disappears. But that level may also be affecting other parts of the image that should be left alone. In this shot of the Supertrees in Singapore, the aim is to reduce noise in the shadows without affecting the details of the trees. The A/B mode will show you if the threshold is set too high.

Solutions to Resolve Part 1: Precise sharpening and noise reduction 86

Solutions to Resolve Part 1: Precise sharpening and noise reduction 87
Luma & chroma thresholds too high
Solutions to Resolve Part 1: Precise sharpening and noise reduction 88
Luma & chroma thresholds corrected

The A/B mode is useful too with temporal noise reduction. Temporal noise reduction works by comparing frames before and after the frame you’re on. If a pixel is different in the next and previous frames, there’s a good chance that it’s noise, and if it’s the same it probably isn’t. That of course becomes difficult if things in the picture move! So temporal noise reduction has several controls to identify movement and exclude it from the analysis: Motion Estimation Type, Motion Range and Motion.
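A naive version of that logic might look like this (purely illustrative; this is not Resolve’s algorithm, and the frame data below is made up):

```python
# Average a pixel with its neighbours in time only where it barely
# changes between frames (likely noise); leave moving pixels alone.
def temporal_nr(prev, cur, nxt, motion_threshold):
    out = []
    for p, c, n in zip(prev, cur, nxt):
        if abs(c - p) < motion_threshold and abs(c - n) < motion_threshold:
            out.append((p + c + n) / 3)   # static pixel: smooth the flicker
        else:
            out.append(c)                 # moving pixel: keep untouched
    return out

frames = ([0.50, 0.10], [0.52, 0.60], [0.50, 0.90])  # pixel 2 is in motion
denoised = temporal_nr(*frames, motion_threshold=0.05)
print(denoised)  # pixel 1 is averaged toward 0.507; pixel 2 stays at 0.6
```

The motion controls in Resolve exist precisely to get that static/moving decision right; the A/B mode lets you see where the decision is landing.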

The A/B mode helps to decide how to set these controls. You can use it to compare different settings to see which one eliminates movement most effectively for your clip. This example compares no motion detection (Motion Estimation Type set to “None”) with Motion Estimation Type set to “Faster” and two variants of Motion Range: “Small” and “Large”.

Solutions to Resolve Part 1: Precise sharpening and noise reduction 89

Solutions to Resolve Part 1: Precise sharpening and noise reduction 90
Motion Estimation Type: None
Solutions to Resolve Part 1: Precise sharpening and noise reduction 91
Mo. Est. Type: Faster                     Motion Range: Small
Solutions to Resolve Part 1: Precise sharpening and noise reduction 92
Mo. Est. Type: Faster                     Motion Range: Large

As with sharpness, the A/B mode is not going to tell you exactly how much noise reduction to apply. It may in fact be fine to affect some picture details and motion; only the picture itself can tell you that. What the A/B mode does is help you control what you’re doing, and show you where to look and what to look for while you’re adjusting the levels.

There are other instances in Resolve when the Highlight A/B mode comes in handy. This could be to set the ranges for Shadow/Midtone/Highlight in Log, to check what’s being affected by the hue-hue curves, or to finely adjust a Resolve FX like Glow. In general, whenever there’s a threshold you’re not sure how to adjust, the A/B mode could well be your answer!

 

A French version of this article can be found on the author’s website, www.pixelupdate.com at the following link: http://www.pixelupdate.com/a-b_mode/

 
