With High Dynamic Range and Wide Colour Gamut videos around the corner, never has colour science been such an integral part of the Post-Production discussions.
Colour grading tools are now available to everyone, and every filmmaker is paying attention to the ‘Look’ of their film.
Editor, colourist and guest writer Julien Chichignoud caught up with Lars Borg, Principal Color Scientist at Adobe, during the recent SMPTE conference in Sydney, to talk about his role at Adobe and get a glimpse of the future of colour grading for colourists and everyday users.
This is the second half of our series on the future of Adobe and what that means for indie filmmakers.
These days, it seems like the trending buzzword in colour correction is ‘Look’. What makes the ‘Look’ of a film and why is it such an identifying feature?
Once upon a time I had no idea what a look was. I don’t think there is a good definition for it but I’m now comparing it to the music of the film: it’s there to create extra drama.
One of the classic examples of music as a look is Wagner’s Ring of the Nibelung operas, where you know which character is coming because the type of music changes. It’s the same with cinematographic looks: they try to create a feeling.
You can tell something terrible is going to happen in a scene if it looks darker than the previous one. The gangsters are always in a dark look, the poor unsuspecting victims are in a bright, blissful look, and so on. This is all about creativity, and the challenge is to preserve that creative look through the rest of the pipeline. That’s where colour management comes in.
I’m dividing it into three categories: Colour Correction, which is the technical step to get the cameras to look the same and be accurate; Creative Look, which is the artistic stage that gives a film its specific tone; and Colour Management, which is about preserving colour decisions throughout the rest of the pipeline.
How did Adobe Hue come about?
We were looking for a more intuitive way to do colour correction. A lot of our users don’t understand RGB sliders, waveforms and vectorscopes. You shouldn’t have to be a colour scientist to say “oh, that needs to have more red over there”.
It’s an interesting challenge to come up with colour grading tools for the novice. Hopefully we’ve done some of that, and it will be fun for us to see what more we can do with this bubble chart.
At the moment we’re just using it at the capture stage, but we may be able to use it in other ways…
Those bubbles… How do they work?
The vertical axis represents luminance, the size of each bubble represents the amount of that colour, and the placement visualises the hue/saturation space.
We preserve the luminance and only change the saturation, and the bubble selection is a colour shift for the midtones. For example, picking an orange bubble shifts the midtones towards orange, and its effect progressively fades out in the highlights and shadows to avoid creating artifacts.
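The behaviour Borg describes — a colour shift that peaks in the midtones and fades out towards shadows and highlights — can be sketched in a few lines. This is an illustrative approximation, not Adobe’s actual implementation; the function name, the parabolic midtone weight and the `strength` parameter are all assumptions.

```python
import numpy as np

def midtone_color_shift(rgb, shift, strength=0.25):
    """Shift an image towards `shift` (an RGB offset), weighted so the
    effect peaks in the midtones and fades out in shadows and highlights.
    Illustrative only -- not Adobe's implementation.

    rgb: float array in [0, 1], shape (..., 3).
    """
    # Approximate luminance using Rec. 709 luma coefficients
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    # Parabolic weight: 0 at black and white, 1 at mid-grey
    weight = 4.0 * luma * (1.0 - luma)
    shifted = rgb + strength * weight[..., None] * np.asarray(shift)
    return np.clip(shifted, 0.0, 1.0)

# Example: push midtones towards orange on a mid-grey patch
img = np.full((2, 2, 3), 0.5)
warm = midtone_color_shift(img, shift=(1.0, 0.4, -0.6))
```

Because the weight goes to zero at both ends of the luminance range, pure blacks and whites are left untouched, matching the artifact-avoidance behaviour described above.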
How does Hue work for flat/log images if there is no change to the luminance?
This is not for a colour managed pipeline at this time. This is really designed for an sRGB device like the iPad. Our typical users are shooting Rec709 video, and it works for that purpose.
At the moment, the colour pipeline in Premiere works well but it’s a black box with no possibility for the user to change settings…
True. Right now Premiere is colour agnostic. It was okay when everything we did was Rec709, but now with UHD and the wider colour gamut, you are going to have to switch between the two.
I was involved in the colour pipeline for After Effects, where there is more colour management. It detects your display profile and compensates for that. I want to get Premiere to where After Effects and Photoshop are, but it will take time.
It’s a challenge because the typical user doesn’t know anything about colour management and doesn’t want to know about it. In After Effects, the assumption is that if you go into another colour space, you know what you’re doing. But the novice user might be in for a surprise.
We want to simplify and automate colour management without, at the same time, creating something clumsy for the professional. We’re still trying to figure out the best way to satisfy all users, which is hard.
Part of the technology and specs of Rec2020 and HDR is to work out how to ‘fit’ that gamut into specific devices that don’t actually display the full gamut. How does this work creatively when you’re grading something for different types of viewing environment at the same time?
Going from HDR to SDR isn’t easy to do without creative support because you need to know what the look is supposed to achieve, and what elements of the image we want to preserve. You might want to save the shadow details, or the highlights. Right now there is no artificial intelligence in the HDR to SDR converters that can understand the story and understand what’s important to preserve in the shot.
Similarly, if you go from a wide colour gamut to a smaller gamut, it could be a problem because you don’t want to lose the details in the saturated colours if they are important to you, but it all depends on the context of a specific shot, which requires creative choices.
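The automatic part of an HDR-to-SDR down-conversion — before any creative trim pass — is typically a global tone curve. A minimal sketch using the extended Reinhard operator, where `white` marks the HDR luminance that should map to display white (the function name and default are assumptions, and, as Borg notes, a curve like this knows nothing about the shot’s content):

```python
import numpy as np

def reinhard_tonemap(luminance, white=4.0):
    """Extended Reinhard global tone curve: compresses HDR luminance
    into [0, 1], mapping `white` to 1.0. A blunt instrument next to a
    colourist's trim pass -- it can't know which details matter."""
    l = np.asarray(luminance, dtype=float)
    out = l * (1.0 + l / (white * white)) / (1.0 + l)
    return np.clip(out, 0.0, 1.0)

# Shadows pass through almost unchanged; highlights roll off smoothly
# and anything above `white` clips to 1.0.
print(reinhard_tonemap(np.array([0.1, 1.0, 4.0, 10.0])))
```

A content-aware converter would instead choose the curve (or its parameters) per shot — preserving shadow detail in one scene, highlight detail in another — which is exactly the creative judgement Borg says current converters lack.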
These decisions are now part of the colourist’s job, and they need to do a second pass for SDR and capture that as dynamic metadata. You want to put one HDR master out, and have the SDR grade embedded within. Part of my work at SMPTE is to figure out how to represent that metadata. There are at least four different proponents for different HDR standards, but the good thing is that all of this can be done after the creative grading. You can create one HDR master and one SDR master, and the metadata can be translated into each format.