Research | Colour Management

Linearised, Scene Referred Workflow
July 5th, 2018

Scene Referred Colour, Linearised Colour, LOG

One of the core defining factors of professional workflows is colour management. As a professional colourist, this is something I am all too aware of when managing various colour spaces and scaling them into the relevant display referred colour spaces.

 

To this end, for a VFX pipeline to be considered truly ‘professional’ it must not only be photoreal in its interpretation of light and colour, but also be able to scale accurately and reliably into the predesignated standards for acquisition, intermediate and display colour management. Before this project can truly explore the capabilities of Blender in regards to a professional VFX pipeline, I must first investigate its ability – or inability – to handle colour management.

  • Validity of Research 85%
  • Complexity of Content 95%
  • Benefit of Activity 100%
Above is an original render output from my last project, using the ‘Filmic’ Blender tone mapping option recently added to the Blender 2.79b build.

Secondary Research

BLENDER GURU

Having completed two larger projects using Blender, I was noticing some strange anomalies in all my renders: clipping in the highlights and crushed black tones in the shadows. I suspected that Blender had a limited dynamic range of EV (Exposure Value) stops, similar to that of lower-end consumer/prosumer cameras. All of this was undermining the perceived ‘realism’ of the final renders and limiting their ability to be integrated into a RAW/LOG colour management workflow within industry-level programmes such as DaVinci Resolve.

 

Firstly, my aim was to define whether Blender had the ability to handle a scene referred or linearised colour space, similar to Nuke and Maya, the industry-standard applications for CG/VFX asset creation and compositing.

 

Searching for the core terms I would typically associate with wide dynamic range and VFX – scene referred or linearised colour space – I found the following video from BlenderGuru.com.

Core findings

Blender default is ‘limited’

Blender sees just 8 stops by default, even though it works at a scene referred/linearised level prior to this clipping/crushing of dynamic range.

‘filmic blender’ is the answer

Troy Sobotka is an industry professional and camera enthusiast who saw the problem in Blender and wanted to fix it, so he coded Filmic Blender – a colour management configuration similar to ACES, the industry standard for VFX.

‘filmic blender’ has 25 stops

Filmic Blender not only gives the user 25 stops of dynamic range instead of the default 8 stops, but also provides accurate exposure desaturation. “This is to imitate an effect that’s unique to film: as exposure increases, colors will become more and more desaturated.”
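To see why this desaturation happens, below is a toy Python sketch (not the actual Filmic transform) that applies a simple Reinhard-style tone curve per channel to a saturated red at increasing exposures; as all three channels approach the top of the curve, the saturation falls away, which is the filmic behaviour described above.

```python
# Not the actual Filmic transform - just a toy, per-channel tone curve
# (Reinhard-style x / (x + 1)) to illustrate why filmic-type mappings
# desaturate colours as exposure rises.
import numpy as np

def tone_map(rgb):
    rgb = np.asarray(rgb, dtype=np.float32)
    return rgb / (rgb + 1.0)

def saturation(rgb):
    # crude saturation measure: (max - min) / max
    return (rgb.max() - rgb.min()) / max(rgb.max(), 1e-6)

base_red = np.array([0.9, 0.1, 0.05])             # a saturated scene-linear red
for stops in (0, 2, 4, 6):
    mapped = tone_map(base_red * 2 ** stops)      # push the exposure up
    print(stops, "stops over:", round(float(saturation(mapped)), 3))
# saturation falls steadily towards 0 as all channels approach the shoulder
```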

false colour

Troy Sobotka has also provided a False Colour visualisation, used by both colourists and cinematographers to accurately ascertain luminance values as they are mapped across the screen, allocating a particular hue to each EV stop of dynamic range.
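As a rough illustration of the principle (a hypothetical sketch, not Sobotka’s actual False Colour transform), the mapping can be thought of as measuring each pixel’s scene-linear luminance in EV stops from middle grey and assigning each stop its own hue:

```python
# Hypothetical sketch of a false colour mapping: bucket scene-linear luminance
# into EV stops around middle grey (0.18) and give each stop a distinct hue.
import colorsys
import math

def false_colour(luminance, middle_grey=0.18, stops_below=10, stops_above=6):
    ev = math.log2(max(luminance, 1e-10) / middle_grey)   # stops from middle grey
    ev = max(-stops_below, min(stops_above, ev))          # clamp to the displayed range
    stop_index = math.floor(ev)                           # one colour per stop
    fraction = (stop_index + stops_below) / (stops_below + stops_above)
    hue = 0.75 * (1.0 - fraction)                         # blue = dark, red = bright
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

# middle grey, a highlight 6 stops over, and a shadow 10 stops under
print(false_colour(0.18), false_colour(0.18 * 2 ** 6), false_colour(0.18 * 2 ** -10))
```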

Secondary Research

PAUL CHAMBERS 3D 

Following on from the findings of the previous study, it was clear that linearised, scene referred colour transforms/spaces are the only industry-standard way to reproduce ‘real world’ light and colour properties. Additionally, the Blender Guru video also illustrated that the work of Troy Sobotka and his ‘Filmic’ colour management allows artists to bypass the antiquated sRGB default colour space of the official Blender release and exploit the full scene referred colour management that has always been ‘under the hood’ of the software.

 

So, if my previous research set out to define whether Blender has a scene referred colour space and could remedy the clipping anomalies I was seeing in my previous projects, then, with that confirmed, the focus of my next search was to ascertain whether Filmic colour management could be applied to a VFX workflow. Much of the research and testing shown in the previous post focused purely on fully CG imagery, rather than a combination of live action and CG VFX.

 

Finding the work of Paul Chambers, it was clear that he was in the process of posing the same questions. A staunch advocate for Blender 3D within the professional VFX world, Paul was experimenting with Sobotka’s ‘Filmic’ colour management to better realise and manage ‘accurate’ colour and light in his work. The focus of his discussions centres less on Sobotka himself and more on the inspiration behind Sobotka’s work: the ACES colour management workflow. Of particular interest to me was the video of Alex Fry’s keynote speech at SIGGRAPH 2015, which discusses the colour management pipeline for “The Lego Movie”.

 

As a colourist I am more than familiar with the ACES colour management workflow for live action colour transforms and scaling, but this was the first time I had explicitly seen it applied to a CG pipeline, due to the film’s distinctive combination of CG, stop motion and live action photography. This video alone has helped to clarify the typical workflow that can be applied using an ACES colour management pipeline in NUKE, Maya, Resolve, etc., and, more importantly, has helped to frame and contextualise Sobotka’s work on his ‘Filmic’ colour management for Blender.

Core findings

ACES can handle beyond the visible spectrum

It seems that ACES is capable of storing data well beyond the roughly 21 stops of dynamic range that the eye can perceive. Filmic Blender seems very similar in this respect.
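For context on what those stop counts mean, each stop is a doubling of light, so the dynamic range in stops is simply log2 of the contrast ratio between the brightest and darkest usable values. A quick sanity check:

```python
import math

# Each EV stop is a doubling of light, so stops = log2(contrast ratio).
print(math.log2(2_000_000))    # a ~2,000,000:1 scene ratio is ~20.9 stops,
                               # roughly the range quoted for the eye
print(2 ** 25, 2 ** 30)        # 25 stops (Filmic Blender) and 30 stops (ACES)
                               # are ratios of ~33.5 million:1 and ~1.07 billion:1
```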

ACES COULD INSPIRE MY APPROACH TO BLENDER

If I can gain some defined consistency in the format of the footage coming into and out of Blender – in a known floating point format (linear .EXR) – I could then manage the scaling to any format in the colour grading process.
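As a sketch of that round-trip (assuming a Python environment with numpy and an EXR-capable imageio backend installed; this is not part of Blender itself), a 32-bit float OpenEXR happily stores scene-linear values well above 1.0, which is exactly the consistency I am after:

```python
# Sketch: write and re-read a scene-linear, 32-bit float OpenEXR and confirm
# that values above 1.0 (i.e. beyond display white) survive the round-trip.
import numpy as np
import imageio.v3 as iio

plate = np.full((4, 4, 3), 0.18, dtype=np.float32)   # a tiny middle-grey "plate"
plate[0, 0] = 0.18 * 2 ** 6                          # a highlight 6 stops over (11.52)

iio.imwrite("plate_linear.exr", plate)               # float EXR keeps the full range
roundtrip = iio.imread("plate_linear.exr")
print(roundtrip[0, 0])                               # still ~11.52 - nothing clipped
```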

ACES IS “STANDARDISED”

Even though ACES was the basis of Troy Sobotka’s Filmic Blender colour space, Blender does not seem to explicitly support a full ACES pipeline. That said, I could still take the predictability of the ACES workflow and create my own predictable results by working in a linear EXR colour space.

Secondary Research

SHOOT>DATA>POST.COM 

 

Continuing my research thread into colour management, it seems that the ACES pipeline may offer a blueprint/solution to an effective and reliable colour management workflow for VFX integration inside Blender 3D.

 

A detailed and complex post on the industry, practice-based website shoot>data>post.com summarises the complete ACES workflow used in Hollywood across a number of recent high-concept feature productions. The site not only has validity in its authorship, but a number of prominent engineers, colourists and VFX artists also modify, update and validate the legitimacy of each post – the ACES one being no exception. To this end, this is a highly trustworthy and valid source of information regarding an ACES colour pipeline.

Core findings

ACES IS DEVELOPED BY THE ACADEMY (AMPAS)

ACES WAS DEVELOPED TO SOLVE 2 PROBLEMS

 

ACES IS ‘LINEAR’

 

ACES CAN ENCODE THE ENTIRE ‘VISIBLE SPECTRUM’ (30 STOPS)

 

Primary Research

TRYING ‘FILMIC’ BLENDER 

 

In order to assess the relative impact of Filmic, and the possible scope of workflow issues inside Blender, I simply took an existing CG composite generated inside Blender for my last project and changed its colour management to ‘Filmic’.
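For reference, the same switch can be scripted from Blender’s Python console. A small sketch based on the 2.79-era API (exact look names depend on the build and colour config in use):

```python
# Run inside Blender (2.79-era API): switch the scene's colour management
# from the default sRGB view transform to Filmic.
import bpy

scene = bpy.context.scene
scene.display_settings.display_device = 'sRGB'         # the display we map down to
scene.view_settings.view_transform = 'Filmic'          # instead of 'Default'
scene.view_settings.look = 'Filmic - Base Contrast'    # look names vary by config
scene.view_settings.exposure = 0.0
```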

 

If you compare the BEFORE (original composite) and AFTER (Filmic applied) images in the exemplar image slider, there is a noticeable change in the quality of the highlights and shadows. Shadow details are preserved and the highlight details have a more refined and naturalistic fall-off. Additionally, saturation visibly degrades as luminance values increase; another example of a physically accurate interpretation of real-world values.

This said, it seems that colour management inside Blender is applied at the end of the compositing chain. As a result, the background plate – which has a display referred, 8 bit colour space baked within it – is negatively impacted by the same ‘Filmic’ adjustment: the linearised CG elements are correct, but the footage is pushed logarithmic in value. In the absence of any colour management nodes in the Blender compositor, there is no way to handle the background footage separately from the CG elements. To this end, the only logical option is to attempt to linearise the background footage prior to import into Blender.
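A minimal sketch of that pre-linearisation step (assuming the plate is plain 8-bit sRGB video; log footage would need its camera manufacturer’s transform instead) is to undo the sRGB transfer function before handing frames to Blender as floating point data:

```python
# Undo the sRGB transfer function on an 8-bit display referred frame so the
# plate enters Blender as (approximately) scene-linear floating point data.
import numpy as np

def srgb_to_linear(srgb):
    """Inverse sRGB EOTF: display referred 0-1 values -> linear values."""
    srgb = np.asarray(srgb, dtype=np.float32)
    return np.where(srgb <= 0.04045,
                    srgb / 12.92,
                    ((srgb + 0.055) / 1.055) ** 2.4)

frame_8bit = np.array([[[235, 128, 16]]], dtype=np.uint8)   # one hypothetical pixel
linear = srgb_to_linear(frame_8bit / 255.0)
print(linear)   # ready to be written out as a float EXR sequence for Blender
```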

Core findings

‘filmic’ makes the CG ‘linear’

The CG inside the render is clearly linear if you compare the values.

‘Filmic’ removes clipping/crushing

As a result, the shadows and highlights in the Filmic render have little to no clipping or crushing, even though the lighting values have remained the same.

CG linear / footage ‘display referred’

While the CG elements in the composite are correctly linear, the background plate footage still carries a display referred, 8 bit colour space baked into it, so the same view transform treats the two very differently.

blender compositor offers ‘no’ colour space management

Background plate footage will need to be linearised prior to import into Blender, as there is currently no option to manage the colour space of footage separately from the CG elements inside the Blender compositor. It seems that ‘Filmic’ – as with all Blender colour management – is applied at the end of the chain. To this end, the footage will need to be in the same linear format as the CG, with Filmic applied to both.
