Scene Referred Vs. Display Referred: Mastering Color In Blender

What is Scene Referred vs Display Referred?

In 3D computer graphics, there are two main color management workflows – scene referred and display referred. Understanding the difference between them is key to effectively managing color in applications like Blender.

A scene referred workflow aims to preserve the original color values of materials and lighting in a 3D scene. These colors are specified using high dynamic range (HDR) numeric values that match real-world physical light measurements. This allows you to accurately define scene colors based on realistic physical parameters.

In contrast, a display referred workflow encodes colors specifically for a target display device like a computer monitor or HDTV. The colors are adjusted to fit the lower dynamic range (LDR) and color gamut that the display can reproduce. This translation step is crucial for visually portraying scene colors properly on varied display devices.

So in summary:

  • Scene Referred: Original HDR colors that reflect real-world scene values
  • Display Referred: LDR colors encoded for reproduction on target display devices

A crucial aspect of color management is converting between these two color workflows effectively. Blender provides tools and workflows for working with color in both scene referred and display referred paradigms.

Configuring Color Management in Blender

Blender provides flexible color management via the Color Management section in the Scene properties. Here you can configure color spaces and transforms to implement both scene referred and display referred workflows in your projects.

For working scene referred, you define a working color space that matches your HDR content creation pipeline. Common choices in Blender include:

  • Linear: the scene linear space used for rendering and physically based light transport
  • Filmic Log: a log encoding that preserves dynamic range for compositing and grading flexibility
  • Raw / Non-Color: a passthrough for data channels that should receive no color transform

To translate your HDR colors for viewing on LDR displays, Blender applies the View Transform you specify – typically a filmic tone mapping curve designed for visual fidelity:

  • Filmic: Smooth contrast tonemap with wide dynamic range
  • False Color: Analyze pixel luminance values
  • Raw: Show actual color values for debugging

For final export and display referred deliverables, you choose a suitable display device color space like sRGB or Rec.709. Proper display referred encoding is then handled automatically by Blender through its OpenColorIO-based color space conversions.
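
To see how this maps to Blender's Python API, here is a minimal sketch that configures the settings described above. The string values ('sRGB', 'Filmic') are assumptions that depend on the OCIO configuration bundled with your Blender version, so verify them against the options shown in your Color Management panel:

    import bpy

    scene = bpy.context.scene

    # Display device the final image will be viewed on
    scene.display_settings.display_device = 'sRGB'

    # View transform that tone maps scene linear HDR data for that display
    scene.view_settings.view_transform = 'Filmic'
    scene.view_settings.look = 'None'
    scene.view_settings.exposure = 0.0
    scene.view_settings.gamma = 1.0

    # Color space assumed for strips in the video sequencer
    scene.sequencer_colorspace_settings.name = 'sRGB'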

So in summary, configure your scene color science for scene referred workflows, and rely on Blender to handle technical display encoding tasks behind the scenes!

Scene Referred Workflow

Working in a scene referred color pipeline has many advantages for CG content creation. By preserving real-world scene radiance values throughout your pipeline, you maintain the highest level of visual fidelity and flexibility to render photoreal footage for any display scenario.

Using RAW Color Data

At the start of a show pipeline, capturing scene lighting and surface colors using raw uncompressed HDR measurements establishes the foundation for downstream photorealism. Using actual on-set reference photos captured in RAW formats provides the most accurate target data to match in CG.

Modeling real-world color phenomena like Rayleigh scattering in atmospheres requires unprocessed floating point data that hasn’t been display-encoded yet. Blender’s Blackbody and Wavelength shader nodes let you define colors precisely from physical parameters such as spectral wavelengths and Kelvin temperatures matching real phenomena.

Vector and matrix calculations can operate directly on scene linear color data for research and simulation purposes. Keeping lighting and colors in raw scene linear space is critical for physically based rendering techniques that simulate realistic light transport.
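
As a small illustration of why this matters, here is a NumPy sketch of the published sRGB-to-linear decode you would apply to a display referred texture before using it in scene linear math (the function name is just for illustration):

    import numpy as np

    def srgb_to_linear(c):
        """Decode display referred sRGB values (0-1) to scene linear."""
        c = np.asarray(c, dtype=np.float64)
        # Piecewise sRGB decode: linear segment near black, power curve elsewhere
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    # A mid-grey sRGB pixel is much darker in scene linear terms
    print(srgb_to_linear([0.5, 0.5, 0.5]))  # ~[0.214 0.214 0.214]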

Examples of Scene Referred Values

To illustrate the use of RAW scene referred color data, here are some examples of HDR values used in different contexts:

  • Surface albedo: precise RGB reflectance spectra measured from real-world samples using spectrophotometers
  • Sunlight: spectral power distributions matching black body radiation at around 5700K
  • Skylight: radiance values quantified in watts per steradian per square meter
  • Incandescence: light spectra emitted from tungsten filaments heated to 3100K
  • Fluorescence: light re-emitted at longer wavelengths as excited electrons fall back to lower energy states
  • Rayleigh scattering: calculated from the inverse 4th power of wavelength (see the sketch below)
  • Fog density: the rate at which microscopic water droplets scatter light per unit distance

As shown above, scene referred color definitions reference real-world radiometric or photometric phenomena. This level of visual accuracy sets the stage for CG assets that can integrate properly into captured photographic plates.
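
To make the Rayleigh scattering bullet concrete, the inverse fourth power relation means short blue wavelengths scatter far more strongly than long red ones, which is why clear skies look blue. A quick Python check:

    # Rayleigh scattering intensity is proportional to 1 / wavelength^4
    blue_nm, red_nm = 450.0, 650.0
    relative_scatter = (red_nm / blue_nm) ** 4
    print(f"Blue scatters ~{relative_scatter:.1f}x more than red")  # ~4.4x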

Display Referred Workflow

Encoding for Display Devices

For delivering final renders to audiences, the colors need to be encoded specifically for their viewing scenario – whether theatrical screens, TVs, mobile devices, or VR headsets. This display referred workflow step adapts the content for each unique display type.

Mapping scene linear HDR values into a display’s limited dynamic range and gamut requires “tone mapping” – mathematically remapping data from one color volume into another. Tone mapping operators (TMOs) compress high dynamic range content to fit the capabilities of common standard dynamic range (SDR) display devices.
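
A minimal sketch of one classic global TMO, the Reinhard operator, shows the core idea; production view transforms like Blender's Filmic are considerably more sophisticated, but the principle of compressing unbounded scene values into 0-1 is the same:

    import numpy as np

    def reinhard_tonemap(hdr):
        """Compress scene linear values (0..inf) into display range (0..1)."""
        hdr = np.asarray(hdr, dtype=np.float64)
        # Classic global Reinhard operator: L / (1 + L)
        return hdr / (1.0 + hdr)

    # Values far above 1.0 roll off smoothly toward the display ceiling
    print(reinhard_tonemap([0.18, 1.0, 16.0]))  # ~[0.153 0.5 0.941]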

3D animation and VFX pipelines need to deal with a wide variety of target display specifications. Parameters like peak luminance, black point, primaries/whitepoint, electro-optical transfer functions (EOTFs) vary for different displays. Encoding content properly for each device is crucial for professional color managed workflows.

Examples of Display Referred Values

To give specific examples of display encoding parameters:

  • HDTVs: Rec. 709 primaries, gamma 2.4 EOTF (BT.1886), 100 nits peak brightness
  • Digital Cinema: DCI-P3 primaries, gamma 2.6 EOTF, 48 nits peak brightness
  • HDR streaming (e.g. YouTube HDR): Rec. 2020 container, PQ (SMPTE ST 2084) EOTF, 1000+ nits peak brightness
  • iPad Pro: Display P3 gamut, sRGB transfer function, 600 nits peak brightness
  • OLED TVs: wide gamut, PQ EOTF for HDR content, roughly 800–1500 nits peak brightness

As you can see, specifying colors for each display device requires defining RGB transformations that map broader scene referred data into the display’s specific color reproduction gamut. Getting these device specific parameters exactly right is key for professional color critical production pipelines.
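
As a sketch of the final encode step for a gamma-driven display like the HDTV row above (real pipelines also handle gamut mapping, white point, and quantization, which are omitted here):

    import numpy as np

    def encode_for_display(linear_rgb, gamma=2.4):
        """Apply the inverse of the display's gamma EOTF before output."""
        linear_rgb = np.clip(np.asarray(linear_rgb, dtype=np.float64), 0.0, 1.0)
        return linear_rgb ** (1.0 / gamma)

    # 8-bit code values as written into a display referred image buffer
    codes = np.round(encode_for_display([0.0, 0.18, 1.0]) * 255).astype(int)
    print(codes)  # [0 125 255]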

Integrating Both Workflows

While scene referred and display referred workflows serve complementary purposes in production pipelines, you often need to move between both color spaces within the same project.

Conversion Nodes

Blender’s compositor offers tremendous flexibility for bridging scene linear and display encoded data via the Convert Colorspace node (Blender 3.1+):

  • Use scene linear to display conversions when you need display encoded pixels for output
  • Use display to scene linear conversions for image textures and input mattes
  • Chain multiple conversion nodes for procedural color transforms

You can route different asset data through varied transform paths within the same composite tree for maximum flexibility, and combine the conversions with Math and Mix Color nodes for custom color processing.
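
Here is a sketch of building such a conversion in Python, assuming Blender 3.1+ where the compositor gained the Convert Colorspace node; the color space names must match your build's OCIO config (newer versions label scene linear as 'Linear Rec.709'):

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    tree = scene.node_tree

    # Bring a display referred image into scene linear for compositing
    img_node = tree.nodes.new('CompositorNodeImage')
    convert = tree.nodes.new('CompositorNodeConvertColorSpace')
    convert.from_color_space = 'sRGB'
    convert.to_color_space = 'Linear'

    tree.links.new(img_node.outputs['Image'], convert.inputs['Image'])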

Display Transforms

For final output rendering, the View Transform you configure in Blender’s color management settings determines the tone and contrast mapping applied. This handles translating HDR scene values into visually pleasing LDR images.

Different display transforms suit different content types and viewing conditions. For example, the Filmic view transform tonemaps scene data with a smooth roll-off optimized for SDR monitors, while a perceptual quantizer (PQ) curve better suits HDR Dolby Vision grading for OLED TVs.

Using Raw or Filmic Log encodings for working data preserves the most downstream flexibility. You can then apply different display transforms dynamically simply by swapping the View Transform at render time.
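
For example, a short script can render the same scene linear data through several display transforms in one pass; the transform names are assumptions to check against your version's View Transform menu:

    import bpy

    scene = bpy.context.scene

    # Re-encode identical scene data for different viewing intents
    for transform in ('Standard', 'Filmic'):
        scene.view_settings.view_transform = transform
        scene.render.filepath = f"//render_{transform.lower()}.png"
        bpy.ops.render.render(write_still=True)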

Best Practices

To leverage scene referred and display referred workflows optimally, here are some best practices that prove effective in real-world production:

Matching Real-World Values

When defining scene content using physically based values, reference real-world physical constants and measurements as much as possible. Examples include using actual measured spectral data for material albedos, accurately geo-positioned Sun objects, and scientifically validated Rayleigh phase functions. Don’t tonemap or distort these linear values – preserve the accuracy within scene referred space for greatest realism.
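
One practical way to anchor a value to physics in Blender is to drive an emission shader with the Blackbody node at the daylight temperature mentioned above, rather than eyeballing an RGB color (a minimal sketch; the material name is arbitrary):

    import bpy

    # Emissive material whose color comes from physics, not a color picker
    mat = bpy.data.materials.new("Sunlight5700K")
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    blackbody = nodes.new('ShaderNodeBlackbody')
    blackbody.inputs['Temperature'].default_value = 5700.0

    emission = nodes.new('ShaderNodeEmission')
    links.new(blackbody.outputs['Color'], emission.inputs['Color'])
    links.new(emission.outputs['Emission'],
              nodes['Material Output'].inputs['Surface'])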

Consistent Working Spaces

Standardize workflows around designated working color spaces for consistent results:

  • Scene Referred Working Space: Filmic Log Encoding
  • Display Referred Working Space: Rec.709 Gamma 2.4

COLOR MANAGE ALL THE THINGS! Always activate color management and configure base color spaces properly in all scenes, images, and assets. Never leave data un-color-managed or mismatched.

Adopting pragmatic conventions for dealing with both scene linear and display encoded data makes production pipelines much more manageable. Codify specific transform handoff points between departments to minimize errors from color space mismatches.

Invest time to understand color science fundamentals like spectra, standards, encoding transforms, gamuts, and primaries. Properly handling color data at a technical level ultimately drives more creative freedom. Master color in Blender through both practical experience and academic study alike!
