If you are working in post, you have most likely been hearing more buzz about color management in the last few years. Whether it is delivering assets for Netflix or simply preparing projects for the future, color management is undoubtedly part of the modern post-production workflow.
To support our users, Boris FX recently teamed up with noted author and TV/film industry veteran Steven Katz to explore and demystify some of the common questions and issues surrounding color-managed workflows.
Please enjoy this first installment in an ongoing series of special guest articles. — The team at Boris FX
ACES, OCIO… What’s the Deal?
You’ve probably heard about ACES and OCIO. (ACES is the Academy Color Encoding System; OCIO stands for OpenColorIO.) Both color management initiatives emerged over the past decade and are now gaining traction as need-to-know technologies. Both are business-minded responses to the unending profusion of formats that have appeared since digital replaced almost all film acquisition in movies and TV.
What’s the big deal? In the days of analog film and television, there were few format choices to make in the overall post-production and delivery process beyond film stocks and tape formats. In today’s digital landscape, however, there are many more file formats, intermediate processes, codecs, and applications — all viewed on a myriad of devices and displays. Most camera companies have their own proprietary RAW file formats, and almost all projects incorporate effects, graphics, legacy formats, or stock footage… each with its own color space. You get the idea. There is a lot of shared data being worked on, but not enough common definitions to ensure that what an editor sees on the monitor in their edit suite is exactly the same through visual effects vendors, color grading, and broadcast.
ACES and OCIO were developed to solve issues for major studios (and now the streaming services) which are investing massive amounts in new content. If you’re an editor, compositor or a VFX artist, understanding color management is increasingly a necessary part of your job. Industry-wide initiatives like ACES and OCIO attempt to make color management straightforward and uniform across software applications, displays, and facilities.
Color theory is obviously a deep topic. Here is a quick primer on some basic color concepts to help you fully understand logarithmic and linear spaces.
- Cameras measure light in f/stops or just stops.
- Humans can see about 10 stops in daylight.
- But given time to adapt to shadow or darkness, humans can see about 20 stops from black to white.
- Daylight covers a range of brightness wider than humans can see.
- Cameras only record a portion of what humans can see.
- To capture the full range of a daylight scene (dark to light) we have to compress the light to fit into the camera’s dynamic range.
- Light in the real world is linear.
- If you shine a light on a table and then double the light and measure that with a light meter it will show 2X the light intensity. But human vision/perception doesn’t work that way.
- Humans have more light sensitivity in darkness than in bright light.
- This is probably evolutionary. 100,000 years ago having better night vision helped humans not become food.
- This relationship of perception and light intensity is non-linear. Our eyes see light logarithmically as the graph below shows.
- In production, we work with both linear and log video files.
- Cameras record in linear encoding and this is called scene-referred linear.
- Display devices such as monitors and projectors are typically logarithmic to match how humans perceive light. These are called display-referred log.
- Both ACES and OCIO provide LUTs and transforms.
- ACES adds new standardized color spaces.
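The log relationship in the bullets above can be sketched in a few lines of Python. The encoding curve here is a toy example for illustration, not a real camera or display curve:

```python
import math

def toy_log_encode(linear, floor=0.001):
    """Map a scene-linear light value to a 0-1 log-encoded value.

    Illustrative only: `floor` is the smallest light level we
    bother to distinguish in this toy curve.
    """
    return math.log2(max(linear, floor) / floor) / math.log2(1.0 / floor)

# Ten successive doublings of physical light (one stop each)...
stops = [0.001 * 2 ** n for n in range(11)]
encoded = [toy_log_encode(s) for s in stops]

# ...produce equal-sized steps in log space, matching the way human
# vision perceives roughly equal brightness increments per stop.
steps = [round(b - a, 6) for a, b in zip(encoded, encoded[1:])]
```

Each doubling of linear light adds a constant step in the encoded value, which is exactly why log encodings are a good match for both perception and efficient storage.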
When you bring camera RAW into an application like Adobe Lightroom, the RAW (linear) data is converted to log on import. This conversion is at the core of color management for film and video.
In film/video production this conversion is performed many times, and it can work in both directions: linear to log or log to linear, depending on the operation, such as VFX or color correction.
Colorists and VFX artists often need to work in linear color space, but can’t make artistic decisions on the flat-looking image. So a LUT temporarily makes things look correct. Under the hood the file is still linear. After the work is performed, the linear file holds the creative changes but still requires the LUT to view it properly. When applied correctly, the conversions preserve all the data.
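A minimal sketch of that workflow, with a simple gamma curve standing in for a real viewing LUT (all values here are illustrative):

```python
def view_transform(linear_value, gamma=2.4):
    """Display-only preview of a scene-linear value on a gamma monitor.

    A stand-in for a real viewing LUT; the stored pixel is untouched.
    """
    return max(linear_value, 0.0) ** (1.0 / gamma)

working_file = [0.02, 0.18, 0.5, 1.0]        # scene-linear ("flat" looking)
preview = [round(view_transform(v), 3) for v in working_file]
# The artist judges `preview` on the monitor, while the edits are saved
# back into `working_file`, which remains linear throughout.
```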
Below is a simplified version of a studio pipeline. In reality there would be more blocks and intricate relationships, but this serves the purpose of showing why color management is complex. For example, editorial could include several display monitors and a screening room each requiring color conversion.
Now consider what happens when there are three VFX studios around the world working on a show, each with its own color pipeline. That means more conversions and more opportunities for errors.
Between each node, a color conversion is applied with a LUT or a transform. In some cases, a file can be converted to a color space, worked on and then transformed back to the original incoming color space.
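That round trip is lossless when the transform has an exact inverse. A toy illustration (the curve is invented for the example, not a real camera encoding):

```python
import math

def to_working_space(x):
    """Toy log-style encoding into a working color space."""
    return math.log2(x + 1.0)

def from_working_space(y):
    """Exact inverse of the encoding above."""
    return 2.0 ** y - 1.0

pixel = 0.75
round_trip = from_working_space(to_working_space(pixel))
# With an exact inverse, the value survives the trip into the working
# space and back, so no image data is lost in the conversion itself.
```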
Just as log and linear processing of light takes the human visual system into consideration, the same is true for color. Humans see a range of energy called the visual spectrum. Below is the electromagnetic spectrum, which includes x-rays, gamma rays, radio waves, and the visual spectrum from approximately 380 to 740 nanometers, between ultraviolet and infrared.
In 1931, the CIE (Commission internationale de l’éclairage) created a color matching system.
It has been the global standard for colorimetry since then with a few important updates. The following is a condensed version but the main thing to understand is the CIE Chromaticity Diagram below.
The outer boundary is the visual spectrum, as if we bent the electromagnetic spectrum into a horseshoe shape. Colors along that line are the most saturated colors. The colors within the boundary are the colors humans perceive.
The CIE diagram is defined by RGB colors based on Tristimulus Color Theory, which states that all colors in the visual spectrum can be created with just three colors: red, green, and blue.
The black curved line towards the bottom of the diagram is the Planckian Locus and is used to indicate color temperature in degrees Kelvin. When R, G, and B have equal energy, the color is white.
The CIE diagram is a 2D rendition of a 3D color cube.
Here’s a look at color spaces you’re probably familiar with projected on the CIE diagram. The triangles are color spaces formed by red, green, and blue primaries at the vertices. They are subsets of the full visual spectrum meaning colors are left out if they don’t lie within the triangle. Color spaces can be defined by anyone and many companies do. Establishing a standard is much more difficult.
Rec. 2020 is the recommended color space for ultra-high-definition television. It has a much wider gamut than the color spaces currently in use; the wide gamut was chosen in the interest of future-proofing.
Rec. 709 is the official standard for high-definition television (HDTV).
DCI-P3 is the standard for digital projection for theatrical exhibition.
Standards are set by SMPTE.
We’re almost ready to look at ACES and OpenColorIO. First, here are some things you can do to check out color management in software you already own. Resolve, After Effects, Premiere, and Nuke all have the ability to read LUTs. It’s not a bad way to see how it may affect your work now or in the future.
You might also look in Photoshop at Edit > Color Settings. A list of color spaces (including Rec. 709) is there to load. Your operating system, Mac or Windows, will also have color profiles you can load, thereby changing the way your monitor displays colors, essentially a LUT.
On the Mac, the ColorSync utility lets you compare 3D color spaces.
OCIO was first developed in Vancouver by Sony Pictures Imageworks. Originally initiated by SPI Imaging Supervisor Jeremy Selan, the first version was released to the public in 2009. Reflecting its specific development circumstances, OCIO doesn’t delve into camera output or archiving, but rather the needs of a facility pipeline. OCIO is really aimed at coders: the pipeline programmers who are at the heart of today’s visual effects workflows. Its main contribution is a free, open-source library with C++ and Python interfaces that lets studios create custom LUTs and transforms as needed for the hardware and software they use. According to Selan in his useful paper Cinematic Color:
“OCIO color configuration files define all of the conversions that may be used. For example, if you are using a particular camera’s color space, one would define the conversion from the camera’s color encoding to scene-linear. You can also specify the display transforms (for multiple displays) in a similar manner. OCIO transforms can rely on a variety of built-in building-blocks, including all common math operations and the majority of common lookup table formats. OCIO also has full support for both CPU and GPU pathways, in addition to full support for CDLs and per-shot looks.”
Understanding that can be a little daunting at first, but we’ll be diving into more details soon enough.
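The structure Selan describes (every color space defines its conversion to and from a single scene-linear reference, and any other conversion chains through that hub) can be sketched in plain Python. The curves below are invented stand-ins, not real camera or display transforms:

```python
import math

# Each "color space" registers transforms to and from one reference
# (scene-linear), mirroring the structure of an OCIO configuration.
COLORSPACES = {
    "toy_camera_log": {
        "to_reference":   lambda v: 2.0 ** (v * 10.0) / 1024.0,
        "from_reference": lambda v: math.log2(max(v, 1e-9) * 1024.0) / 10.0,
    },
    "toy_display": {
        "to_reference":   lambda v: v ** 2.4,
        "from_reference": lambda v: max(v, 0.0) ** (1.0 / 2.4),
    },
}

def convert(value, src, dst):
    """Convert src -> reference -> dst: the hub-and-spoke pattern
    that keeps N color spaces from needing N x N transforms."""
    linear = COLORSPACES[src]["to_reference"](value)
    return COLORSPACES[dst]["from_reference"](linear)
```

With this pattern, adding a new camera or display only requires defining one pair of transforms, not a transform to every other space in the pipeline.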
The first version of ACES was developed under the Academy of Motion Picture Arts and Sciences, released in December 2014, and approved by SMPTE (Society of Motion Picture and Television Engineers). Post facilities, VFX boutiques, and even freelance editors and artists are likely to have to deal with ACES files in the future.
ACES was originally created for feature film, visual effects, and animation workflows and has been adopted by many established facilities working at the enterprise level. But with Netflix’s adoption and the proliferation of remote production and outsourcing, ACES is relevant for all levels of post-production. Anyone who feeds shots to VFX and post-production facilities will want to know about it to future-proof their jobs.
Differences and Similarities
The goal of both the ACES and OCIO systems is straightforward: provide consistency throughout post operations and preserve data at the highest possible quality during the production process. Color conversion happens many times during a production, from the first conversions of camera RAW files into the working format(s) of choice to, in VFX work, converting rendered files into a viewable format. A complex color management workflow becomes easier when a library of transforms and LUTs is provided for the major display devices and grading software.
ACES and OCIO are both highly-configurable color management systems, but there are differences. Is one better than the other? Not really. It’s more about how a studio or production team wants to characterize the project. ACES looks for industry-wide compliance based on color spaces standardized by SMPTE. Meanwhile, OCIO focuses more narrowly on providing tools to pipeline developers for whatever color space they choose to use. Thankfully it’s not one or the other. Hybrid ACES and OCIO pipelines are supported by both systems.
ACES COLOR SPACES
ACES color spaces have been standardized by SMPTE, whose engineers and academics have watched over standards since the beginning of the film and television eras and whose purview now includes the digital realm.
But what makes a good color space? In simple terms: more colors, greater dynamic range, and future-proofing, so that the monitor you buy in 10 years with a greatly expanded color space will correctly display the color correction you’ve created today. The key here is AP0 (that’s AP with a zero). This is ACES’ master color space (or gamut) and the hub of the system. AP0 was created as a way to wrangle the many color spaces from camera manufacturers such as Arri, RED, and Blackmagic Design, preserving the look of each camera in a common color space. Since it is a format that manufacturers have signed on to support, AP0 is vendor neutral for interoperability.
Taking an ACES approach means you transform the file from a proprietary camera format into the AP0 color space, which offers a wide gamut and is HDR ready. AP0 data is stored in the OpenEXR format, which supports up to 30 stops of dynamic range. This preserves all the data from the existing camera RAW files and becomes the hub of all the color manipulation for every department on a show.
ACES has five color spaces: two storage spaces, AP0 and AP1 (one linear, one log-encoded), and three working spaces: ACEScc, ACEScct, and ACEScg. The working spaces are designed for color correction and VFX. ACES approves all the transforms and LUTs needed to convert between the color spaces supplied by device manufacturers.
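As a concrete example, the ACEScct encoding combines a log2 segment with a linear “toe” for the darkest values. The constants below follow the published Academy specification (S-2016-001), but treat this as an illustrative sketch and verify against the spec before relying on it:

```python
import math

A, B = 10.5402377416545, 0.0729055341958355   # toe slope and offset
CUT = 0.0078125                               # linear-to-log breakpoint

def lin_to_acescct(x):
    """Encode an ACES scene-linear value to ACEScct."""
    if x <= CUT:
        return A * x + B                      # linear toe in the shadows
    return (math.log2(x) + 9.72) / 17.52      # log segment elsewhere

def acescct_to_lin(y):
    """Exact inverse: decode ACEScct back to scene-linear."""
    if y <= A * CUT + B:
        return (y - B) / A
    return 2.0 ** (y * 17.52 - 9.72)

mid_gray = lin_to_acescct(0.18)   # 18% gray lands near 0.4135
```

Because the transform has an exact inverse, values can move between the working space and scene-linear without loss.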
The diagram below helps show these relationships:
For a better sense of what’s happening in the workplace, we asked some prominent facility managers, artists and colorists what color management system they’re using now. If none, what do they plan to use in the future, ACES or OCIO?
What we learned is that the industry is transitioning from a time where everyone pretty much did what they thought best on a project-by-project basis into more defined methods to address color management. For large studios with color scientists and programmers on staff, ACES has become the accepted way to future-proof their investments.
Of course, making ACES succeed is not just a matter of convincing digital imaging techs on set or VFX artists of its benefits. ACES is a major industry-wide initiative, and it has taken time for various equipment makers to add support.
Dado Valentic, Award-winning colorist, color scientist and CEO of Color Intelligence
Valentic recommends OCIO and its implementation of ACES.
“ACES recognized that every camera manufacturer has their special sauce in how they interpret data from the sensor and that’s proprietary. We’ve seen this happen with still cameras. Everyone has their own version of RAW. But if we bring this variety of color spaces and gamma curves into a unified space, we can simplify the color grading process. So ACES created a very large, robust color space — ACES AP0.”
Linear AP0 can also be converted to one of ACES’ three working spaces tailored to color correction or VFX (ACEScc, ACEScct, ACEScg). You can output to various deliverables for cinema, SDR, or HDR, or go back to linear ACES AP0 to archive. Dado also points out some current limitations:
“I have cinematographers who can’t get the look they want in ACES. The most significant advantage of ACES is also its biggest drawback.
To make ACES work, they had to standardize the RRT (Reference Rendering Transform — a math operation that converts one type of data into another). They have to use it because that’s how they get all the different color grading or VFX applications to produce the same result.
But the RRT has a ‘look.’ Going from Alexa Log C to Rec. 709 is going to look different than Alexa to ACES because it’s using a different transform. The cinematographer says, ‘That’s not how I shot the scene. It looks different.’ And if they are happy with the ACES look, there’s no problem. But what if that’s not the case?”
Dado also points out that there’s another issue that isn’t usually talked about.
“We needed ACES Log because we needed something in 10-bit. Everything in the post-production environment is limited to 10-bit. The monitors, the projectors, and SDI cables are 10-bit. Video cards are 10-bit. 10-bit is a bottleneck. This is why we had to find a way to make ACES 16-bit linear more practical and why we had to convert it to 10-bit log variants like ACEScct. Now that Boris FX has implemented OCIO, it’s possible to use plug-ins in all ACES projects.”
Chris Haney, VFX Artist, Software Hacker, Owner of OffWorld VFX
Haney recently served as VFX Supervisor on Dark Waters directed by Todd Haynes. He is regarded as a pipeline technician with a considerable understanding of color management. Like most boutique facilities, OffWorld often provides shots to larger studios and must adapt to their color management requirements.
“Color management systems for photo realistic film compositing have different requirements than those for print and even broadcast. When evaluating a particular piece of software’s color management I look for specific capabilities.
Does it allow for Scene Referred Linear Compositing with float values? This color space is best for photorealistic compositing and accurate reconstruction of 3D renders.
Does it allow other operations to be done in the proper color space? Not all operations are intended to be done in Scene Referred Linear. For example, many final color looks are intended to be applied to a LOG image.
Can it handle conversions for all incoming and outgoing formats robustly (Alexa C Log, Cineon Log with all of the original parameters, S-Log, Rec 709, etc.)? In order to do linear compositing, one needs to convert all inputs to linear and all outputs to the client’s preferred format, which is often Rec709, Cineon Log, or Alexa C Log in my world.
How do ACES and OCIO stack up to Haney’s requirements? Quite well. The main ACES color space is scene-referred linear. (Camera sensors, for example, “see” in a linear manner; twice the light makes the image twice as bright.) Light in the real world works in a linear fashion, but log space is a better approximation of the way we see and of how most display devices work.
Haney’s second requirement: Making sure operations, such as applying a color filter, are always done in the appropriate color space, linear or log.
“Operations that work well in linear space are 3D rendering, photorealistic compositing transparencies (fog, glass), optical effects (lens blur, motion blur, lens flares), and transforms with anti-alias filters, sub-pixel positioning, etc.
Operations that can work well in log space are things like color correction, levels and histogram adjustment, tracking and stabilizing, grain, some keying and simple paint and roto.”
Finally, Haney looks to keep track of conversions as you move from one space to another. “Concise file management is the true core of ACES and OCIO. Both provide transforms and LUTs for most of the major device manufacturers. If you feel comfortable with some simple programming, you can even create your own LUTs and transforms.”
John Bair, Owner, Visual Effects Supervisor, Phosphene
Phosphene is an award-winning New York-based design and visual effects studio, whose projects include Motherless Brooklyn, The Deuce, The Ballad of Buster Scruggs, and Escape at Dannemora (read how the Emmy-nominated team used Mocha Pro). The company relies on Foundry’s Nuke compositing system to manage its OCIO workflow.
“As a VFX studio we are almost always operating within the color workflow established by production and the post facility for any given project. Most of the time this means processing, viewing, and building editorial LUTs along with color information through our OCIO pipeline. Sometimes, however, ACES is the requested specification from a client.”
Mike Nuget, Freelance Colorist
New York-based colorist Mike Nuget uses DaVinci Resolve for grading documentaries and TV shows. Facilities running Resolve can work in an ACES workflow using its native color management, although Resolve is not yet an officially ACES-certified host. Nuget believes that with so many technical advances and changes, it’s up to the colorist to understand color management.
“Nowadays every colorist should know as much as they can about color spaces. Just about every streaming service now requires certain color spaces to work in, so you kind of have to know about it. It’s also good to know so you can take advantage of all of the different camera and file types better. Certain sensors are purposely made to do things to the image. If you work in the wrong color space, you could be losing information.”
Autodesk Flame and Blackmagic DaVinci Resolve work in an ACES managed environment and include native color management tools.
Adobe Premiere Pro and After Effects include native color management tools but do not support ACES. However, Adobe users can turn to third-party tools to load OCIO profiles.
Avid Media Composer is now a full ACES partner, and dedicated visual effects applications like Foundry’s Nuke and Boris FX Silhouette have supported loading various color profiles for many years.
Nick Crist, Compositing Supervisor, Phosphene
“Nuke tends to be pretty clear with both OCIO and ACES. However, I find some of the ACES integration and switching between ACES and OCIO within Nuke to be cumbersome – bordering on problematic. I’ve found issues in the past when working on one project in ACES vs another in OCIO; having to switch between projects with different workflows became tedious because the settings were saved at the preference level and not at the shot level. That may have changed in newer versions of Nuke, but last I checked it was a problem.”
One issue in adoption of ACES and OCIO is that facilities have to weigh the advantages of the new systems against the existing (and working) color management they currently use. OCIO and particularly ACES were designed for large studios that have complex production pipelines and work with outsource vendors around the world. These are studios that typically have dedicated imaging technicians or color scientists on staff.
It can be unrealistic to expect this kind of pipeline upgrade to be completely without bumps along the road, especially at smaller studios that rely on a core staff of full-time artists mixed with freelancers who have to be brought up to speed on file management and workflows. While there is a tendency to be critical of ACES, OCIO or any other new system, human error is a major factor in the perception of issues. Evidence of this is on the forums at ACES Central, where some problems often turn out to be implementation errors.
Nick Crist (Phosphene), who has used both OCIO and ACES, tells us: “It needs to be simpler to use across multiple software packages. When you have so many different versions to choose from within the software and unclear information coming from the client, it just seems like it takes a lot of detective work to get everything working properly.”
One aspect of ACES that facilities eagerly await is the metadata that accompanies ACES files. It was under development as late as the end of 2018, and in March 2019 ACES released the ACESclip specification. Color scientist Walter Arrighetti (ACES) describes the clip-level metadata file as such:
“The ACES clip-level metadata file (“ACESclip”) is a ‘sidecar’ XML file intended to assist in configuring ACES viewing pipelines and to enable portability of ACES transforms in production. An ACESclip file describes the transforms necessary to configure an ACES viewing pipeline for a moving-picture image sequence.”
This is a step forward in organizing and archiving show data. Consider what happens when a show made in 2012 releases a director’s cut in 2020. A facility has to reconstruct the movie including new material and perhaps a re-edit. What if the director also wants to modify the look of the delivered show? This would require all the transforms and looks that were layered into the final archived show. Normally, reconstructing the show would be a nightmare, but ACES metadata will now keep a record of the creation process.
Bair sees this as an important development but would like even more metadata:
“Most of the time we are delivered EXRs which have been converted from the RAW camera files by a post-processing facility. Thankfully it is becoming more common for all metadata to freely pass through the transcoding process. While it is not quite there yet, we are on the verge of having a thorough and complex set of metadata baked into every camera source. It will be incredibly helpful for us, as a VFX studio, to have all color information, lensing, camera translation, etc., always available in our source plates.”
How or when camera information (such as lens focal length) will be included in ACES or other systems is currently unknown, as it is not covered in ACES or OCIO documentation. Still, ACES is building out a system in which archived show files are recoverable and repeatable now and in the future.
ACES is addressing issues quickly, and most people are invested in its success, as is clear from its long list of officially certified industry partners.
Again, user error and lack of documentation was a common theme in our interviews. In new complex systems, people make mistakes.
Haney points out:
“At the end of the day, we are making art and are required to deliver it to the client in the color space they have defined. If the smartest programmers in the world have a system that is technically correct but leads the average artist to output images with wrong gamma or color, then I consider that system to be flawed and a liability to my business. Adobe products can be difficult in this way. Some operations/effects do not work in float. Adobe Photoshop’s CMS was built for print and does not lend itself to film. Adobe After Effects’ CMS can also lead to output problems. We have had better success with Nuke and the open source OCIO for After Effects for select operations in AE.”
Crist also sees user error as high on the list of challenges in his job:
“Sometimes someone upstream can make a mistake. I’ve had plates delivered in the wrong color space, incorrect LUTs, and CCC files that break. It’s hard to say when or where the errors occur until you really test each and every file bit by bit. Color still seems to very much be a manual process that requires a really keen and technical eye to oversee, and I don’t think ACES has really fixed that yet.”
Nuget’s experience with clients is mixed:
“I think a lot of the bigger productions know more about color in general. I work in docs a lot, and to be honest, either this never even comes up, or they just rely on me to make the decisions and create the workflow environment. From what I’ve seen, the VFX industry is more aware and capable of working in a color-managed environment, smaller shops, not so much.”
Constant and fast-moving change is a certainty in the digital era. Cameras and devices continue to advance. HDR is now a standard feature on most displays, and new monitors may soon add fourth and fifth color primaries. The dynamic range of cameras will increase and bring more factors to color management. ACES and OCIO offer rational solutions to the industry chaos caused by innovation. So, yes, you should look into these initiatives even if you are not currently working on color-managed projects. The future of moving pictures will be brighter because of today’s innovations, even with the learning curve to get there.
1. Selan, Jeremy. Cinematic Color. SIGGRAPH 2012 Course Notes. ACM SIGGRAPH, 2012.
Special thanks to:
Rod Bogart, Vice Chair ACES
Sean Cooper, OCIO and board member of ACES
About the Author
S.D. Katz is a writer/director/producer working in film and television. His work has appeared on Saturday Night Live and in film festivals around the world. He is also the author of the best-selling books on film design and directing, Shot by Shot and Cinematic Motion.
Learn more at shotbyshotbook.com
Color Management Glossary
RAW is an image format that stores data from a digital camera sensor with minimal processing. A RAW file is typically uncompressed and not viewable without conversion to another format. Each camera or scanner manufacturer develops its own RAW format; there are over 100 RAW formats in use. Most RAW formats use linear encoding.
Blackmagic Design’s RAW format (BRAW), which uses light compression, is used in all Blackmagic Design cameras and in the company’s color grading software, DaVinci Resolve. The format uses non-linear 12-bit encoding and produces smaller files than most other RAW formats.
Adobe introduced its own RAW video format in 2008 with the goal of industry-wide adoption. However, it has not sought standardization from SMPTE or the ISO.
Red, green, and blue make up the three primary colors. In an additive color system, such as monitors, projectors, and colored stage lights, the primaries produce white light when mixed in equal proportion. In any other proportion, they mix to create all the colors humans can see.
RGB. Red, Green, Blue: the primary colors in an additive color system.
OpenColorIO. OCIO is an open-source color management system developed at Sony Pictures Imageworks and released to the public in 2009. The core of OCIO is its library of LUTs and transforms used to convert between color spaces and control how they are represented on display devices.
Academy Color Encoding System. A color management framework consisting of new color spaces standardized by SMPTE, plus the LUTs and transforms that convert between common display and camera devices through delivery and archiving.
Academy of Motion Picture Arts and Sciences. The honorary professional organization of the motion picture industry. AMPAS is best known for the yearly Oscars awards show. Founded in 1927 and headquartered in Beverly Hills, California.
Society of Motion Picture and Television Engineers. Founded in 1916, SMPTE is a global professional organization of engineers, scientists, and executives working in motion pictures and television. SMPTE’s main function is producing standards, but it also provides educational and academic activities and publishes the SMPTE Motion Imaging Journal.
C++ is a widely used general-purpose programming language, an extension of the C programming language, first released in 1985.
Python is a programming language in wide use in computer graphics because of its ease of use. Many graphics software packages allow Python scripting to modify the interface and other aspects of the program.
LUT is an acronym for Look Up Table: a table of RGB input and output values used to convert one color space to another. LUTs can be 1D or 3D.
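A 1D LUT amounts to a short table plus interpolation between entries. A toy sketch (real LUT files such as .cube also carry metadata, and 3D LUTs use a full RGB lattice):

```python
def apply_1d_lut(value, lut):
    """Look up a normalized [0, 1] value in a 1D LUT, interpolating
    linearly between the two nearest table entries."""
    n = len(lut) - 1
    pos = min(max(value, 0.0), 1.0) * n       # position along the table
    i = min(int(pos), n - 1)
    frac = pos - i
    return lut[i] * (1.0 - frac) + lut[i + 1] * frac

# A tiny 9-entry LUT approximating a 1/2.2 gamma curve.
gamma_lut = [(i / 8) ** (1 / 2.2) for i in range(9)]
```

Production LUTs use far more entries (1024 or 4096 is common), so the interpolation error becomes negligible.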
16-bit (half float)
The level of mathematical precision that represents pixel color in computer graphics images. Photoshop offers 8-bit, 16-bit, and 32-bit color. Floating-point values can represent far greater dynamic ranges.
Linear. Encoding that represents the light in the real world. Digital cameras encode this way, with the option to output linear files in a RAW format. Most camera manufacturers develop their own version of RAW.
Log. Encoding that emulates the logarithmic way display devices and human perception process color and dynamic range.
OpenEXR is a highly robust image format developed by ILM and released in 1999. It uses 16-bit half-float values and can record up to 30 stops of dynamic range.
The range of dark to light values captured or perceived in photography and human vision. Photographic dynamic range is typically measured in camera f-stops.
CIE 1931 Color Space
The International Commission on Illumination (commonly known as CIE from its French name, Commission internationale de l’éclairage) is an organization founded in 1913 that creates international standards related to light and color. The CIE 1931 color spaces were the first defined quantitative links between distributions of wavelengths in the electromagnetic visible spectrum and physiologically perceived colors in human color vision. (Wikipedia)
Human Visual Locus
The full range of color vision and perception as measured on the electromagnetic spectrum (approximately 380 nanometers to 760 nanometers from ultraviolet to infrared).
Color Gamut
The complete range of colors in a color space, defined by the location of the RGB primaries.
Standard Observer Spectral Locus
The CIE 1931 standard proposed international color standards based on human perception. However, color vision varies slightly between subjects with “normal” vision, so the CIE tested two groups of subjects to establish an average baseline for color perception.
Color Transform Language (CTL)
A programming language created by AMPAS for digital motion picture color management systems.