Visualisation Series: Chroma


Visualisation -- digitally representing the appearance of a product -- is a key part of ecommerce. Businesses in industries from fashion to automotive are looking for innovative technology that improves the visualisation experience, increasing customer engagement and sales.

Many retailers in the Kitchens, Bedrooms, and Bathrooms (KBB) sector offer a large range of customisable products but don’t hold them all in-store. Customers cannot see a product until it arrives in their home, which creates a huge challenge for retailers: customers hesitate to buy, or simply don’t buy at all. It’s difficult to persuade a customer to make a significant purchasing decision if they can’t picture how a product will look.

Visualisation solves this problem. Realism is key to providing the confidence a customer needs to make that purchase, with 50% of people more likely to shop with a brand offering visualisation technology.

Enter Chroma.

What is Chroma?

Chroma is our proprietary rendering library aimed at producing non-real-time, photo-realistic images. It’s designed to give users a far more immersive view of their custom-made space, allowing them not only to be confident of how their products will actually look when installed, but also to get a better sense of how the size of the space will feel in use.

Chroma also uses the same 3D assets as our real-time design tool, so users can generate Chroma images from exactly the designs they have already created, and in a shorter time than any other piece of software in this area.

Chroma is currently live with a major European retailer and generates over 2000 images per month. Our technology is helping reduce the time it takes to design and purchase a new bathroom from 6 months to 2.5 months.

A Chroma-generated image.

A Customisable Cloud-Based Solution

Chroma has been designed to provide a customisable solution. The quality of a rendered image, and therefore the time it takes to render, can be adjusted on a per-render basis. This enables retailers to offer quick renders, generated within ~15 seconds, for viewing iterative changes, or a truly immersive view with render times closer to ~10 minutes.
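
To make that trade-off concrete, here is a minimal sketch of what per-render settings could look like. The names and values (RenderSettings, samplesPerPixel, and so on) are illustrative assumptions rather than Chroma’s actual API; in a renderer of this kind, the sample count per pixel and the maximum path depth are the usual levers for trading noise against render time.

    #include <cstdio>

    // Illustrative only: a hypothetical per-render settings struct, not Chroma's API.
    struct RenderSettings {
        unsigned width;
        unsigned height;
        unsigned samplesPerPixel;   // more samples -> less noise, longer render
        unsigned maxPathDepth;      // how many bounces each ray may take
    };

    int main() {
        // A quick preview render: low sample count, shallow paths (~seconds).
        RenderSettings preview   { 1280,  720,   16, 3 };

        // A final, immersive render: high sample count, deeper paths (~minutes).
        RenderSettings immersive { 3840, 2160, 1024, 8 };

        std::printf("preview: %u spp, immersive: %u spp\n",
                    preview.samplesPerPixel, immersive.samplesPerPixel);
        return 0;
    }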

A variety of other customisable features, such as camera projection and visual effects, allow Chroma to generate images for different purposes such as marketing or product catalogue pages.
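
As a rough illustration of how these projections differ: only the way camera rays are generated changes, while the shading that follows is identical. The maths below is generic textbook ray generation, not Chroma’s implementation.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    constexpr float kPi = 3.14159265f;

    // Perspective: rays fan out from a single eye point through the image plane.
    // u and v are pixel coordinates remapped to [-1, 1].
    Vec3 perspectiveRayDir(float u, float v, float fovY, float aspect) {
        float h = std::tan(fovY * 0.5f);
        return { u * h * aspect, v * h, -1.0f };   // left unnormalised for brevity
    }

    // Orthographic: every ray shares one direction; u and v offset the ray origin instead.
    Vec3 orthographicRayDir() {
        return { 0.0f, 0.0f, -1.0f };
    }

    // Equirectangular ('360 panorama'): pixel coordinates map to spherical angles.
    Vec3 equirectangularRayDir(float u, float v) {
        float phi   = u * kPi;          // longitude, -pi..pi
        float theta = v * kPi * 0.5f;   // latitude,  -pi/2..pi/2
        return { std::cos(theta) * std::sin(phi),
                 std::sin(theta),
                -std::cos(theta) * std::cos(phi) };
    }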

A single instance of Chroma in the cloud can be used for a variety of purposes, reducing server and hosting costs for us and, ultimately, the retailer. For more information on how we manage the cloud side of Chroma, take a look at the latest blog from Mike, our Head of DevOps.

An orthographically projected image.

An equirectangular or ‘360 panorama’ image.

The Tech

Chroma is a C++ library built using Nvidia OptiX. While writing this blog post I managed to dig out the original technology research I carried out when deciding what software to build Chroma with; OptiX was the clear choice.

In a nutshell, Nvidia OptiX is a framework that enables GPU-accelerated ray-tracing. It does so by allowing the user to create CUDA programs that are executed at certain points within the ray-tracing pipeline, similar to how shaders work within OpenGL. This approach gives the developer a great deal of flexibility around the shading algorithms used to render a scene, while allowing OptiX to handle all of the complexities of scene structure and traversal.
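
To give a feel for that programming model, below is a heavily simplified, CPU-side analogue of the idea: the developer supplies small programs that are plugged into fixed points of the pipeline (ray generation, closest hit, miss), while the framework owns scene traversal. This is a conceptual sketch only; real OptiX programs are written as CUDA device code and the actual API looks quite different.

    #include <functional>
    #include <optional>

    struct Ray   { /* origin, direction, t-range ... */ };
    struct Hit   { /* hit point, normal, material id ... */ };
    struct Color { float r, g, b; };

    // A conceptual analogue of the programmable pipeline points: the developer
    // supplies these 'programs', the framework owns traversal and scheduling.
    struct PipelinePrograms {
        std::function<Ray(unsigned px, unsigned py)> rayGeneration; // one ray per pixel
        std::function<Color(const Ray&, const Hit&)> closestHit;    // shade a hit point
        std::function<Color(const Ray&)>             miss;          // ray hit nothing
    };

    // The framework-side loop. The traversal step (findClosestHit) is exactly the
    // part OptiX accelerates on the GPU and hides from the developer.
    Color tracePixel(const PipelinePrograms& programs,
                     const std::function<std::optional<Hit>(const Ray&)>& findClosestHit,
                     unsigned px, unsigned py) {
        Ray ray = programs.rayGeneration(px, py);
        if (auto hit = findClosestHit(ray))
            return programs.closestHit(ray, *hit);
        return programs.miss(ray);
    }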

A diagram of the programmable points in the OptiX pipeline. Source: https://devblogs.nvidia.com/nvidia-optix-ray-tracing-powered-rtx/

Chroma has two distinct methods for shading a scene. I won’t explain them in full detail as Monte Carlo sampling is a very large and complex topic, but here is a brief overview of our two shading algorithms:

  • Ray-tracer. The ray-trace route is the less complex of the two. Rays are fired from the camera through the pixels of the resulting image; intersections with geometry can then spawn additional rays for reflection and refraction. Diffuse lighting, however, is calculated using standard PBR approximations rather than by sampling additional rays. This method lacks global illumination but generates a high-quality image in a shorter amount of time.

  • Path-tracer. This route is a traditional, unbiased Monte Carlo method. Rays are fired from the camera as above. When a ray intersects a surface, irradiance is computed for the intersection point and the ray is reflected in some direction, creating a ‘path’. A weighting is also calculated at each intersection that determines how much of the current path’s radiance contributes to the final radiance of the pixel. The path continues to be traced until its weight becomes insignificant, at which point it is terminated (a simplified sketch of this loop is shown after the list). The resulting image is very high quality with full global illumination, but takes longer to converge.
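
To make the weighting and termination idea concrete, here is a stripped-down sketch of a path-tracing loop. The scene and material functions are placeholders and the termination threshold is arbitrary; this illustrates the general technique rather than Chroma’s code (a production path tracer would typically also add refinements such as Russian roulette).

    #include <algorithm>

    struct Vec3 { float x, y, z; };
    struct Ray  { Vec3 origin, direction; };
    struct Hit  { Vec3 position, normal; bool valid; };

    // Placeholder scene and material queries -- assumptions for illustration only.
    Hit  intersectScene(const Ray& ray);
    Vec3 emittedRadiance(const Hit& hit);
    // Samples an outgoing ray at the hit point and returns the factor
    // (BRDF * cosine / pdf) by which the path's weight should be scaled.
    Vec3 sampleBounce(const Hit& hit, const Ray& in, Ray& out);

    Vec3 tracePath(Ray ray) {
        Vec3 radiance  {0.0f, 0.0f, 0.0f};
        Vec3 throughput{1.0f, 1.0f, 1.0f};   // the per-path weight described above

        for (int bounce = 0; bounce < 16; ++bounce) {
            Hit hit = intersectScene(ray);
            if (!hit.valid) break;           // the ray escaped the scene

            // Add any light emitted at this point, scaled by the current weight.
            Vec3 e = emittedRadiance(hit);
            radiance = { radiance.x + throughput.x * e.x,
                         radiance.y + throughput.y * e.y,
                         radiance.z + throughput.z * e.z };

            // Reflect the ray in a sampled direction and update the weight.
            Ray next;
            Vec3 f = sampleBounce(hit, ray, next);
            throughput = { throughput.x * f.x,
                           throughput.y * f.y,
                           throughput.z * f.z };

            // Terminate once the path's contribution becomes insignificant.
            if (std::max({throughput.x, throughput.y, throughput.z}) < 1e-3f) break;

            ray = next;
        }
        return radiance;
    }

The ray-trace route, by contrast, replaces the sampled bounce with direct PBR approximations at the hit point, which is why it produces a clean image quickly but misses indirect (global) illumination.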

As with all software, Chroma is still a work in progress. We’re currently working on adding better environment lighting. This will enable us to render outside environments and add another level of realism to scenes with windows and patio doors.


If you have any questions about Chroma or DigitalBridge, get in touch with the team on info@digitalbridge.eu.

Ashleigh Thomas