Visualisation Series: Accurate Material Representation
In the booming world of augmented, virtual and mixed reality, products such as Amazon Sumerian are enabling people without a background in computer science or programming to create immersive experiences quickly. This is fantastic, as it lets people play with the technology and become invested in the whole mixed-reality concept. However, for businesses like DigitalBridge that are trying to push the bounds of this technology and create a truly unique and rich experience, there is a part of the software stack that is often neglected but essential to those aims; we call it visualisation.
Visualisation is a loosely defined term, but what we’re specifically talking about here is rendering. At DigitalBridge we take visualisation seriously: we believe it to be a big part of our technology, and it’s what really sets us apart from our competitors. Rendering is nothing new to the video-game world; in fact, games have been the principal driving force behind rendering for the past few decades. But as we emerge into this new world of immersive-reality technology, its development is becoming a major new contributor to visualisation. This is the first in a series of technical posts detailing different aspects of visualisation, including how we achieve the look we want and the challenges we face.
We assume the reader has some sort of grounding in basic rendering and lighting theory. If you don’t know the difference between diffuse and specular light you might want to do some background reading first.
In this post we’re going to talk about materials: the set of physical properties that determines how a particular surface interacts with light, and how we differentiate between materials such as steel and carpet. In a perfect world we’d program a completely accurate physical model of light transfer, but that would take days to render a single frame on a supercomputer, so instead we use equations that give us a decent approximation of light suitable for real-time interactive applications. One of the most widely used approximations to emerge in recent years is known as physically-based rendering (PBR), and we’ll outline how we implement it below.
At DigitalBridge we currently have two main rendering engines:
- The inventively-named “DigitalBridge SDK”: our main cross-platform renderer, used in our web-based real-time visualisation tool.
- Chroma: our non-real-time ray-tracer we use to render higher quality 360º panorama images.
Having multiple engines introduces the challenge of ensuring that materials are represented accurately across our entire rendering suite. We also need to ensure that different objects with the same material (such as a chrome tap or radiator) are rendered consistently throughout a scene.
To solve this problem we created a single material representation structure that contains values for properties such as diffuse colour, glossiness, index of refraction etc., and we use this universally across all of our engines. This means a material such as steel has one single definition (created by our art team), which is used to render all steel items across all engines.
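To make the idea concrete, here is a minimal sketch of what such an engine-agnostic material definition might look like. The field names and the steel values below are illustrative assumptions, not DigitalBridge’s actual schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Material:
    """Engine-agnostic material definition (illustrative fields only)."""
    name: str
    diffuse: tuple      # linear RGB diffuse colour
    specular: tuple     # linear RGB specular reflectance
    glossiness: float   # 0 = fully rough, 1 = mirror-smooth
    ior: float = 1.5    # index of refraction

# One shared definition, authored once and consumed by every renderer.
STEEL = Material(
    name="steel",
    diffuse=(0.0, 0.0, 0.0),        # metals have no diffuse response
    specular=(0.56, 0.57, 0.58),    # hypothetical reflectance values
    glossiness=0.8,
    ior=2.5,
)
```

Because the definition is immutable and lives in one place, a chrome tap and a chrome radiator can both reference the same object, and every engine reads identical values.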
There are always going to be some visual differences between the two renderers. For example, reflective surfaces in Chroma will appear brighter because they receive light contribution from the rest of the scene; this is especially visible on the tiles in the above images. Overall, though, we’re happy with the material consistency, and reflections in the DigitalBridge SDK are something we’re looking to improve in the future.
The PBR model is based on the idea that in reality no object is perfectly smooth: all surfaces have tiny imperfections we can’t see but which affect how light interacts with those objects. It would be too expensive to model all of these ‘microsurfaces’ but PBR gives an acceptable approximation of them. For a more in-depth explanation of PBR theory I’d suggest reading this Marmoset article: https://www.marmoset.co/posts/basic-theory-of-physically-based-rendering
Our main decisions with PBR were which distribution and geometry functions to use. After experimenting with a few, we decided that the combination of the GGX distribution and the Smith geometry term with the Schlick-GGX approximation worked best for us; this is a very common combination used by most major engines.
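For reference, these terms can be sketched as follows. This is a straightforward transcription of the standard GGX and Schlick-GGX Smith formulas (including the common roughness remapping for direct lighting popularised by Epic Games), not DigitalBridge’s shader code:

```python
import math

def ggx_distribution(n_dot_h: float, roughness: float) -> float:
    """GGX (Trowbridge-Reitz) normal distribution function D(h)."""
    a = roughness * roughness
    a2 = a * a
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

def schlick_ggx_g1(n_dot_v: float, k: float) -> float:
    """Schlick-GGX shadowing term for a single direction."""
    return n_dot_v / (n_dot_v * (1.0 - k) + k)

def smith_geometry(n_dot_v: float, n_dot_l: float, roughness: float) -> float:
    """Smith geometry term: product of the view and light G1 terms."""
    k = (roughness + 1.0) ** 2 / 8.0  # remapping for analytic lights
    return schlick_ggx_g1(n_dot_v, k) * schlick_ggx_g1(n_dot_l, k)
```

In a shader these run per-pixel, but the maths is identical: D concentrates the highlight according to roughness, and G accounts for microsurface self-shadowing.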
Brian Karis of Epic Games has a fantastic collection of the different equations in his blog. John Hable also has a great post about optimising GGX for shaders.
We also chose to implement the specular workflow rather than the metalness workflow. We aren’t particularly fussed about the extra texture space that metalness saves, and the specular workflow lets us represent materials in a slightly more physically accurate manner.
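The trade-off between the two workflows is easiest to see in the standard conversion from metalness parameters to specular ones. The sketch below assumes the conventional dielectric reflectance of 0.04 used by most engines; the helper name is ours:

```python
def metalness_to_specular(base_colour, metalness, dielectric_f0=0.04):
    """Convert metalness-workflow parameters to specular-workflow ones.

    Metals take their specular colour from the base colour and have no
    diffuse; dielectrics keep their base colour as diffuse and get a
    fixed, colourless specular reflectance.
    """
    diffuse = tuple(c * (1.0 - metalness) for c in base_colour)
    specular = tuple(dielectric_f0 * (1.0 - metalness) + c * metalness
                     for c in base_colour)
    return diffuse, specular
```

The specular workflow stores `diffuse` and `specular` directly, so it can express materials (for example, coloured dielectric reflectance) that the single metalness scalar cannot, at the cost of an extra RGB texture.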
Realistic Reflections in Chroma
PBR is a pretty simple way to ensure specular highlights have a nice amount of realism to them. This is very important in the DigitalBridge SDK because we don’t do refraction or environment-mapped reflections as we find them too expensive in a browser (geometry shaders please WebGL!), so the main visual cue of how ‘shiny’ something looks is the specular lighting term.
With Chroma we have a bit more freedom. While Chroma still has a time limit for its renders (we aim for under 30 seconds), we can fit much more realistic light calculation into that time. It’s also a ray tracer, which massively simplifies tasks such as reflection and refraction. Even Chroma has to make approximations, however: ideally we would send multiple reflection rays in different directions based on the roughness of the reflected surface, with some form of importance sampling, but this would be far too expensive within our 30-second limit. Another potential solution would be to apply very finely detailed normal maps to rough, highly specular surfaces, but in practice such normal maps are far too high-frequency and produce too much noise, or would require unrealistically large textures.
Our preferred approach would be to randomise the reflection normal by an amount determined by the surface roughness. However, our naive test implementation of this method was too inefficient, although there is scope to implement something similar in the future.
Instead we perform single-ray world reflection and use the product of roughness with the reflection ray result to give a very simple approximation of rough reflection. We also include a Fresnel term to increase reflectivity at grazing angles.
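A minimal sketch of this kind of approximation is shown below. We’re assuming here that the roughness weighting means smoother surfaces reflect more strongly (i.e. the traced colour is scaled by 1 − roughness); Chroma’s exact weighting may differ. The Fresnel term is Schlick’s well-known approximation:

```python
def schlick_fresnel(cos_theta: float, f0: float) -> float:
    """Schlick's approximation: reflectance rises towards 1 at grazing
    angles (cos_theta -> 0) from the base reflectance f0."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def rough_reflection(traced_colour, roughness, cos_theta, f0):
    """Single-ray rough-reflection approximation (our sketch): scale the
    colour returned by the one reflection ray by a roughness-derived
    factor and the Fresnel reflectance."""
    weight = (1.0 - roughness) * schlick_fresnel(cos_theta, f0)
    return tuple(c * weight for c in traced_colour)
```

This is obviously cruder than distributing many rays over the GGX lobe, but it costs a single trace per bounce, which is what makes it viable within a 30-second render budget.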
What we get is an acceptable approximation of specular reflectivity. The top images all have a linear specular value of 0.567 and no diffuse, representing a metal similar to iron. The bottom image showcases Fresnel on a high-roughness low-specular surface.
Ensure Realistic Material Values
One last but very important point about PBR: for materials to look correct, you need correctly calibrated material values! Switching to a PBR model means you need artists who truly understand the theory behind it. You can enforce conservation of energy in code, which protects against strange artefacts and over-saturation, but ultimately a renderer is only as good as the assets it has to render.
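One simple form that an in-code energy-conservation guard can take is clamping the diffuse term so that diffuse plus specular reflectance never exceeds one in any channel. This is a generic illustration of the idea, not our production code:

```python
def enforce_energy_conservation(diffuse, specular):
    """Clamp diffuse so diffuse + specular <= 1 per channel, ensuring a
    surface never reflects more light than it receives."""
    clamped_diffuse = tuple(min(d, max(0.0, 1.0 - s))
                            for d, s in zip(diffuse, specular))
    return clamped_diffuse, tuple(specular)
```

A guard like this catches badly calibrated assets (an artist setting both diffuse and specular near full brightness) before they reach the shader, but it’s a safety net, not a substitute for well-authored values.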
There are a lot of areas still to improve in both of our rendering engines, and we’ll document the major improvements in future posts like this one. But by adopting PBR we shouldn’t have to change how we define or perceive materials, and it sets a good precedent: features we introduce in the future should also have a physical basis to them.
Example Material Renders