Developments in real-time ray tracing techniques are opening up new possibilities in the VFX industry and revolutionising content creation.
Hardware manufacturers are creating devices that enable new capabilities in graphics, which can be utilised in interactive gaming, TV, VR and animation.
Ray tracing is exactly what it sounds like: the process of calculating how light interacts with objects and materials in computer-generated imagery. It is particularly important where there is movement, because every time a scene changes from one frame to the next, light reaches the viewer's eye differently.

(garagefarm.net, n.d.)
Previously, renders could only be completed on huge render farms. Now there is much less waiting time, which means design progression is faster, higher-quality lighting, reflections and shadows can be achieved, and even more realistic ambient occlusion and scattering are possible.
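To make the idea concrete, here is a minimal, purely illustrative sketch of the core loop (not taken from any of the cited sources, and with a made-up scene: one sphere, one light, a pinhole camera). One ray is cast per pixel, tested against the sphere, and the hit point is shaded with simple Lambertian lighting. A production renderer does the same kind of work for millions of rays, many bounces and far more complex materials every frame.

```python
# A minimal, purely illustrative ray tracer: one ray per pixel, one sphere,
# simple Lambertian (diffuse) shading. Scene values are made up for the demo.
import math

SPHERE_CENTRE = (0.0, 0.0, -3.0)     # a single sphere in front of the camera
SPHERE_RADIUS = 1.0
LIGHT_DIR = (0.577, 0.577, 0.577)    # roughly unit-length direction towards the light


def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance along the (unit-length) ray to the nearest hit, or None."""
    ox = origin[0] - centre[0]
    oy = origin[1] - centre[1]
    oz = origin[2] - centre[2]
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                   # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None


def render(width=48, height=24):
    """Print an ASCII image: brighter characters mean more light reaches the eye."""
    chars = " .:-=+*#%@"
    for y in range(height):
        row = ""
        for x in range(width):
            # Map the pixel to a ray direction through a simple pinhole camera.
            dx = (x / width) * 2.0 - 1.0
            dy = 1.0 - (y / height) * 2.0
            length = math.sqrt(dx * dx + dy * dy + 1.0)
            d = (dx / length, dy / length, -1.0 / length)
            t = ray_sphere_hit((0.0, 0.0, 0.0), d, SPHERE_CENTRE, SPHERE_RADIUS)
            if t is None:
                row += " "
                continue
            # The surface normal at the hit point drives the diffuse lighting term.
            n = tuple((d[i] * t - SPHERE_CENTRE[i]) / SPHERE_RADIUS for i in range(3))
            diffuse = max(0.0, sum(n[i] * LIGHT_DIR[i] for i in range(3)))
            row += chars[min(len(chars) - 1, int(diffuse * len(chars)))]
        print(row)


if __name__ == "__main__":
    render()
```

Running it prints a small ASCII-shaded sphere; the point is simply that every pixel's brightness comes from an explicit intersection and lighting calculation, which is exactly the work that real-time ray tracing hardware now accelerates.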

One example of this technology is the Turing architecture GPU created by leading manufacturer Nvidia, which “speeds up bounding volume hierarchy traversal and ray-triangle intersection testing”
(Meister et al., 2021)
This means that the GPU's multiprocessors can execute shading work faster, because they no longer need to spend instruction slots on each ray cast.
In other words, the heavy lifting of tracing light can be picked up by the graphics hardware usually needed for real-time modelling. Ray-traced lighting has traditionally been handled by the CPU due to the complexity of the calculations involved.
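To give a sense of what that hardware is accelerating, below is a sketch of a ray-triangle intersection test using the widely used Möller–Trumbore method (a standard technique, not code from the cited paper). A software renderer runs a test like this, guided by BVH traversal to skip irrelevant geometry, for every ray it casts; Turing performs the equivalent work in dedicated units.

```python
# Möller–Trumbore ray–triangle intersection: the kind of per-ray test that
# dedicated ray-tracing hardware accelerates. Purely illustrative sketch.
def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the distance t along the ray to the hit point, or None on a miss."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    edge1 = sub(v1, v0)
    edge2 = sub(v2, v0)
    h = cross(direction, edge2)
    a = dot(edge1, h)
    if abs(a) < eps:
        return None                   # ray is parallel to the triangle's plane
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None                   # hit point lies outside the triangle
    q = cross(s, edge1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(edge2, q)
    return t if t > eps else None     # reject hits behind the ray origin


# One ray fired straight down -z at a triangle sitting one unit in front of it.
hit = ray_triangle_intersect((0.2, 0.2, 0.0), (0.0, 0.0, -1.0),
                             (0.0, 0.0, -1.0), (1.0, 0.0, -1.0), (0.0, 1.0, -1.0))
print(hit)  # 1.0 – the ray hits the triangle one unit away
```

A scene can contain millions of triangles and a frame can require billions of such tests, which is why moving this work off the general-purpose shader cores matters so much.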
Turing’s tensor cores accelerate AI denoising, so users can create cleaner images more quickly. The viewer experience improves because the effects go unnoticed and ‘believability’ increases. This means that ray-traced lighting can be said to be happening in “real time”, just as it does in the real world.
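The denoisers that run on the tensor cores are trained neural networks. As a very rough, hypothetical stand-in for the idea only, the sketch below simply averages each pixel with its neighbours to suppress the speckle left by tracing only a few rays per pixel; a learned denoiser does the same job far better, preserving edges and fine detail while running in milliseconds.

```python
# Very rough stand-in for denoising (NOT the neural, tensor-core approach
# described above): average each pixel with its neighbours to smooth out the
# noise left by a low ray count per pixel.
import numpy as np


def box_denoise(image, radius=1):
    """Box-filter a 2D greyscale image by averaging a (2*radius+1)^2 window."""
    padded = np.pad(image, radius, mode="edge")
    out = np.zeros_like(image, dtype=float)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += padded[radius + dy: radius + dy + image.shape[0],
                          radius + dx: radius + dx + image.shape[1]]
    return out / (2 * radius + 1) ** 2


# Simulate a noisy low-sample render (a simple gradient plus noise) and smooth it.
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
noisy = np.clip(clean + rng.normal(0.0, 0.2, clean.shape), 0.0, 1.0)
denoised = box_denoise(noisy)
print("mean error before:", float(np.abs(noisy - clean).mean()))
print("mean error after: ", float(np.abs(denoised - clean).mean()))
```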
Here is a demo by Nvidia displaying the ray tracing capabilities of their GPUs.
The rendering process now just needs a single GPU with enough memory, and even huge, intricate scenes can be handled without long waiting times.
Uses:
· Enhances gaming experiences (e.g. reflective surfaces are more realistic)
· Animation pipelines – creators can work on lighting and geometry in real time
· Creating believable digital humans in film
High-fidelity ray tracing creates fantastic realism, but has traditionally been very time-intensive. When rendering things that have no interactive, dynamic component, such as a film or cinematic, this is not a problem. However, in VR and video games there is always a dynamic component, so advances in technologies that make the process more realistic in real time are very exciting.
(garagefarm.net, n.d.)
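For a sense of scale, a rough per-frame budget calculation (the frame rates are common targets, not figures from the cited sources): an offline film frame can take minutes or hours to render, whereas an interactive frame must be finished before the display's next refresh.

```python
# Rough per-frame time budgets for interactive rendering (illustrative targets only).
for label, fps in [("24 fps cinematic playback", 24),
                   ("60 fps game", 60),
                   ("90 fps VR headset", 90)]:
    print(f"{label}: {1000.0 / fps:.1f} ms to render the whole frame")
```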
A further development of this is its combination with another new technology in the world of VFX: (LED wall) virtual production, the process of using LED screens as backdrops on film sets instead of green screens. This involves real-time graphics being rendered interactively with actors on a set. In combination with technologies such as Unreal Engine, realistic scenes that change as needed can be used as backdrops to the actors, made possible by the developments in ray-tracing capabilities.
(fxguide, 2020)
References
garagefarm.net. (n.d.). How ray tracing impressively elevated the effects of 3D rendering. [online] Available at: https://garagefarm.net/blog/how-ray-tracing-has-elevated-the-already-impressive-effects-of-3d-rendering [Accessed 27 Jan. 2022].
Meister, D., Ogaki, S., Benthin, C., Doyle, M., Guthe, M. and Bittner, J. (2021). A Survey on Bounding Volume Hierarchies for Ray Tracing (State of The Art Report). Computer Graphics Forum, 40(2). [online] Available at: https://meistdan.github.io/publications/bvh_star/paper.pdf.
fxguide. (2020). Art of (LED Wall) Virtual Production Sets, Part Two: “How you make one.” [online] Available at: https://www.fxguide.com/fxfeatured/art-of-led-wall-virtual-production-sets-part-two-how-you-make-one/.