Aaron Lefohn’s Post

Another great collaboration between NVIDIA product engineers and researchers: transforming a SIGGRAPH 2021 paper into a game rendering product. This algorithm trains a neural network "live" during real-time rendering to asynchronously update a neural approximation of radiance throughout the scene. During rendering, the renderer traces short paths (1-2 bounces) and estimates the light that the full-length path would have found by using neural inference to perform a cache lookup. Live neural network training. Neural network inference during rendering. AI is transforming the rendering pipeline.
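The idea above, terminating short paths into a learned radiance cache, can be sketched in a few lines. This is a hypothetical illustration, not the SDK's API: the scene is reduced to a pre-intersected list of hits, and the "neural" lookup is a stub constant where a trained MLP would run inference.

```python
from dataclasses import dataclass

MAX_BOUNCES = 2  # trace only short paths, per the post


@dataclass
class Hit:
    position: tuple   # hit point in world space
    emitted: float    # radiance emitted at the hit point
    brdf_weight: float  # throughput factor for the sampled bounce


def neural_cache_lookup(position):
    """Stand-in for neural inference: a trained network would approximate
    the outgoing radiance at this point; here it returns a constant."""
    return 0.5


def shade(path_hits):
    """Accumulate radiance along a path, but after MAX_BOUNCES replace
    the remaining (possibly long) tail with a single cache query."""
    radiance, throughput = 0.0, 1.0
    for bounce, hit in enumerate(path_hits):
        radiance += throughput * hit.emitted
        if bounce + 1 >= MAX_BOUNCES:
            # Instead of tracing further bounces, ask the cache for the
            # light the rest of the path would have gathered.
            radiance += throughput * hit.brdf_weight * neural_cache_lookup(hit.position)
            break
        throughput *= hit.brdf_weight


    return radiance


# Two real bounces; the cache supplies the tail: 0.1 + 0.8*0.0 + 0.8*0.6*0.5
path = [Hit((0, 0, 0), 0.1, 0.8), Hit((1, 0, 0), 0.0, 0.6)]
print(shade(path))
```

The noise and cost reduction the post describes come from the `break`: a short traced prefix plus one cache lookup stands in for many extra ray-tracing bounces.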

Thomas Müller

Principal Research Scientist at NVIDIA

Our neural radiance cache is finally available as a game dev SDK! https://lnkd.in/dR23wSXA If you work on a real-time path tracer, this neural cache can shorten your paths and thereby reduce noise & cost. We've seen efficiency gains of up to 50x in scenes that are super tough for non-neural caches, like the one in the video. It's not "just" an implementation of the original research paper (https://lnkd.in/g5SZgfVu); it also comes with improvements, like Instant NGP's hash encoding (https://lnkd.in/dQC2f9Yp bottom of page) and turbo-charged neural networks.
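The hash encoding mentioned above can be sketched as follows. This is a toy illustration in the spirit of Instant NGP, not the SDK's implementation: table sizes and level counts are made up, and it does a nearest-corner lookup where the real encoding trilinearly interpolates the surrounding grid corners.

```python
# Toy multiresolution hash encoding (illustrative sizes, not the SDK's).
TABLE_SIZE = 2 ** 14  # feature entries per resolution level
N_LEVELS = 4
BASE_RES = 16
PRIMES = (1, 2654435761, 805459861)  # spatial-hashing primes from the Instant NGP paper


def hash_coords(coords):
    """Hash integer grid coordinates into a table index."""
    h = 0
    for c, p in zip(coords, PRIMES):
        h ^= c * p
    return h % TABLE_SIZE


def encode(position, tables):
    """Map a position in [0,1)^3 to per-level features: at each level the
    grid gets finer, and the containing cell indexes a hash table.
    (A full implementation interpolates the 8 surrounding corners.)"""
    features = []
    for level in range(N_LEVELS):
        res = BASE_RES * (2 ** level)
        cell = tuple(int(p * res) for p in position)
        features.append(tables[level][hash_coords(cell)])
    return features


# Dummy tables where every entry at a level holds the same value, just to
# show the lookup path; trained tables would hold learned feature vectors.
tables = [[0.01 * level] * TABLE_SIZE for level in range(N_LEVELS)]
print(encode((0.5, 0.25, 0.75), tables))
```

The concatenated per-level features are what feeds the small "turbo-charged" network; the coarse levels give smooth coverage while the fine levels resolve detail.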

