Real-time photorealistic rendering

Randomcontrol, the makers of fryrender, recently announced fryrenderRT, the first commercial unbiased render engine with real-time visualization capabilities. It ships with a player app that lets you move the camera freely within a pre-computed scene while view-dependent glossy surfaces continue to behave accurately; pre-computing a scene with unbiased lighting can take hours or even days.

Here’s a sneak peek showing off glossy materials:

From what I’ve gathered, it seems to work by storing global lighting information at each point (vertex/texel), much like the Precomputed Radiance Transfer (PRT) techniques currently used in games. These tend to focus on spherical harmonic (SH) encoding because it represents low-frequency lighting efficiently; it’s possible that RC’s technique is similar but instead encodes nonlinear wavelets or Gaussians, which have been shown to work well for all-frequency relighting.
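To make the PRT idea concrete, here is a minimal sketch of diffuse SH-based relighting. It assumes 3-band (9-coefficient) spherical harmonics and illustrative names; real engines pack and quantize these vectors, but the runtime cost is essentially one dot product per vertex per channel.

```python
import numpy as np

N_COEFFS = 9  # 3 SH bands: 1 + 3 + 5 basis functions

def relight(transfer, light):
    """Runtime relighting step of diffuse PRT.

    transfer: (n_vertices, N_COEFFS) precomputed per-vertex transfer
              vectors (they bake in visibility and cosine falloff).
    light:    (N_COEFFS, 3) RGB SH coefficients of the environment.
    Returns (n_vertices, 3) RGB exitant radiance.
    """
    return transfer @ light  # one dot product per vertex per channel

# Toy data: two vertices that only respond to the constant SH band,
# lit by a white environment with 0.5 energy in that band.
transfer = np.zeros((2, N_COEFFS))
transfer[:, 0] = 1.0
light = np.zeros((N_COEFFS, 3))
light[0, :] = 0.5
print(relight(transfer, light))  # each vertex gets 0.5 per channel
```

The expensive part (building `transfer` by integrating visibility against each basis function) happens offline; swapping `light` relights the scene instantly, which is what makes the real-time playback plausible.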

What does this mean for games?
- The video above runs on a mid-range ATI Radeon HD 4850 at 20 frames per second, which is very promising.
- The technique only handles static geometry, but light maps share the same limitation.
- Long offline rendering times already exist when developing games that rely on high-quality directional or SH-based lightmaps. A biased version of the offline renderer could reduce the hardware costs and still generate good-enough results for most games.
- Local memory requirements are certainly much higher than for lightmaps. However, more compact representations that trade quality for memory footprint could be used, e.g. partial wavelet coefficients.
- Local memory on GPUs will eventually outpace or be merged with system RAM. A virtual page-based approach could also help keep most of the data on disk.
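The “partial wavelet coefficients” idea above amounts to sparse approximation: keep only the few largest-magnitude coefficients per sample and drop the rest. A hypothetical sketch (the function name and array layout are my own, not RC’s):

```python
import numpy as np

def truncate_coeffs(coeffs, k):
    """Keep the k largest-magnitude entries of each row, zero the rest.

    coeffs: (n_samples, n_coeffs) per-vertex/texel lighting coefficients.
    Sparse storage of the survivors (value + index) would then cut the
    memory footprint roughly by n_coeffs / k, at some quality cost.
    """
    out = np.zeros_like(coeffs)
    idx = np.argsort(-np.abs(coeffs), axis=1)[:, :k]  # top-k per row
    rows = np.arange(coeffs.shape[0])[:, None]
    out[rows, idx] = coeffs[rows, idx]
    return out

coeffs = np.array([[0.9, -0.05, 0.4, 0.01]])
print(truncate_coeffs(coeffs, 2))  # [[0.9  0.   0.4  0. ]]
```

Because wavelet energy concentrates in few coefficients for most lighting environments, this kind of truncation degrades gracefully, which is what makes the quality-for-memory trade plausible.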

I believe this technique has real potential for use in games in the near future. It could very well be the next generation of light maps.