Inverse Rendering Methods for Hardware-Accelerated Display of Parameterized Image Spaces

Ziyad S. Hakura, Ph.D. dissertation, Stanford University, October 2001.

Abstract:

One of the central problems in computer graphics is real-time rendering of physically illuminated, dynamic environments. Though the computation needed is beyond current capability, specialized graphics hardware that renders texture-mapped polygons continues to get cheaper and faster. We exploit this hardware to decompress "animations" computed offline using a photorealistic image renderer. The decoded imagery retains the full gamut of stochastic ray tracing effects, including indirect lighting with reflections, refractions, and shadows.

Rather than by 1D time, our animations are parameterized by two or more arbitrary variables representing viewpoint positions, lighting changes, and object motions. To best match the graphics hardware rendering to the input ray-traced imagery, we describe a novel method to infer parameterized texture maps for each object by modeling the hardware as a linear system and then performing least-squares optimization. The parameterized textures are compressed as a multidimensional Laplacian pyramid on fixed-size blocks of parameter space. This scheme captures the coherence in animations and, unlike previous work, decodes directly into texture maps that load into hardware with a few simple image operations. High-quality results are demonstrated at compression ratios up to 800:1 with interactive playback on current consumer graphics cards.
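Because the hardware's bilinear texture filtering is linear in the texel values, a rendered image y relates to the unknown texture x by y = Ax, where each row of A holds one pixel's filter weights. The C++ sketch below is an illustration of this inference step and is not code from the dissertation: it assumes a single filter tap and color channel (the hypothetical Sample struct and inferTexture function) rather than the full mipmapped pipeline, and solves the normal equations A^T A x = A^T y by conjugate gradients.

    #include <cstddef>
    #include <vector>

    // One rendered pixel's footprint in the texture: the four texels under
    // its bilinear tap, their weights, and the ray-traced value to match.
    struct Sample {
        int   texel[4];   // indices of the 4 texels under the bilinear tap
        float w[4];       // bilinear weights (sum to 1)
        float value;      // ray-traced pixel value y_i (one channel)
    };

    // y = A x : apply the hardware filtering model to a candidate texture.
    static std::vector<float> applyA(const std::vector<Sample>& S,
                                     const std::vector<float>& x) {
        std::vector<float> y(S.size(), 0.0f);
        for (std::size_t i = 0; i < S.size(); ++i)
            for (int k = 0; k < 4; ++k)
                y[i] += S[i].w[k] * x[S[i].texel[k]];
        return y;
    }

    // r = A^T v : scatter pixel residuals back onto the texels.
    static std::vector<float> applyAT(const std::vector<Sample>& S,
                                      const std::vector<float>& v,
                                      std::size_t n) {
        std::vector<float> r(n, 0.0f);
        for (std::size_t i = 0; i < S.size(); ++i)
            for (int k = 0; k < 4; ++k)
                r[S[i].texel[k]] += S[i].w[k] * v[i];
        return r;
    }

    // Conjugate gradients on the normal equations A^T A x = A^T y.
    std::vector<float> inferTexture(const std::vector<Sample>& S,
                                    std::size_t numTexels, int iters = 50) {
        std::vector<float> y(S.size());
        for (std::size_t i = 0; i < S.size(); ++i) y[i] = S[i].value;

        std::vector<float> x(numTexels, 0.0f);
        std::vector<float> r = applyAT(S, y, numTexels); // b - Mx with x = 0
        std::vector<float> p = r;
        float rr = 0; for (float v : r) rr += v * v;

        for (int it = 0; it < iters && rr > 1e-12f; ++it) {
            std::vector<float> Mp = applyAT(S, applyA(S, p), numTexels);
            float pMp = 0;
            for (std::size_t j = 0; j < numTexels; ++j) pMp += p[j] * Mp[j];
            if (pMp <= 0.0f) break;
            float alpha = rr / pMp;
            for (std::size_t j = 0; j < numTexels; ++j) {
                x[j] += alpha * p[j];
                r[j] -= alpha * Mp[j];
            }
            float rr2 = 0; for (float v : r) rr2 += v * v;
            for (std::size_t j = 0; j < numTexels; ++j)
                p[j] = r[j] + (rr2 / rr) * p[j];
            rr = rr2;
        }
        return x;   // least-squares texture; clamp/quantize before upload
    }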

To enable plausible movement away from and between the pre-rendered viewpoint samples, we extend the idea of parameterized textures to parameterized environment maps. By segmenting the environment into layers and picking simple environmental geometry that closely matches the actual geometry of the environment, we better approximate how reflections move as the view changes. Unlike traditional environment maps, ours achieve local effects like self-reflections and parallax in the reflected imagery.
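As an illustration of why matching geometry helps, the following sketch (hypothetical code, not from the dissertation) uses the simplest possible proxy, a single sphere: instead of indexing the environment map by the reflected direction alone, which assumes an infinitely distant environment, it intersects the reflected ray with the proxy sphere and indexes by the direction from the sphere's center to the hit point, reproducing the parallax of nearby reflections.

    #include <cmath>

    struct Vec3 { float x, y, z; };
    static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3  normalize(Vec3 v) { return mul(v, 1.0f / std::sqrt(dot(v, v))); }

    // Direction used to index the environment map for a reflection leaving
    // surface point p along the (normalized) reflected direction d. The
    // proxy is a sphere of radius r centered at c that bounds this layer
    // of the environment; p is assumed to lie inside it.
    Vec3 proxyLookupDir(Vec3 p, Vec3 d, Vec3 c, float r) {
        Vec3 o = sub(p, c);                       // ray origin in sphere space
        float b = dot(o, d);
        float disc = b * b - (dot(o, o) - r * r);
        if (disc < 0.0f) return d;                // misses proxy: classic lookup
        float t = -b + std::sqrt(disc);           // far root: ray's exit point
        Vec3 hit = add(o, mul(d, t));             // hit point relative to center
        return normalize(hit);                    // index the env map by this
    }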

Finally, we introduce hybrid rendering, a scheme that dynamically ray traces the local geometry of refractive objects but approximates more distant geometry by layered, parameterized environment maps. To limit computation, we use a greedy ray path shading model that prunes the binary ray tree generated by refractive objects down to just two ray paths. We also restrict ray queries to triangle vertices, but perform adaptive tessellation to shoot additional rays where neighboring ray paths differ sufficiently. We demonstrate highly specular glass objects at a significantly lower and more predictable cost than full ray tracing, and anticipate that future support for local ray tracing in graphics hardware will make this approach ideal for real-time rendering of realistic reflective and refractive objects.
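Below is a minimal sketch of one plausible reading of the greedy ray path model; it assumes a unit glass sphere as the refractive object and an analytic sky gradient in place of the layered environment maps, and all names and constants are illustrative rather than the dissertation's. The eye ray splits once at the first hit, and each of the two resulting paths thereafter follows only the branch with the larger Fresnel weight, so exactly two paths are traced per pixel.

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3  operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3  norm(Vec3 v) { return v * (1.0f / std::sqrt(dot(v, v))); }

    struct Ray { Vec3 o, d; };

    static const float kGlassIOR = 1.5f;

    // A unit glass sphere at the origin stands in for the refractive object.
    static bool hitSphere(const Ray& r, float& t) {
        float b = dot(r.o, r.d), c = dot(r.o, r.o) - 1.0f;
        float disc = b * b - c;
        if (disc < 0) return false;
        float s = std::sqrt(disc);
        t = -b - s;
        if (t < 1e-3f) t = -b + s;
        return t > 1e-3f;
    }

    // Stand-in for a lookup into the layered environment maps: a gradient.
    static Vec3 envLookup(Vec3 d) { float k = 0.5f * (d.y + 1); return {k, k, 1.0f}; }

    static Vec3 reflectDir(Vec3 d, Vec3 n) { return d - n * (2 * dot(d, n)); }

    // Refraction with total-internal-reflection check; eta = n1/n2.
    static bool refractDir(Vec3 d, Vec3 n, float eta, Vec3& out) {
        float c = -dot(d, n), k = 1 - eta * eta * (1 - c * c);
        if (k < 0) return false;                  // total internal reflection
        out = d * eta + n * (eta * c - std::sqrt(k));
        return true;
    }

    // Schlick's approximation to the Fresnel reflectance.
    static float fresnel(Vec3 d, Vec3 n, float n1, float n2) {
        float r0 = (n1 - n2) / (n1 + n2); r0 *= r0;
        float c = 1 + dot(d, n);                  // 1 - cos(theta_i)
        return r0 + (1 - r0) * c * c * c * c * c;
    }

    // Follow one ray path greedily: at every interface after the first,
    // continue along the single branch with the larger Fresnel weight.
    static Vec3 traceGreedy(Ray r, bool inside, int maxBounces = 8) {
        for (int i = 0; i < maxBounces; ++i) {
            float t;
            if (!hitSphere(r, t)) return envLookup(r.d);  // escaped to env map
            Vec3 p = r.o + r.d * t;
            Vec3 n = inside ? norm(p) * -1.0f : norm(p);  // normal facing the ray
            float n1 = inside ? kGlassIOR : 1.0f, n2 = inside ? 1.0f : kGlassIOR;
            Vec3 tr;
            bool canRefract = refractDir(r.d, n, n1 / n2, tr);
            float kr = canRefract ? fresnel(r.d, n, n1, n2) : 1.0f;
            if (kr > 0.5f) { r = {p, reflectDir(r.d, n)}; }    // take reflection
            else           { r = {p, tr}; inside = !inside; }  // take refraction
        }
        return envLookup(r.d);
    }

    // Split once at the first hit into a reflection path and a refraction
    // path, trace each greedily, and blend by the first-hit Fresnel term.
    static Vec3 shade(const Ray& eye) {
        float t;
        if (!hitSphere(eye, t)) return envLookup(eye.d);
        Vec3 p = eye.o + eye.d * t, n = norm(p);
        Vec3 tr;
        refractDir(eye.d, n, 1.0f / kGlassIOR, tr);  // air->glass never TIRs
        float kr = fresnel(eye.d, n, 1.0f, kGlassIOR);
        Vec3 a = traceGreedy({p, reflectDir(eye.d, n)}, false);
        Vec3 b = traceGreedy({p, tr}, true);
        return a * kr + b * (1 - kr);
    }

    int main() {
        Vec3 c = shade({{0, 0, -3}, {0, 0, 1}});     // one eye ray through center
        std::printf("%.3f %.3f %.3f\n", c.x, c.y, c.z);
    }

In this reading, evaluating shade only at triangle vertices and interpolating, with extra rays shot where adjacent vertex paths diverge, would correspond to the vertex-restricted queries and adaptive tessellation described above.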
