Authors
Yuna Jeong
Description
This dissertation presents a GPU-based rendering algorithm for real-time defocus blur and bokeh effects, which significantly improve the perceptual realism of synthetic images and can be used to direct the viewer's attention. The defocus blur algorithm combines three distinctive techniques: (1) adaptive discrete geometric level of detail (LOD), made popping-free by blending visibility samples across the two adjacent geometric levels; (2) adaptive visibility/shading sampling via sample reuse; and (3) visibility supersampling via height-field ray casting. All three techniques are seamlessly integrated to lower the rendering cost of smooth defocus blur at high visibility sampling rates, while maintaining most of the quality of brute-force accumulation buffering. The author also presents a novel parametric model for expressive chromatic aberration in defocus blur rendering, together with an efficient implementation based on accumulation buffering. The model modifies the thin-lens model to incorporate axial and lateral chromatic aberrations, which allows them to be easily extended with nonlinear and artistic appearances beyond physical limits. To make the dispersion continuous, a novel unified 3D sampling scheme is employed that jointly samples the lens and the spectrum. Further, the author presents a spectral equalizer to emphasize particular dispersion ranges. As a
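To make the unified lens/spectrum sampling and the parametric aberration model more concrete, the sketch below shows one possible CPU-side setup for accumulation-buffering passes. It is a minimal illustration under assumed parametrizations: every name and constant here (LensSpectrumSample, focusDistanceFor, axialStrength, the 400-700 nm range, the Gaussian equalizer) is a hypothetical choice for exposition, not the dissertation's actual implementation.

```cpp
// Illustrative sketch only: joint lens/spectrum sampling plus a simple
// parametric axial/lateral chromatic aberration model. Names and
// parametrizations are assumptions, not the dissertation's code.
#include <cmath>
#include <random>
#include <vector>

struct LensSpectrumSample {
    float u, v;    // point on the unit lens aperture (thin-lens jitter)
    float lambda;  // wavelength in nanometers
};

// Unified 3D sampling idea: each accumulation pass draws one (u, v, lambda)
// triple, so dispersion varies continuously instead of per color channel.
std::vector<LensSpectrumSample> makeSamples(int count, unsigned seed = 7) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    std::vector<LensSpectrumSample> samples;
    samples.reserve(count);
    for (int i = 0; i < count; ++i) {
        float u, v;
        do {  // simple rejection sampling of the unit disk (illustrative)
            u = 2.0f * uni(rng) - 1.0f;
            v = 2.0f * uni(rng) - 1.0f;
        } while (u * u + v * v > 1.0f);
        float lambda = 400.0f + 300.0f * uni(rng);  // visible range, assumed
        samples.push_back({u, v, lambda});
    }
    return samples;
}

// Axial CA (assumed form): the in-focus distance drifts with wavelength.
// 'axialStrength' is a hypothetical artistic knob and may exceed what a
// physical lens would produce.
float focusDistanceFor(float lambda, float baseFocus, float axialStrength) {
    float t = (lambda - 550.0f) / 150.0f;  // 0 at green, about +/-1 at the ends
    return baseFocus * (1.0f + axialStrength * t);
}

// Lateral CA (assumed form): image magnification drifts with wavelength.
float magnificationFor(float lambda, float lateralStrength) {
    float t = (lambda - 550.0f) / 150.0f;
    return 1.0f + lateralStrength * t;
}

// Spectral equalizer (assumed form): boost a chosen wavelength band so its
// dispersion fringes stand out in the accumulated result.
float equalizerWeight(float lambda, float center, float width, float gain) {
    float d = (lambda - center) / width;
    return 1.0f + gain * std::exp(-0.5f * d * d);
}
```

In an accumulation-buffering renderer, each pass would build a thin-lens camera jittered by (u, v), render the scene focused at focusDistanceFor(lambda, ...), scale the image by magnificationFor(lambda, ...) to mimic lateral aberration, convert lambda to an RGB response, multiply by equalizerWeight(...), and add the weighted result to the accumulation buffer; the final image is the normalized sum over all passes. This is only one plausible way to realize the ideas summarized above.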