Abstract
This paper presents a new technique for real-time rendering of refraction and caustic effects. The algorithm directly renders complex objects represented by polygonal meshes without any precomputation, and allows the objects to be deformed dynamically through user interaction. Caustic patterns are rendered in depth texture space: we accurately trace the photon paths and compute the energy carried by the photons, so the caustic patterns are obtained without post-processing or temporal filtering over neighboring frames. Our technique handles both convex and concave objects. For convex objects, the ray-surface intersection is computed with a binary search; for concave objects, it is computed by a linear search followed by a binary-search refinement step. The caustics can also be rendered under non-uniform deformation of both the refractive object and the receiver surface, and the position and direction of the light and camera can be changed interactively.
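The abstract describes locating ray-surface intersections in depth texture space with a binary search (convex case) or a linear search followed by binary-search refinement (concave case). The CPU-side C++ sketch below illustrates that general search scheme against a surface stored as a depth texture; the data layout, function names, and step counts are illustrative assumptions, not the authors' implementation.

```cpp
// Sketch: ray vs. depth-texture intersection via linear search plus
// binary-search refinement.  All names and parameters are illustrative
// assumptions; this is not the paper's GPU implementation.
#include <cstdio>
#include <vector>
#include <algorithm>

struct Vec3 { float x, y, z; };

// Depth texture: one depth value in [0,1] per texel, addressed by (u,v).
struct DepthTexture {
    int w, h;
    std::vector<float> depth;                 // row-major, size w*h
    float sample(float u, float v) const {    // nearest-neighbour lookup
        int x = std::min(w - 1, std::max(0, int(u * w)));
        int y = std::min(h - 1, std::max(0, int(v * h)));
        return depth[y * w + x];
    }
};

// March a ray expressed in texture space (x=u, y=v, z=depth) against the
// depth texture.  A coarse linear search finds the first segment where the
// ray passes behind the stored surface (needed for concave surfaces); a
// binary search then refines the hit point.  For a convex surface the
// binary refinement alone would suffice.
bool intersectDepthTexture(const DepthTexture& tex, Vec3 origin, Vec3 dir,
                           Vec3* hit, int linearSteps = 32, int binarySteps = 8) {
    Vec3 prev = origin;
    float tStep = 1.0f / linearSteps;
    for (int i = 1; i <= linearSteps; ++i) {
        float t = i * tStep;
        Vec3 p = { origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t };
        if (p.z >= tex.sample(p.x, p.y)) {     // ray is now behind the surface
            // Refine between the last in-front and first behind sample.
            Vec3 a = prev, b = p;
            for (int k = 0; k < binarySteps; ++k) {
                Vec3 m = { 0.5f * (a.x + b.x), 0.5f * (a.y + b.y), 0.5f * (a.z + b.z) };
                if (m.z >= tex.sample(m.x, m.y)) b = m; else a = m;
            }
            *hit = b;
            return true;
        }
        prev = p;
    }
    return false;                              // no crossing along this ray segment
}

int main() {
    // Tiny synthetic depth texture: a slanted plane, depth rising with u.
    DepthTexture tex{64, 64, std::vector<float>(64 * 64)};
    for (int y = 0; y < 64; ++y)
        for (int x = 0; x < 64; ++x)
            tex.depth[y * 64 + x] = 0.2f + 0.6f * (x / 63.0f);

    Vec3 hit;
    // Ray entering at (u,v,depth) = (0.1, 0.5, 0.0), moving across and downward.
    if (intersectDepthTexture(tex, {0.1f, 0.5f, 0.0f}, {0.8f, 0.0f, 1.0f}, &hit))
        std::printf("hit at u=%.3f v=%.3f depth=%.3f\n", hit.x, hit.y, hit.z);
    return 0;
}
```

In a GPU shader the same loop would sample the depth texture directly; the CPU version here only serves to make the two-stage search concrete.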
Citation
Sheng, B., Sun, H., Liu, B. and Wu, E. (2009) 'GPU-based refraction and caustics rendering on depth textures', ACM International Conference on Virtual Reality Continuum and its Applications (ACM VRCIA 2009). New York: ACM, pp. 139-144.

Publisher
ACM

Additional Links
http://portal.acm.org/citation.cfm?doid=1670252.1670282

Type
Conference papers, meetings and proceedings

Language
en

ISBN
9781605589121

DOI
10.1145/1670252.1670282
