Voxel-based real-time global illumination

Cosin Ayerbe, Alejandro
In this PhD thesis we work in the area of real-time global illumination, proposing three algorithms that advance the state of the art in this direction. Several families of techniques exist within this area, based on research published between the early 1990s and the early 2010s. Our work relies on two of them: voxel-based and irradiance-volume-based techniques.

Voxel-based techniques use the voxel, the 3D equivalent of a texel or texture element, as their basic primitive, representing the scene as a set of cubic blocks. Each block can carry properties of the geometry contained within its enclosing volume, such as its reflectance or normal direction, and can also serve as a scene proxy in which to store information about the behavior of light in the scene. By storing that information we also incorporate irradiance-volume elements. In this family of techniques, global illumination is represented by a grid of points at which the light arriving from all directions (the irradiance) is computed, accounting in this case only for indirect light. When a triangle inside the 3D volume spanned by a set of grid points needs its irradiance, the irradiance values at the surrounding grid points are interpolated. We store indirect-light irradiance in voxels generated from the scene geometry, and compute the indirect light at each triangle by interpolating the irradiance of the closest voxels around it.

The number of voxels generated for a scene can be prohibitively high. In our first algorithm we therefore apply clustering techniques to the voxel representation of the scene.
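The irradiance interpolation described above can be sketched as follows. This is an illustrative sketch, not the thesis implementation: the function name, the scalar irradiance representation, and the cell-corner layout are all assumptions.

```python
# Trilinear interpolation of irradiance stored at the eight corners of the
# grid/voxel cell that encloses a shading point.

def trilerp_irradiance(corner_vals, fx, fy, fz):
    """corner_vals[i][j][k] is the irradiance at cell corner (i, j, k);
    (fx, fy, fz) are the point's fractional coordinates inside the cell."""
    def lerp(a, b, t):
        return a + (b - a) * t

    # Collapse the x axis first, then y, then z.
    c00 = lerp(corner_vals[0][0][0], corner_vals[1][0][0], fx)
    c10 = lerp(corner_vals[0][1][0], corner_vals[1][1][0], fx)
    c01 = lerp(corner_vals[0][0][1], corner_vals[1][0][1], fx)
    c11 = lerp(corner_vals[0][1][1], corner_vals[1][1][1], fx)
    c0 = lerp(c00, c10, fy)
    c1 = lerp(c01, c11, fy)
    return lerp(c0, c1, fz)
```

In practice each corner would hold an RGB irradiance value (or a directional representation), but the weighting scheme is the same per channel.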
Clustering limits how many entities must be considered when computing the irradiance at a voxel; it lets a cluster store the irradiance arriving from the emitter at the voxels that form it, so that voxels outside the cluster can reuse this information; and it offers a multiresolution approach in which irradiance computations for distant sets of voxels use only the irradiance information of the clusters that contain them.

Our second algorithm follows a similar but simplified approach: each voxel uses ray tracing to find other voxels containing geometry visible from the emitter and computes the irradiance arriving from them. We follow a divide-and-conquer strategy in which rays are traced each frame in a scene representation containing only the dynamic objects, while the static objects detected from each voxel are initially cached by tracing rays in a representation containing only the static objects (the voxel's static-scene visibility), lowering the computational cost of ray tracing when performing updates. The algorithm supports dynamic scene objects by generating, every frame, a voxelized representation of only the dynamic objects and computing irradiance for those voxels as well as for the voxels in a neighborhood around them, avoiding temporal instability for any dynamic object. As shown in our numerous test scenes, this second algorithm achieves much higher quality than comparable academic techniques and state-of-the-art commercial ones.

Our third algorithm extends the second one with glossy and specular reflections, offering real-time global illumination on dynamic scenes with a complete material model for opaque surfaces.
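The static/dynamic split in the second algorithm can be sketched as below. This is a minimal sketch under assumed names (`trace_static`, `trace_dynamic`, `visible_voxels` are hypothetical), not the thesis code: static-scene visibility is traced once per voxel and cached, while only the dynamic-object representation is re-traced every frame.

```python
# Divide-and-conquer visibility for a voxel: cache hits against the
# static-only scene, re-trace only the dynamic-only scene each frame.

class VoxelVisibility:
    def __init__(self, trace_static, trace_dynamic):
        self._trace_static = trace_static    # rays vs. static-only scene
        self._trace_dynamic = trace_dynamic  # rays vs. dynamic-only scene
        self._static_cache = {}              # voxel id -> visible static voxels

    def visible_voxels(self, voxel_id, rays):
        # Static-scene visibility is computed once and reused on later frames.
        if voxel_id not in self._static_cache:
            self._static_cache[voxel_id] = self._trace_static(voxel_id, rays)
        # Dynamic objects may have moved, so they are traced every frame.
        dynamic_hits = self._trace_dynamic(voxel_id, rays)
        return self._static_cache[voxel_id] | dynamic_hits
```

The design choice this illustrates is that the per-frame ray-tracing cost scales with the (usually small) dynamic part of the scene rather than with the whole scene.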
Access to the contents of this thesis is subject to acceptance of the terms of use established by the following Creative Commons license: http://creativecommons.org/licenses/by-sa/4.0/