We present an approach for real-time rendering of complex 3D scenes consisting of millions of polygons on limited graphics hardware. In a preprocessing step, powerful hardware is used to compute fine-grained global visibility information for a scene with an adaptive sampling algorithm. Additionally, the visual influence of each object on the final rendered image is estimated. This influence is used to select the most important objects to display in our approximate culling algorithm. After the visibility data is compressed to fit the storage capabilities of small devices, we achieve an interactive walkthrough of the Power Plant scene on a standard netbook with an integrated graphics chipset. © 2010 Springer-Verlag.
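The selection step the abstract describes can be illustrated with a minimal sketch: given per-object influence estimates from the preprocessing phase, greedily keep the highest-influence objects that fit a per-frame triangle budget. All names and values here are hypothetical assumptions for illustration; the paper's actual data structures and influence metric are not shown in the abstract.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    triangle_count: int
    influence: float  # assumed precomputed estimate of contribution to the image

def select_objects(visible, triangle_budget):
    """Greedily keep the highest-influence visible objects within the budget."""
    chosen, used = [], 0
    for obj in sorted(visible, key=lambda o: o.influence, reverse=True):
        if used + obj.triangle_count <= triangle_budget:
            chosen.append(obj)
            used += obj.triangle_count
    return chosen

# Hypothetical objects from a visibility query for the current viewpoint.
objects = [
    SceneObject("pipe_cluster", 50_000, 0.9),
    SceneObject("catwalk", 30_000, 0.4),
    SceneObject("bolt_detail", 80_000, 0.1),
]
kept = select_objects(objects, triangle_budget=90_000)
print([o.name for o in kept])  # → ['pipe_cluster', 'catwalk']
```

A greedy cutoff like this drops only the objects estimated to change the image least, which is the trade-off an approximate culling scheme accepts to stay within the rendering budget of weak hardware.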
CITATION STYLE
Eikel, B., Jähn, C., & Fischer, M. (2010). Preprocessed global visibility for real-time rendering on low-end hardware. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6453 LNCS, pp. 622–633). https://doi.org/10.1007/978-3-642-17289-2_60