Range sensors simulation using GPU ray tracing


Abstract

In this paper, GPU-accelerated simulation of range sensors is discussed. Range sensors generate large amounts of data per second, so simulating them requires a high-performance approach. We propose to use parallel ray tracing on graphics processing units (GPUs) to improve the performance of range sensor simulation. Multiple range sensors are described and simulated using the NVIDIA OptiX ray tracing engine. This work focuses on the performance of GPU-accelerated range image simulation in complex environments. The proposed method is tested on several state-of-the-art ray tracing datasets. The software is publicly available as an open-source project, SensorSimRT.
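The core idea, casting one ray per sensor beam against the scene geometry and reading back the hit distances, can be illustrated with a minimal CPU sketch. The paper itself performs this step on the GPU with NVIDIA OptiX; the single-sphere scene, the scan parameters, and all identifiers below are illustrative assumptions, not SensorSimRT code.

// Minimal CPU sketch of range-sensor simulation by ray casting.
// Illustrative only: a horizontal 2D LiDAR scan against one analytic
// sphere; the paper replaces this with OptiX BVH traversal on the GPU.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Ray-sphere intersection: returns the nearest hit distance along a
// unit-length ray direction, or a negative value on a miss.
float intersectSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
    float b = oc.x * dir.x + oc.y * dir.y + oc.z * dir.z;
    float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - radius * radius;
    float disc = b * b - c;
    if (disc < 0.0f) return -1.0f;
    return -b - std::sqrt(disc);
}

int main() {
    const float kPi = 3.14159265f;
    const int   numBeams = 360;                    // one beam per degree
    const float maxRange = 30.0f;                  // sensor range limit [m]
    const Vec3  sensorOrigin = { 0.0f, 0.0f, 0.0f };
    const Vec3  sphereCenter = { 5.0f, 0.0f, 0.0f };
    const float sphereRadius = 1.0f;

    std::vector<float> ranges(numBeams, maxRange); // misses report max range

    // Each beam is independent; this loop is what maps naturally onto
    // one GPU thread per ray in a ray tracing engine such as OptiX.
    for (int i = 0; i < numBeams; ++i) {
        float angle = 2.0f * kPi * float(i) / float(numBeams);
        Vec3 dir = { std::cos(angle), std::sin(angle), 0.0f };
        float t = intersectSphere(sensorOrigin, dir, sphereCenter, sphereRadius);
        if (t > 0.0f && t < maxRange) ranges[i] = t;
    }

    std::printf("range at 0 deg: %.2f m\n", ranges[0]); // ~4.00 m
    return 0;
}

Because every beam is independent, the per-beam loop parallelizes trivially: a GPU ray tracing engine launches one thread per ray and amortizes scene traversal through a shared acceleration structure, which is what makes the approach scale to complex environments.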


CITATION STYLE

APA

Majek, K., & Bedkowski, J. (2016). Range sensors simulation using GPU ray tracing. In Advances in Intelligent Systems and Computing (Vol. 403, pp. 831–840). Springer Verlag. https://doi.org/10.1007/978-3-319-26227-7_78
