Efficient Representation in 3D Environment Modeling for Planetary Robotic Exploration

  • Narunas Vaskevicius
  • Andreas Birk
  • Kaustubh Pathak
  • Sören Schwertfeger

Good situation awareness is an absolute must when operating mobile robots for planetary exploration. 3D sensing and modeling data gathered by the robot are hence crucial for the operator. But standard methods based on stereo vision have their limitations, especially in scenarios with no or only very limited visibility, e.g., due to extreme light conditions. 3D Laser Range Finders (3D-LRF) provide an interesting alternative, especially as they can deliver very accurate, high-resolution data at very high sampling rates. But the more 3D range data is acquired, the harder it becomes to transmit the data to the operator station. Here, a fast and robust method to fit planar surface patches to the data is presented. The usefulness of the approach is demonstrated in two different sets of experiments. The first set is based on data from our participation in the ESA Lunar Robotics Challenge 2008. The second one is based on data from a Velodyne 3D-LRF in a high-fidelity simulation with ground-truth data from Mars.
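The core idea of the abstract — compressing dense 3D range data by fitting planar surface patches — can be illustrated with a minimal least-squares plane fit. The sketch below is a generic PCA/SVD-based fit, not the authors' actual algorithm; the function name and test data are illustrative assumptions.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of 3D points.

    Returns (centroid, unit_normal). Illustrative stand-in for
    planar-patch fitting, not the paper's specific method.
    """
    centroid = points.mean(axis=0)
    # The right singular vector for the smallest singular value of the
    # centered point cloud is the direction of least variance, i.e. the
    # plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal

# Example: noisy range samples of the plane z = 0.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-1.0, 1.0, 200),
    rng.uniform(-1.0, 1.0, 200),
    rng.normal(0.0, 0.01, 200),   # small sensor noise along z
])
c, n = fit_plane(pts)
```

Once a patch is fitted, only the centroid, normal, and patch boundary need to be transmitted instead of every raw range sample, which is what makes such representations attractive for limited-bandwidth links to an operator station.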

Author-supplied keywords

  • 3D mapping
  • plane fitting
  • planetary exploration
  • space robotics
  • surface representation
