Airborne laser scanning (ALS) data are among the most widely used sources for terrain product generation, and filtering ground points is a prerequisite step in ALS data processing. Traditional filtering methods mainly rely on handcrafted features or predefined classification rules, combined with preprocessing and post-processing operations, to filter ground points iteratively, a process that is empirical and cumbersome. Deep learning offers a new approach to classification and segmentation problems because of its ability to learn features automatically, and it has been widely adopted in many fields, particularly remote sensing. In this article, we proposed a point-based fully convolutional neural network (PFCN) that directly consumes points carrying only geometric information and extracts both point-wise and tile-wise features to classify each point. The network was trained on 37,449,157 points from 14 sites and evaluated on 6 sites in various forested environments. The method was compared with five widely used filtering methods and one of the best point-based deep learning methods (PointNet++). Results showed that the PFCN achieved the best performance in terms of mean omission error (T1 = 1.10%), total error (Te = 1.73%), and Kappa coefficient (93.88%), but ranked second for the root mean square error of the digital terrain model owing to its highest commission error. Our method was on par with, or even better than, PointNet++ in accuracy, while requiring only one-third of the computational resources and one-seventh of the training time. We believe that the PFCN is a simple and flexible method that can be widely applied to ground point filtering.
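The core idea described in the abstract, a shared per-point transformation combined with a tile-wise (global) feature before per-point classification, can be illustrated with a minimal sketch. This is not the authors' PFCN architecture; it is a hypothetical NumPy forward pass with random, untrained weights, showing only how point-wise features and a pooled tile-wise feature might be concatenated to label each point as ground or non-ground.

```python
import numpy as np

rng = np.random.default_rng(0)


def relu(x):
    return np.maximum(x, 0.0)


def pointwise_tilewise_sketch(points, n_classes=2):
    """Hypothetical forward pass: a shared per-point MLP produces
    point-wise features, max pooling over the tile produces a tile-wise
    feature, and a per-point head classifies the concatenation.
    Weights are random placeholders, not trained parameters."""
    n, _ = points.shape                          # (N, 3): xyz only
    w1 = rng.standard_normal((3, 64)) * 0.1      # shared point-wise weights
    w2 = rng.standard_normal((128, n_classes)) * 0.1

    point_feat = relu(points @ w1)               # (N, 64) point-wise features
    tile_feat = point_feat.max(axis=0)           # (64,) tile-wise feature
    combined = np.concatenate(                   # (N, 128) per point
        [point_feat, np.tile(tile_feat, (n, 1))], axis=1
    )
    logits = combined @ w2                       # (N, n_classes)
    return logits.argmax(axis=1)                 # label per point


labels = pointwise_tilewise_sketch(rng.standard_normal((1000, 3)))
print(labels.shape)
```

The key design choice mirrored here is that the tile-wise feature is broadcast back to every point, so each point is classified with both local geometry and tile-level context, which is what lets such a network separate ground returns from vegetation returns.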
Citation:
Jin, S., Su, Y., Zhao, X., Hu, T., & Guo, Q. (2020). A Point-Based Fully Convolutional Neural Network for Airborne LiDAR Ground Point Filtering in Forested Environments. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 13, 3958–3974. https://doi.org/10.1109/JSTARS.2020.3008477