Abstract
Given remote sensing datasets over a spatial domain, we aim to detect geospatial objects with minimum bounding rectangles (i.e., angle-aware) using deep learning frameworks. Geospatial objects (e.g., buildings, vehicles, farms) provide meaningful information for a variety of societal applications, including urban planning, census, sustainable development, security surveillance, and agricultural management. Detecting these objects is challenging because their directions are often heavily mixed and, due to topography, planning, and other factors, not parallel to the orthogonal directions of the image frame. In addition, for most types of objects there is very limited training data with angle information. In related work, state-of-the-art deep learning frameworks detect objects using orthogonal bounding rectangles (i.e., rectangles whose sides are parallel to the sides of the input image), so they cannot identify the directions of objects and produce loose rectangular bounds around them. We propose an Unsupervised Augmentation (UA) framework to detect geospatial objects with general minimum bounding rectangles (i.e., with angles). The UA framework contains two schemes, namely a ROtation-Vector (ROV) based scheme and a context-based scheme. Both schemes completely avoid the need for: (1) additional ground-truth data with annotated angles; (2) restructuring of existing network architectures; and (3) re-training. Experimental results show that the UA framework closely approximates the angles of objects and generates much tighter bounding boxes around them.
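To make the contrast between orthogonal and angle-aware bounds concrete, the sketch below compares an axis-aligned bounding box with a rotated minimum-area rectangle for a tilted object footprint. This is a minimal illustration assuming OpenCV and NumPy are available; it is not the paper's ROV or context-based scheme, and the object geometry is hypothetical.

```python
# Minimal sketch (assumes OpenCV + NumPy): axis-aligned vs. rotated
# minimum-area bounding rectangle for a tilted object footprint.
# This is NOT the paper's UA framework; it only illustrates why
# angle-aware rectangles bound oriented objects more tightly.
import cv2
import numpy as np

# Hypothetical footprint: a 40x10 rectangle rotated by 30 degrees.
theta = np.deg2rad(30.0)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
corners = np.array([[-20, -5], [20, -5], [20, 5], [-20, 5]], dtype=np.float32)
points = (corners @ rot.T + 100.0).astype(np.float32)  # shift into image coords

# Axis-aligned bounding box: what standard detectors output.
x, y, w, h = cv2.boundingRect(points)
aabb_area = w * h

# Minimum-area (rotated) rectangle: center, (width, height), and angle.
(_, _), (rw, rh), angle = cv2.minAreaRect(points)
mar_area = rw * rh

print(f"axis-aligned box area: {aabb_area:.0f}")
print(f"rotated min-rect area: {mar_area:.0f} (angle = {angle:.1f} deg)")
```

For this 30-degree tilt, the axis-aligned box covers roughly 1100 square pixels while the rotated rectangle covers about 400, and only the latter recovers the object's orientation, which is the gap the UA framework addresses without extra annotations or re-training.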
Citation
Xie, Y., Bhojwani, R., Shekhar, S., & Knight, J. (2018). An unsupervised augmentation framework for deep learning based geospatial object detection: A summary of results. In GIS: Proceedings of the ACM International Symposium on Advances in Geographic Information Systems (pp. 349–358). Association for Computing Machinery. https://doi.org/10.1145/3274895.3274901