Science missions have limited lifetimes, necessitating an efficient investigation of the field site. The efficiency of onboard cameras, critical for planning, is limited by the need to downlink images to Earth for every decision. Recent advances have enabled rovers to take follow-up actions without waiting hours or days for new instructions. We propose using built-in processing by the instrument itself for adaptive data collection, faster reconnaissance, and increased mission science yield. We have developed a machine learning pixel classifier that is sensitive to texture differences in surface materials, enabling more sophisticated onboard classification than was previously possible. This classifier can be implemented in a Field Programmable Gate Array (FPGA) for maximal efficiency and minimal impact on the rest of the system's functions. In this paper, we report on initial results from applying the texture-sensitive classifier to three example analysis tasks using data from the Mars Exploration Rovers.

KEY POINTS
- Smart instruments can analyze their own data for science investigations
- Random forest classifiers can effectively address texture-based image tasks
- Future spacecraft can train and deploy their own image classifiers

© 2013. American Geophysical Union. All Rights Reserved.
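The key point that random forest classifiers can address texture-based image tasks can be illustrated with a minimal sketch of a texture-sensitive pixel classifier. This is not the authors' onboard implementation; it assumes scikit-learn and SciPy are available, and the feature set (intensity, local mean, local standard deviation, gradient magnitude) and window size are illustrative choices only.

```python
# Minimal sketch of a texture-sensitive random forest pixel classifier.
# Assumptions: scikit-learn/SciPy available; feature set and window size
# are illustrative, not the published pipeline.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def texture_features(image, size=5):
    """Stack simple per-pixel texture features: raw intensity, local mean,
    local standard deviation, and gradient magnitude."""
    image = image.astype(np.float64)
    local_mean = ndimage.uniform_filter(image, size=size)
    local_sq_mean = ndimage.uniform_filter(image ** 2, size=size)
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0.0))
    grad_mag = ndimage.gaussian_gradient_magnitude(image, sigma=1.0)
    return np.stack([image, local_mean, local_std, grad_mag], axis=-1)

def train_pixel_classifier(image, labels, n_trees=50):
    """Train on a labeled image; `labels` holds a class id per pixel
    (0 = unlabeled, ignored during training)."""
    feats = texture_features(image)
    mask = labels > 0
    clf = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    clf.fit(feats[mask], labels[mask])
    return clf

def classify_image(clf, image):
    """Assign a class to every pixel of a new image."""
    feats = texture_features(image)
    h, w, d = feats.shape
    return clf.predict(feats.reshape(-1, d)).reshape(h, w)
```

A forest of shallow decision trees over such per-pixel features is attractive for onboard use because inference reduces to simple threshold comparisons, which maps naturally onto FPGA logic.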
CITATION STYLE
Wagstaff, K. L., Thompson, D. R., Abbey, W., Allwood, A., Bekker, D. L., Cabrol, N. A., … Ortega, K. (2013). Smart, texture-sensitive instrument classification for in situ rock and layer analysis. Geophysical Research Letters, 40(16), 4188–4193. https://doi.org/10.1002/grl.50817