Autonomous Monitoring of Wildfires with Vision-Equipped UAS and Temperature Sensors via Evidential Reasoning

  • Soderlund A
  • Kumar M

Abstract

This chapter describes a novel approach for autonomously estimating the state of an evolving wildfire in real time. Information fusion begins with the geospatial probabilistic state of the current fire spread, which is propagated forward to a forecast wildfire distribution; the prior belief state is then updated with observations from two distinct types of sensing agents: (i) static temperature sensors embedded on the surface, and (ii) mobile vision sensors mounted on unmanned aerial vehicles (UAVs). The guidance and information-fusion methods that direct the mobile vision sensors to useful sensing locations rely on the Dempster-Shafer theory of probabilistic evidential reasoning. The recursive, bi-directional feedback between the mobile vision agents and the forecasting operation exemplifies the Dynamic Data Driven Applications Systems (DDDAS) framework and is evaluated in simulation on its ability to estimate an expanding wildfire in real time. The results show a definitive improvement when mobile sensors are included versus conventional weather-based forecasts alone, and several enhancements that could strengthen the current UAV path-planning algorithm are noted.
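To make the evidential-reasoning ingredient concrete, below is a minimal sketch of Dempster's rule of combination, the core fusion operation in Dempster-Shafer theory. This is a generic illustration, not the chapter's actual algorithm: the two-element frame `{fire, no_fire}`, the sensor names, and the mass values are all hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two Dempster-Shafer mass functions via Dempster's rule.

    Each mass function is a dict mapping focal elements (frozensets over
    the frame of discernment) to masses that sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; cannot combine.")
    # Normalize by (1 - K), redistributing the conflicting mass
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

# Hypothetical example: for one grid cell, a ground temperature sensor
# and a UAV vision sensor each report a mass function over {fire, no_fire}.
m_temp = {frozenset({'fire'}): 0.6,
          frozenset({'fire', 'no_fire'}): 0.4}   # 0.4 left uncommitted
m_vision = {frozenset({'fire'}): 0.5,
            frozenset({'no_fire'}): 0.3,
            frozenset({'fire', 'no_fire'}): 0.2}
combined = dempster_combine(m_temp, m_vision)
```

Unlike a Bayesian update, the uncommitted mass on the full frame `{fire, no_fire}` lets each sensor express ignorance explicitly, which is what makes the formalism attractive for directing UAVs toward poorly observed regions.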

Citation (APA)

Soderlund, A., & Kumar, M. (2023). Autonomous Monitoring of Wildfires with Vision-Equipped UAS and Temperature Sensors via Evidential Reasoning. In Handbook of Dynamic Data Driven Applications Systems (pp. 475–524). Springer International Publishing. https://doi.org/10.1007/978-3-031-27986-7_18
