Recently, small UAVs have seen increasing use by agricultural producers for monitoring agricultural land in order to improve crop yields. However, correctly interpreting the collected imagery remains a challenging task. In this study, an automated pipeline for monitoring C. annuum crops based on a deep learning model is implemented. The system is capable of inferring the health status of individual plants and of determining their locations and shapes in a georeferenced orthomosaic. Accuracy achieved on the classification task was 94.5%. AP values among classes were in the range of (Formula presented.) for plant location boxes, and in (Formula presented.) for foliar area predictions. The methodology requires only RGB images, so it can be replicated for monitoring other types of crops using only consumer-grade UAVs. A comparison with random forest and large-scale mean shift segmentation methods, which use predetermined features, is presented. NDVI results obtained with multispectral equipment are also included.
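The abstract contrasts the RGB-only deep learning pipeline with NDVI results from multispectral equipment. For reference, NDVI is the standard ratio (NIR − Red) / (NIR + Red); a minimal sketch of that computation over per-pixel reflectance arrays is shown below (the function name and the example reflectance values are illustrative, not from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    nir, red: arrays of near-infrared and red reflectance values.
    eps guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR, so NDVI approaches +1;
# bare soil or stressed plants yield values near zero.
healthy = ndvi([0.50], [0.08])
soil = ndvi([0.30], [0.25])
print(healthy, soil)
```

Computing NDVI requires a near-infrared band, which is why the multispectral comparison in the study needs dedicated equipment, whereas the proposed method works with consumer-grade RGB cameras.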
Sosa-Herrera, J. A., Alvarez-Jarquin, N., Cid-Garcia, N. M., López-Araujo, D. J., & Vallejo-Pérez, M. R. (2022). Automated Health Estimation of Capsicum annuum L. Crops by Means of Deep Learning and RGB Aerial Images. Remote Sensing, 14(19). https://doi.org/10.3390/rs14194943