FastDARTSDet: Fast Differentiable Architecture Joint Search on Backbone and FPN for Object Detection

Citations: 6
Readers (Mendeley): 8

Abstract

Neural architecture search (NAS) is a popular branch of automatic machine learning (AutoML) that aims to find efficient network structures. Many prior works have explored a wide range of search algorithms for classification tasks and have achieved better performance than manually designed architectures. However, few works have explored NAS for object detection, owing to the difficulty of training convolutional neural networks from scratch. In this paper, we propose a framework, named FastDARTSDet, to search directly on a large-scale object detection dataset (MS-COCO). Specifically, we apply the differentiable architecture search method (DARTS) to jointly search backbone and feature pyramid network (FPN) architectures for the object detection task. Extensive experimental results on MS-COCO show the efficiency and efficacy of our method. In particular, our method achieves 40.0% mean average precision (mAP) on the test set, outperforming many recent NAS methods.
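As context for the DARTS method the abstract builds on: DARTS makes architecture search differentiable by replacing each discrete operation choice with a softmax-weighted mixture of candidate operations, so the architecture parameters can be optimized by gradient descent. The following is a toy sketch of that continuous relaxation; the operation set and names here are illustrative stand-ins, not the paper's actual backbone/FPN search space.

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Toy candidate operations (stand-ins for conv/pool/skip ops in a real cell)
OPS = {
    "skip":   lambda x: x,
    "double": lambda x: 2.0 * x,
    "zero":   lambda x: np.zeros_like(x),
}

def mixed_op(x, alphas):
    """DARTS continuous relaxation: a softmax-weighted sum over candidate ops.

    `alphas` holds one architecture parameter per candidate operation; these
    are learned jointly with the network weights. After search, the edge is
    discretized by keeping only the operation with the largest weight.
    """
    w = softmax(alphas)
    return sum(wi * op(x) for wi, op in zip(w, OPS.values()))

x = np.array([1.0, 2.0])
alphas = np.zeros(len(OPS))          # uniform weights: average of the ops
y = mixed_op(x, alphas)              # (x + 2x + 0) / 3 == x

best = list(OPS)[int(np.argmax(alphas))]  # discretization step after search
```

In FastDARTSDet this relaxation is applied jointly to the backbone and FPN cells rather than to a classification cell, but the softmax-mixture mechanism above is the same core idea.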

Citation (APA)

Wang, C., Wang, X., Wang, Y., Hu, S., Chen, H., Gu, X., … He, T. (2022). FastDARTSDet: Fast Differentiable Architecture Joint Search on Backbone and FPN for Object Detection. Applied Sciences (Switzerland), 12(20). https://doi.org/10.3390/app122010530
