Efficient Gradient-Based Neural Architecture Search for End-to-End ASR

6 citations · 13 Mendeley readers
Abstract

Neural architecture search (NAS) has been successfully applied to tasks like image classification and language modeling to find efficient, high-performance network architectures. In the ASR field, and especially in end-to-end ASR, the related research is still in its infancy. In this work, we focus on applying NAS to the most popular manually designed model, the Conformer, and propose an efficient ASR model search method that benefits from the natural advantage of differentiable architecture search (Darts) in reducing computational overhead. We fuse the Darts mutator and Conformer blocks to form a complete search space, within which a modified architecture called the Darts-Conformer cell is found automatically. The entire search process on the AISHELL-1 dataset costs only 0.7 GPU days. Replacing the Conformer encoder with a stack of the searched architecture, we obtain an end-to-end ASR model (named Darts-Conformer) that outperforms the Conformer baseline by 4.7% relative on the open-source AISHELL-1 dataset. In addition, we verify that the architecture searched on a small dataset transfers to a larger 2k-hour dataset.
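The abstract only summarizes the approach. As a rough illustration of the DARTS-style continuous relaxation it builds on (not the paper's actual search space), the sketch below shows a mixed operation whose architecture weights are learned by gradient descent; it assumes PyTorch, and the candidate operations are hypothetical placeholders standing in for the Conformer sub-blocks the paper actually searches over.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder candidate operations; the paper's search space is built from
# Conformer sub-blocks, which are not reproduced here.
def make_candidate_ops(dim):
    return nn.ModuleList([
        nn.Identity(),                                   # skip connection
        nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),   # feed-forward stand-in
        nn.Conv1d(dim, dim, kernel_size=3, padding=1),   # convolution stand-in
    ])

class MixedOp(nn.Module):
    """DARTS-style mixed operation: a softmax over architecture parameters
    (alpha) blends the outputs of all candidate operations, making the
    architecture choice differentiable."""
    def __init__(self, dim):
        super().__init__()
        self.ops = make_candidate_ops(dim)
        # One architecture parameter per candidate op.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))

    def forward(self, x):
        # x: (batch, time, dim)
        weights = F.softmax(self.alpha, dim=-1)
        outs = []
        for op in self.ops:
            if isinstance(op, nn.Conv1d):
                # Conv1d expects (batch, dim, time).
                outs.append(op(x.transpose(1, 2)).transpose(1, 2))
            else:
                outs.append(op(x))
        return sum(w * o for w, o in zip(weights, outs))

if __name__ == "__main__":
    x = torch.randn(2, 50, 64)       # (batch, frames, features)
    cell = MixedOp(dim=64)
    print(cell(x).shape)             # torch.Size([2, 50, 64])
    # After search, the op with the largest alpha is kept and the rest
    # discarded (the discretization step that yields the final cell).
    print(cell.alpha.softmax(-1))

In a full search, the alpha parameters are optimized on a validation split while the operation weights are optimized on the training split; the final Darts-Conformer cell in the paper is obtained by discretizing such learned weights, then stacking the resulting cell as the encoder.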

Citation (APA)

Shi, X., Zhou, P., Chen, W., & Xie, L. (2021). Efficient Gradient-Based Neural Architecture Search for End-to-End ASR. In ICMI 2021 Companion - Companion Publication of the 2021 International Conference on Multimodal Interaction (pp. 91–96). Association for Computing Machinery, Inc. https://doi.org/10.1145/3461615.3491109
