A comparison of metric learning loss functions for end-to-end speaker verification

Abstract

Despite the growing popularity of metric learning approaches, very little work has attempted to perform a fair comparison of these techniques for speaker verification. We try to fill this gap and compare several metric learning loss functions in a systematic manner on the VoxCeleb dataset. The first family of loss functions is derived from the cross entropy loss (usually used for supervised classification) and includes the congenerous cosine loss, the additive angular margin loss, and the center loss. The second family of loss functions focuses on the similarity between training samples and includes the contrastive loss and the triplet loss. We show that the additive angular margin loss function outperforms all other loss functions in the study, while learning more robust representations. Based on a combination of SincNet trainable features and the x-vector architecture, the network used in this paper brings us a step closer to a truly end-to-end speaker verification system, when combined with the additive angular margin loss, while still being competitive with the x-vector baseline. In the spirit of reproducible research, we also release open source Python code for reproducing our results, and share pretrained PyTorch models on torch.hub that can be used either directly or after fine-tuning.
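The additive angular margin loss that the abstract singles out can be summarized as follows: class weights and embeddings are L2-normalized so logits become cosine similarities, a margin m is added to the angle between an embedding and its target-class weight, and the result is scaled by s before a standard cross entropy. The sketch below is a minimal NumPy illustration of that idea, not the paper's implementation; the function name, the s and m values, and the plain-softmax formulation are illustrative assumptions.

```python
import numpy as np

def aam_softmax_loss(embeddings, weights, labels, s=30.0, m=0.2):
    """Illustrative additive angular margin (AAM) softmax loss.

    embeddings: (batch, dim) array, weights: (dim, classes) array,
    labels: (batch,) int array of target class indices.
    s and m are example values, not the paper's hyperparameters.
    """
    # L2-normalize embeddings and class weights so dot products are cosines
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = emb @ w  # (batch, classes) cosine similarities
    theta = np.arccos(np.clip(cos, -1.0 + 1e-7, 1.0 - 1e-7))

    # Add the angular margin m only to the target-class logit
    rows = np.arange(len(labels))
    target = cos.copy()
    target[rows, labels] = np.cos(theta[rows, labels] + m)
    logits = s * target

    # Numerically stable cross entropy over the margin-adjusted logits
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[rows, labels].mean()
```

Because the margin shrinks the target-class logit, the loss for a given batch is never smaller with m > 0 than with m = 0, which is what forces tighter, more separable speaker clusters during training.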

APA

Coria, J. M., Bredin, H., Ghannay, S., & Rosset, S. (2020). A comparison of metric learning loss functions for end-to-end speaker verification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12379 LNAI, pp. 137–148). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-59430-5_11
