Sentence embeddings enable us to capture the semantic similarity of short texts. Most sentence embedding models are trained for general semantic textual similarity tasks. Therefore, to use sentence embeddings in a particular domain, the model must be adapted to that domain to achieve good results. Usually, this is done by fine-tuning the entire sentence embedding model for the domain of interest. While this approach yields state-of-the-art results, all of the model's weights are updated during fine-tuning, making this method resource-intensive. Therefore, instead of fine-tuning an entire sentence embedding model for each target domain individually, we propose to train lightweight adapters. These domain-specific adapters do not require fine-tuning all underlying sentence embedding model parameters. Instead, we train only a small number of additional parameters while keeping the weights of the underlying sentence embedding model fixed. With domain-specific adapters, the same base model can always be used, and only the adapters need to be exchanged to adapt the sentence embeddings to a specific domain. We show that using adapters for parameter-efficient domain adaptation of sentence embeddings yields competitive performance within 1% of a domain-adapted, entirely fine-tuned sentence embedding model while training only approximately 3.6% of the parameters.
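The abstract describes freezing the underlying sentence embedding model and training only a small number of additional adapter parameters. The sketch below illustrates that idea with a residual bottleneck adapter in PyTorch. It is a minimal, hedged example, not the paper's exact setup: the base checkpoint ("sentence-transformers/all-MiniLM-L6-v2"), the bottleneck width, the mean pooling, and the placement of a single adapter after the encoder (rather than per-layer adapters) are all illustrative assumptions, so the trainable-parameter fraction it prints will differ from the roughly 3.6% reported in the abstract.

# Minimal sketch: frozen base sentence encoder + trainable bottleneck adapter.
# Checkpoint name, bottleneck size, and pooling are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class BottleneckAdapter(nn.Module):
    """Residual bottleneck: down-project, non-linearity, up-project."""

    def __init__(self, hidden_size: int, bottleneck_size: int = 48):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen model's representation intact.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


class AdapterSentenceEncoder(nn.Module):
    """Frozen base encoder with a trainable adapter on its token embeddings."""

    def __init__(self, base_name: str = "sentence-transformers/all-MiniLM-L6-v2"):
        super().__init__()
        self.base = AutoModel.from_pretrained(base_name)
        # Keep the weights of the underlying sentence embedding model fixed.
        for param in self.base.parameters():
            param.requires_grad = False
        self.adapter = BottleneckAdapter(self.base.config.hidden_size)

    def forward(self, input_ids, attention_mask):
        token_embeddings = self.base(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        token_embeddings = self.adapter(token_embeddings)
        # Mean pooling over non-padding tokens yields the sentence embedding.
        mask = attention_mask.unsqueeze(-1).float()
        return (token_embeddings * mask).sum(1) / mask.sum(1).clamp(min=1e-9)


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
    model = AdapterSentenceEncoder()
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"trainable parameters: {trainable:,} / {total:,} ({trainable / total:.2%})")
    batch = tokenizer(["an example sentence"], return_tensors="pt", padding=True)
    embeddings = model(batch["input_ids"], batch["attention_mask"])
    print(embeddings.shape)  # (batch_size, hidden_size)

Because only the adapter's down- and up-projection weights receive gradients, adapting to a new domain amounts to training and swapping in a small set of adapter weights while the shared base model remains unchanged.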
CITATION STYLE
Schopf, T., Schneider, D. N., & Matthes, F. (2023). Efficient Domain Adaptation of Sentence Embeddings Using Adapters. In International Conference Recent Advances in Natural Language Processing, RANLP (pp. 1046–1053). Incoma Ltd. https://doi.org/10.26615/978-954-452-092-2_112