FedPerC: Federated Learning for Language Generation with Personal and Context Preference Embeddings

Citations: 2
Mendeley readers: 15

Abstract

Federated learning is a training paradigm that learns from multiple distributed users without aggregating data on a centralized server, promising the ability to deploy machine learning to a diverse population of users without first collecting large, labeled datasets. Because federated learning averages gradient updates across a decentralized population, there is a growing need for personalization of federated learning systems (e.g., conversational agents must personalize to individual users and to the context of an interaction). In this work, we propose a new direction for personalization research within federated learning, leveraging both personal embeddings and shared context embeddings. We also present an approach to predict these "preference" embeddings, enabling personalization without backpropagation. Compared to state-of-the-art personalization baselines, our approach achieves a 50% improvement in test-time perplexity while using 0.001% of the memory required by baseline approaches, and with greater sample- and compute-efficiency.
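
To make the core idea concrete, the sketch below shows one plausible way a language model could be conditioned on a per-user personal embedding combined with a shared context embedding, in the spirit of the abstract. This is a minimal illustration, not the paper's implementation: the class and parameter names (PreferenceConditionedLM, d_model, n_contexts), the GRU backbone, and the additive combination of embeddings are all assumptions for demonstration. The paper's actual architecture, and its method for predicting preference embeddings without backpropagation, are described in the full text.

import torch
import torch.nn as nn

class PreferenceConditionedLM(nn.Module):
    """Toy LM conditioned on a personal + context 'preference' vector.

    Hypothetical sketch: shared context embeddings would be learned
    federatedly (averaged across clients), while each user's personal
    embedding stays on-device and, per the abstract, can be predicted
    rather than updated by backpropagation.
    """

    def __init__(self, vocab_size=1000, d_model=128, n_contexts=8):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        # One shared embedding row per discrete interaction context.
        self.context_emb = nn.Embedding(n_contexts, d_model)
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens, personal_emb, context_id):
        # personal_emb: (batch, d_model) per-user vector kept on-device.
        x = self.token_emb(tokens)                        # (batch, seq, d_model)
        pref = personal_emb + self.context_emb(context_id)  # (batch, d_model)
        x = x + pref.unsqueeze(1)                         # broadcast over sequence
        h, _ = self.rnn(x)
        return self.head(h)                               # next-token logits

# Usage with toy inputs:
model = PreferenceConditionedLM()
tokens = torch.randint(0, 1000, (2, 16))   # batch of 2 sequences, length 16
personal = torch.randn(2, 128)             # stand-in personal embeddings
logits = model(tokens, personal, torch.tensor([0, 3]))
print(logits.shape)                        # torch.Size([2, 16, 1000])

A design like this keeps the model weights shareable across clients while isolating user-specific state in a single small vector, which is consistent with the abstract's memory-efficiency claim; the exact mechanism is, again, the paper's to specify.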

Cite (APA)

Silva, A., Tambwekar, P., & Gombolay, M. (2023). FedPerC: Federated Learning for Language Generation with Personal and Context Preference Embeddings. In EACL 2023 - 17th Conference of the European Chapter of the Association for Computational Linguistics, Findings of EACL 2023 (pp. 839–852). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-eacl.64
