This paper presents a neural framework of untied, independent modules, used here to integrate off-the-shelf knowledge sources such as language models, lexica, POS information, and dependency relations. Each knowledge source is implemented as an independent component that can interact and share information with the others. We report proof-of-concept experiments on several standard sentiment analysis tasks and show that the knowledge sources interoperate effectively without interference. As a second use case, we show that the proposed framework is also suitable for optimizing BERT-like language models even without the help of external knowledge sources: we cast each Transformer layer as a separate module and demonstrate performance improvements from explicitly integrating the different information encoded at different Transformer layers.
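The abstract does not specify how the per-layer modules are combined; the following is a minimal sketch of the second use case, assuming a softmax-weighted mixing of per-layer [CLS] representations (ELMo-style scalar mixing) feeding a task classifier. All names (`LayerModuleMixer`, tensor shapes) are hypothetical illustrations, not the authors' released code.

```python
# Sketch: treat each Transformer layer's hidden state as an independent
# module and let a task head learn to weight and combine them explicitly.
import torch
import torch.nn as nn


class LayerModuleMixer(nn.Module):
    """Combine per-layer [CLS] vectors with learned softmax weights.

    `num_layers` and `hidden_size` would come from the underlying
    BERT-like encoder (e.g. 12 layers, 768 dims for bert-base).
    """

    def __init__(self, num_layers: int, hidden_size: int, num_classes: int):
        super().__init__()
        # One learnable scalar per layer-module; softmax makes them compete.
        self.layer_logits = nn.Parameter(torch.zeros(num_layers))
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, layer_states: torch.Tensor) -> torch.Tensor:
        # layer_states: (num_layers, batch, hidden_size), e.g. the [CLS]
        # vector from each encoder layer (Hugging Face transformers exposes
        # these via output_hidden_states=True).
        weights = torch.softmax(self.layer_logits, dim=0)  # (num_layers,)
        mixed = torch.einsum("l,lbh->bh", weights, layer_states)
        return self.classifier(mixed)


# Toy usage with random tensors standing in for encoder outputs.
mixer = LayerModuleMixer(num_layers=12, hidden_size=768, num_classes=3)
fake_states = torch.randn(12, 4, 768)  # 12 layers, batch of 4
logits = mixer(fake_states)            # (4, 3)
```

Under this reading, the softmax over layer scalars is what makes the modules "compete": layers whose representations help the task receive more weight, while uninformative layers are suppressed.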
Citation
Bagherzadeh, P., & Bergler, S. (2021). Competing Independent Modules for Knowledge Integration and Optimization. In Findings of the Association for Computational Linguistics: EMNLP 2021 (pp. 4416–4425). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.findings-emnlp.376