LIBKGE is an open-source PyTorch-based library for training, hyperparameter optimization, and evaluation of knowledge graph embedding models for link prediction. The key goals of LIBKGE are to enable reproducible research, to provide a framework for comprehensive experimental studies, and to facilitate analyzing the contributions of individual components of training methods, model architectures, and evaluation methods. LIBKGE is highly configurable, and every experiment can be fully reproduced with a single configuration file. Individual components are decoupled to the extent possible so that they can be mixed and matched with each other. Implementations in LIBKGE aim to be as efficient as possible without leaving the scope of Python/NumPy/PyTorch. A comprehensive logging mechanism and tooling facilitate in-depth analysis. LIBKGE provides implementations of common knowledge graph embedding models and training methods, and new ones can be easily added. A comparative study (Ruffinelli et al., 2020) showed that LIBKGE reaches competitive to state-of-the-art performance for many models with a modest amount of automatic hyperparameter tuning.
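The single-configuration-file workflow described above can be illustrated with a minimal sketch. The keys shown (e.g. `job.type`, `train.type`, `lookup_embedder.dim`) follow the configuration format documented in the LibKGE repository, but the specific values here are illustrative examples, not recommended hyperparameters:

```yaml
# Illustrative LibKGE-style experiment configuration.
# Values are examples only, not tuned hyperparameters.
job.type: train
dataset.name: fb15k-237

model: complex                  # which embedding model to train
lookup_embedder.dim: 128        # embedding dimensionality

train:
  type: KvsAll                  # training strategy
  optimizer: Adagrad
  max_epochs: 200

valid:
  every: 5                      # run validation every 5 epochs
  metric: mean_reciprocal_rank_filtered
```

Such a configuration would be launched with `kge start <config>.yaml`; LibKGE records the full effective configuration (including all defaults) alongside the run, which is what makes each experiment reproducible from a single file.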
CITATION STYLE
Broscheit, S., Ruffinelli, D., Kochsiek, A., Betz, P., & Gemulla, R. (2020). LibKGE - A knowledge graph embedding library for reproducible research. In EMNLP 2020 - Conference on Empirical Methods in Natural Language Processing, Proceedings of Systems Demonstrations (pp. 165–174). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.emnlp-demos.22