WebChild 2.0: Fine-grained commonsense knowledge distillation


Abstract

Despite important progress in the area of intelligent systems, most such systems still lack commonsense knowledge that appears crucial for enabling smarter, more human-like decisions. In this paper, we present a system based on a series of algorithms to distill fine-grained disambiguated commonsense knowledge from massive amounts of text. Our WebChild 2.0 knowledge base is one of the largest commonsense knowledge bases available, describing over 2 million disambiguated concepts and activities, connected by over 18 million assertions.
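The knowledge base described above connects sense-disambiguated concepts via typed assertions. As a minimal illustrative sketch (not WebChild's actual schema — the relation names, sense tags, and triples here are hypothetical), such assertions can be modeled as subject–relation–object triples over WordNet-style sense identifiers:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Assertion:
    # WordNet-style sense tags (e.g. "apple.n.01") keep concepts
    # disambiguated; the specific relations below are illustrative only.
    subject: str
    relation: str
    obj: str

kb = [
    Assertion("apple.n.01", "hasProperty", "red.a.01"),
    Assertion("knife.n.01", "usedFor", "cut.v.01"),
]

def query(kb, relation):
    """Return all assertions in the KB with the given relation."""
    return [a for a in kb if a.relation == relation]

for a in query(kb, "usedFor"):
    print(a.subject, "->", a.obj)
```

Because each argument is a word sense rather than a surface word, assertions about e.g. "bank" the institution and "bank" the riverside remain distinct.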

Citation (APA)

Tandon, N., De Melo, G., & Weikum, G. (2017). WebChild 2.0: Fine-grained commonsense knowledge distillation. In ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of System Demonstrations (pp. 115–120). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/P17-4020
