Abstract
We approach the recognition of textual entailment using logical semantic representations and a theorem prover. In this setup, lexical divergences that preserve semantic entailment between the source and target texts need to be explicitly stated. However, recognising subsentential semantic relations is not trivial. We address this problem by monitoring the proof of the theorem and detecting unprovable sub-goals that share predicate arguments with logical premises. If a linguistic relation exists, then an appropriate axiom is constructed on-demand and the theorem proving continues. Experiments show that this approach is effective and precise, producing a system that outperforms other logic-based systems and is competitive with state-of-the-art statistical methods.
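The on-demand injection loop described above can be illustrated with a minimal sketch. All names here are assumptions for illustration: predicates are toy `(predicate, argument)` pairs, the lexical knowledge base is a hand-written set of hypernym pairs standing in for a resource such as WordNet, and the "prover" is a tiny forward-chaining routine, not the authors' actual theorem prover.

```python
# Hedged sketch of on-demand lexical axiom injection for entailment proving.
# LEXICAL_RELATIONS, prove(), and the predicate encoding are illustrative
# assumptions, not the implementation described in the paper.

# Toy lexical knowledge base: (more specific word, more general word).
LEXICAL_RELATIONS = {("puppy", "dog")}


def prove(premises, goal, axioms):
    """Tiny forward-chaining 'prover': close the premises under the axioms
    (each axiom rewrites one predicate into another over the same argument)
    and check whether the goal predicate is derivable."""
    facts = set(premises)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in axioms:
            for pred, arg in list(facts):
                if pred == lhs and (rhs, arg) not in facts:
                    facts.add((rhs, arg))
                    changed = True
    return goal in facts, facts


def prove_with_injection(premises, goal):
    """Monitor the proof; when the goal is unprovable but shares an argument
    with a derived fact, look up a lexical relation and, if one exists,
    inject the corresponding axiom and resume proving."""
    axioms = []
    while True:
        ok, facts = prove(premises, goal, axioms)
        if ok:
            return True, axioms
        injected = False
        for f_pred, f_arg in facts:
            candidate = (f_pred, goal[0])
            # Unprovable sub-goal sharing an argument with a premise/fact,
            # licensed by a lexical relation: construct the axiom on demand.
            if f_arg == goal[1] and candidate in LEXICAL_RELATIONS \
                    and candidate not in axioms:
                axioms.append(candidate)
                injected = True
        if not injected:
            return False, axioms


# "A puppy barks" entails "A dog barks" once the puppy->dog axiom is injected.
print(prove_with_injection([("puppy", "x")], ("dog", "x")))
print(prove_with_injection([("puppy", "x")], ("car", "x")))
```

In this sketch the proof fails on the first attempt, the monitor notices that the goal predicate shares its argument with a premise, finds a supporting lexical relation, adds the axiom, and the second proof attempt succeeds; when no relation supports the gap (the `car` goal), the loop terminates with failure rather than injecting an unsound axiom.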
Martínez-Gómez, P., Mineshima, K., Miyao, Y., & Bekki, D. (2017). On-demand injection of lexical knowledge for recognising textual entailment. In 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference (Vol. 1, pp. 710–720). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/e17-1067