Overview of the answer validation exercise 2006


Abstract

The first Answer Validation Exercise (AVE) was launched at the Cross Language Evaluation Forum (CLEF) 2006. The task aims at developing systems able to decide whether the answer given by a Question Answering (QA) system is correct or not. The exercise is described here together with the evaluation methodology and the systems' results. The starting point for AVE 2006 was the reformulation of Answer Validation as a Recognizing Textual Entailment (RTE) problem, under the assumption that the hypothesis can be automatically generated by instantiating hypothesis patterns with the QA systems' answers. Eleven groups participated with 38 runs in 7 different languages. Systems that reported the use of logic obtained the best results in their respective subtasks. © Springer-Verlag Berlin Heidelberg 2007.
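The pattern-instantiation step described in the abstract can be sketched as follows. This is a hypothetical, minimal illustration (the pattern, example question, and overlap baseline are assumptions, not the AVE organizers' implementation): a hypothesis pattern with an answer slot is filled with a QA system's answer, and the validation task then asks whether a supporting text entails the resulting hypothesis.

```python
# Hypothetical sketch of recasting Answer Validation as Textual Entailment:
# fill a hypothesis pattern's answer slot, then check entailment against
# the supporting text. The <ANSWER> placeholder and the example question
# are illustrative assumptions, not the actual AVE 2006 patterns.

def instantiate_hypothesis(pattern: str, answer: str) -> str:
    """Instantiate a hypothesis pattern with a QA system's answer."""
    return pattern.replace("<ANSWER>", answer)

def entails(text: str, hypothesis: str) -> bool:
    """Trivial lexical-overlap baseline (a real system would use RTE,
    e.g. logic-based inference, as the best AVE runs reported)."""
    text_tokens = set(text.lower().split())
    hyp_tokens = set(hypothesis.lower().rstrip(".").split())
    return hyp_tokens <= text_tokens

# Example: question "Who wrote Don Quixote?", answer from a QA system.
pattern = "<ANSWER> wrote Don Quixote."
hypothesis = instantiate_hypothesis(pattern, "Miguel de Cervantes")
supporting_text = "Miguel de Cervantes wrote Don Quixote in 1605."
valid = entails(supporting_text, hypothesis)
```

A system's answer is judged correct when the supporting snippet entails the generated hypothesis; the overlap check above only stands in for a genuine entailment decision.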

Citation (APA)

Peñas, A., Rodrigo, Á., Sama, V., & Verdejo, F. (2007). Overview of the answer validation exercise 2006. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4730 LNCS, pp. 257–264). Springer Verlag. https://doi.org/10.1007/978-3-540-74999-8_32
