Answer validation through textual entailment

Abstract

This paper presents ongoing work on an Answer Validation (AV) system based on Textual Entailment and Question Answering. Several answer validation modules have been developed, based on Textual Entailment, Named Entity Recognition, question-answer type analysis, chunk boundary matching and syntactic similarity, and these modules are integrated using a voting technique. The question and the answer are combined into the Hypothesis (H) and the Supporting Text serves as the Text (T); the entailment relation is then identified as either "VALIDATED" or "REJECTED". The main features of the lexical Textual Entailment module are WordNet-based unigram match, bigram match and skip-gram match. The main features of the syntactic similarity module are subject-subject, subject-verb, object-verb and cross subject-verb comparisons. On the AVE 2008 English annotated test set, the integrated AV system achieves precision 0.66, recall 0.65 and f-score 0.65, outperforming the best-performing system at AVE 2008 in terms of f-score. © 2011 Springer-Verlag.
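
The abstract does not include code, but a rough illustration may help. The sketch below shows one plausible way to compute the lexical Textual Entailment features named above (WordNet-based unigram match, bigram match, skip-bigram match) over a (Text, Hypothesis) pair and to combine them by simple voting. This is not the authors' implementation: the function names, the 0.5 score threshold and the two-out-of-three voting rule are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code) of lexical entailment
# features over a Text/Hypothesis pair, combined by a simple majority vote.
# Requires NLTK with the WordNet data installed: nltk.download('wordnet')
from nltk.corpus import wordnet as wn
from nltk.util import ngrams


def synonyms(word):
    """All WordNet lemma names for a word, plus the word itself (lowercased)."""
    syns = {word.lower()}
    for synset in wn.synsets(word):
        syns.update(lemma.name().lower() for lemma in synset.lemmas())
    return syns


def unigram_match(text_tokens, hyp_tokens):
    """Fraction of hypothesis unigrams found in the text, allowing WordNet synonyms."""
    text_vocab = {t.lower() for t in text_tokens}
    hits = sum(1 for h in hyp_tokens if synonyms(h) & text_vocab)
    return hits / len(hyp_tokens) if hyp_tokens else 0.0


def bigram_match(text_tokens, hyp_tokens):
    """Fraction of hypothesis bigrams that also occur in the text."""
    text_bi = set(ngrams([t.lower() for t in text_tokens], 2))
    hyp_bi = list(ngrams([t.lower() for t in hyp_tokens], 2))
    hits = sum(1 for b in hyp_bi if b in text_bi)
    return hits / len(hyp_bi) if hyp_bi else 0.0


def skip_bigram_match(text_tokens, hyp_tokens, max_skip=2):
    """Fraction of hypothesis skip-bigrams (at most max_skip skipped tokens) found in the text."""
    def skip_bigrams(tokens):
        toks = [t.lower() for t in tokens]
        return {(toks[i], toks[j])
                for i in range(len(toks))
                for j in range(i + 1, min(i + 2 + max_skip, len(toks)))}
    text_sb = skip_bigrams(text_tokens)
    hyp_sb = skip_bigrams(hyp_tokens)
    hits = sum(1 for b in hyp_sb if b in text_sb)
    return hits / len(hyp_sb) if hyp_sb else 0.0


def validate(text, hypothesis, threshold=0.5):
    """Label the pair VALIDATED if at least two of the three lexical scores pass the threshold."""
    t, h = text.split(), hypothesis.split()  # naive whitespace tokenization for the sketch
    scores = [unigram_match(t, h), bigram_match(t, h), skip_bigram_match(t, h)]
    votes = sum(score >= threshold for score in scores)
    return "VALIDATED" if votes >= 2 else "REJECTED"
```

In the system described above, the Hypothesis would be built by combining the question with the candidate answer and the Text would be the supporting passage, so a call would look like validate(supporting_text, question_plus_answer); the full system would add further votes from the NER, question-answer type, chunk boundary and syntactic similarity modules.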

Cite

APA

Pakray, P. (2011). Answer validation through textual entailment. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6716 LNCS, pp. 324–329). https://doi.org/10.1007/978-3-642-22327-3_48
