How do children learn a verb's argument structure when their input contains non-basic clauses that obscure verb transitivity? Here we present a new model that infers verb transitivity by learning to filter out non-basic clauses that were likely parsed in error. In simulations with child-directed speech, we show that this model accurately categorizes the majority of 50 frequent transitive, intransitive, and alternating verbs, and jointly learns appropriate parameters for filtering parsing errors. Our model is thus able to filter out problematic data for verb learning without knowing in advance which data need to be filtered.
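The joint-inference idea in the abstract can be illustrated with a minimal EM sketch. This is not the paper's actual model (which is Bayesian and fit with different machinery); it is a hypothetical simplification in which each verb belongs to one of three transitivity categories, each clause is a parsing error with some probability eps, and an erroneously parsed clause shows an apparent direct object at chance (0.5). The verb names, counts, and the chance-level error assumption are all illustrative.

```python
# Toy EM sketch: jointly infer verb transitivity categories and a global
# parsing-error rate eps. Hypothetical simplification, not the paper's model.
import math

# Probability of an apparent direct object in a correctly parsed clause,
# by transitivity category.
THETA = {"transitive": 1.0, "intransitive": 0.0, "alternating": 0.5}

def em(verb_counts, n_iters=50, eps=0.2):
    """verb_counts: {verb: (n_clauses_with_object, n_clauses_without)}.
    Returns (per-verb category posteriors, learned error rate eps)."""
    posteriors = {}
    for _ in range(n_iters):
        err_expected, total = 0.0, 0.0
        for verb, (n_do, n_no) in verb_counts.items():
            # E-step: posterior over categories (uniform prior over the three).
            logps = {}
            for c, th in THETA.items():
                p_do = (1 - eps) * th + eps * 0.5  # P(object seen | category c)
                p_no = 1 - p_do
                logps[c] = (n_do * math.log(p_do or 1e-12)
                            + n_no * math.log(p_no or 1e-12))
            m = max(logps.values())
            z = sum(math.exp(v - m) for v in logps.values())
            post = {c: math.exp(v - m) / z for c, v in logps.items()}
            posteriors[verb] = post
            # Expected number of erroneously parsed clauses for this verb.
            for c, w in post.items():
                p_do = (1 - eps) * THETA[c] + eps * 0.5
                p_no = 1 - p_do
                err_expected += w * (n_do * (eps * 0.5) / p_do
                                     + n_no * (eps * 0.5) / p_no)
            total += n_do + n_no
        eps = err_expected / total  # M-step: re-estimate the error rate
    return posteriors, eps
```

On synthetic counts such as `{"hit": (90, 10), "arrive": (8, 92), "eat": (48, 52)}`, the sketch classifies the three verbs as transitive, intransitive, and alternating while settling on an error rate near the fraction of mismatched clauses, mirroring the abstract's point that the filter parameters and the verb categories can be learned jointly.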
CITATION STYLE
Perkins, L., Feldman, N. H., & Lidz, J. (2017). Learning an input filter for argument structure acquisition. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics (CMCL 2017), EACL 2017 (pp. 11–19). Association for Computational Linguistics. https://doi.org/10.18653/v1/w17-0702