Recent studies on grammatical inference have demonstrated the benefits of "distributional learning" for learning context-free and context-sensitive languages. Distributional learning models and exploits the relation between strings and contexts in the language of the learning target. There are two main approaches. One, which we call primal, constructs nonterminals whose languages are characterized by sets of strings. The other, which we call dual, uses contexts to characterize the language of a nonterminal of the conjecture grammar. This paper demonstrates and discusses the duality of these approaches, presenting several powerful learning algorithms along the way. © 2011 Springer-Verlag.
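The string-context relation at the heart of distributional learning can be made concrete with a small sketch (not taken from the paper): a substring s and a context (l, r) are related exactly when l·s·r belongs to the target language. The function name `contexts`, the toy target {aⁿbⁿ}, and the finite sample below are all illustrative assumptions, not the paper's algorithms.

```python
# Illustrative sketch of the string-context relation that distributional
# learning exploits. All names and the toy language are assumptions for
# exposition; the paper's actual learners are more involved.

def contexts(s, sample, language):
    """Return the contexts (l, r), cut from strings in the sample,
    under which substring s stays in the language: l + s + r in language."""
    result = set()
    for w in sample:
        for i in range(len(w) + 1):
            for j in range(i, len(w) + 1):
                l, r = w[:i], w[j:]
                if l + s + r in language:
                    result.add((l, r))
    return result

# Toy target: the context-free language {a^n b^n : n >= 0},
# approximated here by a finite fragment.
language = {"a" * n + "b" * n for n in range(6)}
sample = ["ab", "aabb", "aaabbb"]

# Primal view: substrings with the same context set behave alike and can
# share a nonterminal -- here "ab" and "aabb" are interchangeable.
assert contexts("ab", sample, language) == contexts("aabb", sample, language)

# Dual view: a context set such as {("", ""), ("a", "b")} characterizes
# the language of a nonterminal deriving the "balanced" strings.
print(sorted(contexts("ab", sample, language)))
# → [('', ''), ('a', 'b'), ('aa', 'bb'), ('aaa', 'bbb')]
```

The assertion illustrates the primal direction (strings grouped by shared contexts), while reading off the context set itself illustrates the dual direction (contexts characterizing a nonterminal's language); the Galois connection between the two sides is what the syntactic concept lattice organizes.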
CITATION STYLE
Yoshinaka, R. (2011). Towards dual approaches for learning context-free grammars based on syntactic concept lattices. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6795 LNCS, pp. 429–440). https://doi.org/10.1007/978-3-642-22321-1_37