Formal grammars have been successfully simulated through Artificial Neural Networks. This fact has established a new approach to the problem of Grammatical Inference. First, [Pollack,91], [Giles,92] and [Watrous,92] trained network architectures on positive samples, or positive and negative samples, generated by regular grammars to accept or reject new strings. On the other hand, [Servan,88] and [Smith,89] used nets to which strings were fed character by character, so that the possible successors of each character were predicted. Later, [Servan,91] suggested that these networks could also predict the generation probabilities of each character in the strings generated by Stochastic Regular Grammars. Our present work shows empirical evidence supporting this suggestion.
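The prediction task described above can be sketched with a small Elman-style simple recurrent network trained on strings from a toy stochastic regular grammar. The grammar, the network size, and the one-step training rule below are illustrative assumptions, not the authors' exact architecture or experimental setup: after each character, the network's softmax output is trained to match the next character, so that it approximates the grammar's per-character generation probabilities.

```python
import numpy as np

# Toy stochastic regular grammar (assumed for illustration):
#   S -> a S  with probability 0.7
#   S -> b    with probability 0.3  (b ends the string)
# '#' is used as both the start-of-string and end-of-string marker.
rng = np.random.default_rng(0)
ALPHABET = ['a', 'b', '#']
IDX = {c: i for i, c in enumerate(ALPHABET)}

def sample_string():
    s = []
    while rng.random() < 0.7:
        s.append('a')
    s.append('b')
    return s + ['#']

def onehot(c):
    v = np.zeros(3)
    v[IDX[c]] = 1.0
    return v

# Elman-style simple recurrent network: h_t = tanh(Wxh x_t + Whh h_{t-1} + bh)
H = 8
Wxh = rng.normal(0, 0.5, (H, 3))
Whh = rng.normal(0, 0.5, (H, H))
bh = np.zeros(H)
Why = rng.normal(0, 0.5, (3, H))
by = np.zeros(3)
lr = 0.1

for _ in range(3000):
    s = sample_string()
    inputs = ['#'] + s[:-1]          # previous character (with start marker)
    h = np.zeros(H)
    for cin, cout in zip(inputs, s):
        x = onehot(cin)
        hprev = h
        h = np.tanh(Wxh @ x + Whh @ hprev + bh)
        p = np.exp(Why @ h + by)
        p /= p.sum()                  # softmax over next characters
        dy = p - onehot(cout)         # softmax cross-entropy gradient
        dh = (Why.T @ dy) * (1 - h * h)  # one-step backprop only (no full BPTT)
        Why -= lr * np.outer(dy, h); by -= lr * dy
        Wxh -= lr * np.outer(dh, x); bh -= lr * dh
        Whh -= lr * np.outer(dh, hprev)

# Predicted next-character distribution after seeing "a";
# for this grammar it should approach {a: 0.7, b: 0.3, '#': ~0}.
h = np.zeros(H)
for c in ['#', 'a']:
    h = np.tanh(Wxh @ onehot(c) + Whh @ h + bh)
p = np.exp(Why @ h + by)
p /= p.sum()
print({c: round(float(p[IDX[c]]), 3) for c in ALPHABET})
```

Because the toy grammar is memoryless, the conditional distribution of the next character is constant after any run of `a`s, which makes this an easy target even with one-step gradient updates; richer stochastic regular grammars would require backpropagation through time.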
Castaño, M. A., Casacuberta, F., & Vidal, E. (1993). Simulation of stochastic regular grammars through simple recurrent networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 686, pp. 210–215). Springer Verlag. https://doi.org/10.1007/3-540-56798-4_149