Continuous Time Recurrent Neural Networks for Grammatical Induction

  • Chen J
  • Wermter S

Abstract

In this paper we explore continuous time recurrent neural networks for grammatical induction. A higher-level generating/processing scheme is used to tackle the grammar induction problem. Experiments are performed on several types of grammars, including the family of regular languages known as the Tomita languages and a context-free language. The system and the experiments demonstrate that continuous time recurrent neural networks can learn certain grammatical induction tasks.
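The abstract does not spell out the network dynamics, so the following is a minimal sketch under standard assumptions rather than the authors' implementation: an Euler-discretized continuous time recurrent network of the usual form tau_i * dy_i/dt = -y_i + sum_j w_ij * sigma(y_j + b_j) + I_i, driven by one-hot symbol inputs and read out as an accept/reject score, paired with a membership test for Tomita grammar #4 (binary strings with no run of three consecutive 0s) to illustrate the kind of induction target involved. The names CTRNN, tomita4, and one_hot are hypothetical, and training is omitted.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class CTRNN:
    """Minimal Euler-discretized continuous time recurrent network.
    A sketch under standard CTRNN assumptions, not the authors' model."""

    def __init__(self, n_inputs, n_hidden, dt=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.dt = dt
        self.tau = np.ones(n_hidden)                          # neuron time constants (fixed here)
        self.W = rng.normal(0.0, 0.5, (n_hidden, n_hidden))   # recurrent weights
        self.W_in = rng.normal(0.0, 0.5, (n_hidden, n_inputs))  # input weights
        self.bias = np.zeros(n_hidden)
        self.w_out = rng.normal(0.0, 0.5, n_hidden)           # readout for accept/reject

    def run(self, inputs):
        """Integrate the state over an input sequence and return a scalar
        acceptance score for the whole string."""
        y = np.zeros(len(self.tau))
        for x in inputs:                                      # x is a one-hot symbol vector
            drive = self.W @ sigmoid(y + self.bias) + self.W_in @ x
            y = y + (self.dt / self.tau) * (-y + drive)       # Euler step of the CTRNN ODE
        return sigmoid(self.w_out @ y)                        # >0.5 read as "accept"

def tomita4(string):
    """Membership test for Tomita grammar #4: binary strings that contain
    no occurrence of three consecutive 0s."""
    return "000" not in string

def one_hot(string):
    """Encode a binary string as a sequence of one-hot input vectors."""
    return [np.eye(2)[int(c)] for c in string]

if __name__ == "__main__":
    net = CTRNN(n_inputs=2, n_hidden=8)
    for s in ["1010010", "1000101"]:
        print(s, "in Tomita #4:", tomita4(s),
              "network score:", round(float(net.run(one_hot(s))), 3))
```

In a grammatical induction setting, the weights and time constants would be adapted, for example by gradient descent through the unrolled dynamics, so that the readout separates strings inside the language from strings outside it; the untrained sketch above only shows the forward pass.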

Citation (APA)

Chen, J., & Wermter, S. (1998). Continuous Time Recurrent Neural Networks for Grammatical Induction (pp. 381–386). https://doi.org/10.1007/978-1-4471-1599-1_56
