Temporal finite-state machines: A novel framework for the general class of dynamic networks

Abstract

This is a follow-up paper that builds on our recently published work discussing the implementation of brain-inspired information processing systems by means of finite-state machines. Using a previously presented implementation of the liquid-state machine framework with a novel synaptic model, this study shows that such a network represents and processes input information internally through transitions among a set of discrete, finite neural temporal states. The introduced framework is termed the temporal finite-state machine (tFSM). The proposed work includes a new definition of a "neural state" within a dynamic network and discusses the computational capacity of the tFSM. This paper presents novel perspectives and opens new avenues for importing the behaviour of spiking neural networks into the classical computational model of finite-state machines. © 2012 Springer-Verlag.

Citation (APA)

El-Laithy, K., & Bogdan, M. (2012). Temporal finite-state machines: A novel framework for the general class of dynamic networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7664 LNCS, pp. 425–434). https://doi.org/10.1007/978-3-642-34481-7_52
