There is a need for methods and tools that facilitate the systematic exploration of novel artificial neural network models. While significant progress has been made in developing concise artificial neural networks that implement basic models of neural activation, connectivity, and plasticity, limited success has been achieved in creating networks that integrate multiple diverse models into highly complex neural systems. From a problem-solving perspective, effective methods are needed for combining different neural-network-based learning systems to solve complex problems: different models may suit different subproblems, and robust, systematic methods for combining them may yield more powerful machine learning systems. From a neuroscience modelling perspective, effective methods are needed for integrating different models into more robust models of the brain. Both needs may be met through the development of meta-model languages that represent diverse neural models and the interactions between different neural elements. A meta-model language based on attribute grammars, the Network Generating Attribute Grammar Encoding, is presented, and its capability for facilitating automated search over complex combinations of neural components from different models is discussed. © 2011 Springer-Verlag Berlin Heidelberg.
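To make the general idea concrete, the following is a minimal, hypothetical sketch of a grammar-driven network generator: productions expand a start symbol into a list of layer specifications while an attribute (the running layer width) is propagated so adjacent layers stay dimensionally consistent. The rules, symbol names, and layer widths here are illustrative assumptions only, not the actual NGAGE formalism described in the paper.

```python
import random

# Hypothetical productions (NOT the paper's NGAGE grammar):
#   <network>      ::= <hidden-block> <output-layer>
#   <hidden-block> ::= <layer> | <layer> <hidden-block>
# Each layer is a tuple (kind, input_size, output_size).

def expand_hidden_block(in_size, rng):
    """Expand <hidden-block>, threading the width attribute through recursion."""
    width = rng.choice([8, 16, 32])          # inherited attribute: input width
    layers = [("dense", in_size, width)]
    if rng.random() < 0.5:                   # recursive alternative of the rule
        more, width = expand_hidden_block(width, rng)
        layers += more
    return layers, width                     # synthesized attribute: final width

def expand_network(input_size, output_size, rng):
    """Expand the start symbol <network> into a complete layer list."""
    hidden, width = expand_hidden_block(input_size, rng)
    # The synthesized width of the hidden block feeds the output layer.
    return hidden + [("dense", width, output_size)]

rng = random.Random(0)
net = expand_network(input_size=4, output_size=2, rng=rng)
# Attribute constraint: each layer's input matches the previous layer's output.
assert all(nxt[1] == prev[2] for prev, nxt in zip(net, net[1:]))
print(net)
```

Because every derivation respects the width attribute, any string the grammar generates is a well-formed network; an automated search procedure could then explore the space of derivations rather than raw architectures.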
Hussain, T. S. (2011). A meta-model perspective and attribute grammar approach to facilitating the development of novel neural network models. Studies in Computational Intelligence, 358, 245–272. https://doi.org/10.1007/978-3-642-20980-2_8