This paper introduces the notion of the variadic neural network (VNN). The input to a variadic network is an arbitrary-length list of n-tuples of real numbers, where n is fixed. In contrast to a recurrent network, which processes a list sequentially and is typically influenced more strongly by more recent elements, a variadic network processes the list simultaneously and is influenced equally by all list elements. Formally, the network can be seen as instantiating a function on a multiset together with a member of that multiset. I describe a simple implementation of a variadic network architecture, the multi-layer variadic perceptron (MLVP), and present experimental results showing that such a network can learn various variadic functions by back-propagation. © Springer-Verlag Berlin Heidelberg 2007.
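To illustrate the symmetry property the abstract describes, the following is a minimal sketch of a layer that processes every element of a multiset simultaneously and symmetrically. The details of the MLVP are not given in the abstract, so the design here (combining each element with a sum-pooled summary of the whole multiset) and all names (`variadic_layer`, `W_elem`, `W_pool`) are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

# Hypothetical sketch (not the paper's MLVP): each n-tuple in the input
# multiset is combined with an order-independent pooled summary of the
# whole multiset, so every element influences the output equally,
# regardless of list length or position.

rng = np.random.default_rng(0)

def variadic_layer(X, W_elem, W_pool, b):
    """X: (m, n) array -- a multiset of m n-tuples."""
    pooled = X.sum(axis=0)                # order-independent summary
    h = X @ W_elem + pooled @ W_pool + b  # pooled term broadcast over rows
    return np.tanh(h)

n, hidden = 3, 4
W_elem = rng.standard_normal((n, hidden))
W_pool = rng.standard_normal((n, hidden))
b = np.zeros(hidden)

X = rng.standard_normal((5, n))           # a list of five 3-tuples
out = variadic_layer(X, W_elem, W_pool, b)

# Permuting the input multiset permutes the per-element outputs
# identically: the layer treats all elements symmetrically.
perm = rng.permutation(5)
assert np.allclose(variadic_layer(X[perm], W_elem, W_pool, b), out[perm])
```

Because the pooled summary is a sum, it is invariant under any reordering of the list, which is one simple way to realize a function on a multiset together with one of its members; the actual MLVP may differ.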
McGregor, S. (2007). Neural network processing for multiset data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4668 LNCS, pp. 460–470). Springer Verlag. https://doi.org/10.1007/978-3-540-74690-4_47