The learning dynamics of an on-line Hebbian ICA algorithm close to its initial conditions have been studied. For large input dimension the dynamics can be described by a diffusion equation. A surprisingly large number of examples and an unusually low initial learning rate are required to avoid a stochastic trapping state near the initial conditions. Escape from this state results in symmetry breaking, and the algorithm thereby avoids becoming trapped in the plateau-like fixed points that have been observed in other learning algorithms. © Springer-Verlag Berlin Heidelberg 2002.
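For concreteness, the following is a minimal Python sketch of a generic single-unit on-line Hebbian ICA update of the kind the abstract refers to. The cubic nonlinearity, unit-norm weight constraint, eta/N learning-rate scaling, and the single Laplacian source among Gaussian components are illustrative assumptions, not necessarily the exact rule or setup analysed in the paper.

import numpy as np

# Sketch of a single-unit on-line Hebbian ICA update (assumed form:
# cubic nonlinearity phi(y) = y**3, unit-norm weight, learning rate eta/N).
rng = np.random.default_rng(0)

N = 500            # input dimension (large-N regime)
eta = 0.05         # learning rate
T = 200_000        # number of on-line examples presented

# Without loss of generality take the mixing matrix to be the identity:
# component 0 is the single non-Gaussian (Laplacian) source, the rest are
# Gaussian, all with unit variance.
w = rng.standard_normal(N)
w /= np.linalg.norm(w)                  # random initialisation on the unit sphere

for t in range(T):
    x = rng.standard_normal(N)
    x[0] = rng.laplace() / np.sqrt(2)   # unit-variance Laplacian source
    y = w @ x                           # projection onto the current weight
    w += (eta / N) * y**3 * x           # Hebbian update with cubic nonlinearity
    w /= np.linalg.norm(w)              # renormalise to stay on the sphere

# Overlap with the non-Gaussian direction: from a random start it is
# O(1/sqrt(N)) and, depending on eta and T, may diffuse around zero
# (the trapping regime described in the abstract) rather than escape towards 1.
print(abs(w[0]))

Whether the final overlap has escaped towards 1 or is still diffusing near zero depends on the choice of eta and the number of examples T, which is the regime the paper characterises.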
Basalyga, G., & Rattray, M. (2002). Dynamics of ICA for high-dimensional data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2415 LNCS, pp. 1112–1118). Springer Verlag. https://doi.org/10.1007/3-540-46084-5_180