Neural Networks: A Comprehensive Foundation


Abstract

1. Introduction
2. Learning Process
3. Correlation Matrix Memory
4. Perceptron
5. Least-Mean-Square Algorithm
6. Multilayer Perceptrons
7. Radial-Basis Function Networks
8. Recurrent Networks Rooted in Statistical Physics
9. Self-Organizing Systems I: Hebbian Learning
10. Self-Organizing Systems II: Competitive Learning
11. Self-Organizing Systems III: Information-Theoretic Models
12. Modular Networks
13. Temporal Processing
14. Neurodynamics
15. VLSI Implementations of Neural Networks
Appendix A: Pseudoinverse Matrix Memory
Appendix B: A General Tool for Convergence Analysis of Stochastic Approximation Algorithms
Appendix C: Statistical Thermodynamics
Appendix D: Fokker-Planck Equation

Authors

  • Simon S. Haykin
