A novel fuzzy neural network with fast training and accurate generalization

Abstract

Because all parameters in a conventional fuzzy neural network (FNN) must be adjusted iteratively, learning can be very slow and may suffer from local minima. To overcome these problems, we propose a novel FNN that offers fast training and accurate generalization. We first state the universal approximation theorem for an FNN with random membership function parameters (FNN-RM). Since all the membership function parameters are chosen arbitrarily, the proposed FNN-RM algorithm needs to adjust only the output weights of the FNN. Experimental results on function approximation and classification problems show that the new algorithm not only provides a speed-up of thousands of times over traditional learning algorithms, but also produces better generalization performance than other FNNs. © Springer-Verlag 2004.
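The core idea described in the abstract (random membership function parameters, with only the output weights fitted) can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' exact formulation: it uses Gaussian membership functions with randomly drawn centers and widths, forms rule firing strengths as products over input dimensions, and solves for output weights in one shot with a least-squares pseudoinverse. All function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fuzzy_layer(X, n_rules, rng):
    """Rule firing strengths with randomly chosen centers/widths.

    Firing strength of rule j for sample i is the product over input
    dimensions k of exp(-((x_ik - c_jk) / w_jk)^2). This Gaussian form
    is an illustrative assumption.
    """
    n_features = X.shape[1]
    centers = rng.uniform(X.min(0), X.max(0), size=(n_rules, n_features))
    widths = rng.uniform(0.1, 1.0, size=(n_rules, n_features))
    diff = (X[:, None, :] - centers[None, :, :]) / widths[None, :, :]
    H = np.exp(-(diff ** 2)).prod(axis=2)   # shape (n_samples, n_rules)
    return H, centers, widths

def fit_output_weights(H, y):
    """Fit only the output weights, via the Moore-Penrose pseudoinverse."""
    return np.linalg.pinv(H) @ y

# Toy usage: approximate a 1-D function without any iterative tuning
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])

H, centers, widths = random_fuzzy_layer(X, n_rules=30, rng=rng)
beta = fit_output_weights(H, y)
print("training RMSE:", np.sqrt(np.mean((H @ beta - y) ** 2)))
```

Because the membership parameters are never tuned, training reduces to a single linear least-squares solve, which is what makes this style of algorithm so much faster than gradient-based FNN training.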

Citation (APA)

Wang, L., Liu, B., & Wan, C. (2004). A novel fuzzy neural network with fast training and accurate generalization. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3173, 270–275. https://doi.org/10.1007/978-3-540-28647-9_46
