Because all parameters of a conventional fuzzy neural network (FNN) must be adjusted iteratively, learning can be very slow and may suffer from local minima. To overcome these problems, we propose a novel FNN with fast training and accurate generalization. We first state a universal approximation theorem for an FNN with random membership function parameters (FNN-RM). Since all membership function parameters are chosen arbitrarily, the proposed FNN-RM algorithm needs to adjust only the output weights of the FNN. Experimental results on function approximation and classification problems show that the new algorithm not only runs thousands of times faster than traditional learning algorithms, but also generalizes better than other FNNs. © Springer-Verlag 2004.
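The core idea can be illustrated with a minimal sketch: fix randomly drawn Gaussian membership function parameters (centers and widths are assumptions here, as is the rule count), compute normalized rule firing strengths, and solve for the output weights in a single least-squares step via the Moore-Penrose pseudoinverse. This is a generic reconstruction of the approach described in the abstract, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: approximate f(x) = sin(x) on [0, 2*pi].
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

n_rules = 20  # number of fuzzy rules (hypothetical choice)

# Randomly chosen Gaussian membership function parameters, fixed after
# initialization -- the key idea of FNN-RM: these are never trained.
centers = rng.uniform(X.min(), X.max(), size=(n_rules, X.shape[1]))
widths = rng.uniform(0.5, 2.0, size=n_rules)

def firing_strengths(X):
    """Normalized rule firing strengths from Gaussian memberships."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    H = np.exp(-d2 / (2.0 * widths**2))  # (n_samples, n_rules)
    return H / H.sum(axis=1, keepdims=True)

H = firing_strengths(X)

# Only the output weights are learned, in one shot via the pseudoinverse
# (linear least squares) -- no iterative tuning, hence the fast training.
beta = np.linalg.pinv(H) @ y

y_hat = H @ beta
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

With enough randomly placed rules, the normalized firing strengths form a rich basis, so the single linear solve already fits smooth targets closely; this mirrors why the paper reports large speed-ups over gradient-based FNN training.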
Wang, L., Liu, B., & Wan, C. (2004). A novel fuzzy neural network with fast training and accurate generalization. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3173, 270–275. https://doi.org/10.1007/978-3-540-28647-9_46