Faster Self-Organizing Fuzzy Neural Network Training and Improved Autonomy with Time-Delayed Synapses for Locally Recurrent Learning

Damien Coyle, Girijesh Prasad, Martin McGinnity
ISBN13: 9781609600181|ISBN10: 1609600185|ISBN13 Softcover: 9781609600198|EISBN13: 9781609600204
DOI: 10.4018/978-1-60960-018-1.ch008
Cite Chapter

MLA

Coyle, Damien, et al. "Faster Self-Organizing Fuzzy Neural Network Training and Improved Autonomy with Time-Delayed Synapses for Locally Recurrent Learning." System and Circuit Design for Biologically-Inspired Intelligent Learning, edited by Turgay Temel, IGI Global, 2011, pp. 156-183. https://doi.org/10.4018/978-1-60960-018-1.ch008

APA

Coyle, D., Prasad, G., & McGinnity, M. (2011). Faster Self-Organizing Fuzzy Neural Network Training and Improved Autonomy with Time-Delayed Synapses for Locally Recurrent Learning. In T. Temel (Ed.), System and Circuit Design for Biologically-Inspired Intelligent Learning (pp. 156-183). IGI Global. https://doi.org/10.4018/978-1-60960-018-1.ch008

Chicago

Coyle, Damien, Girijesh Prasad, and Martin McGinnity. "Faster Self-Organizing Fuzzy Neural Network Training and Improved Autonomy with Time-Delayed Synapses for Locally Recurrent Learning." In System and Circuit Design for Biologically-Inspired Intelligent Learning, edited by Turgay Temel, 156-183. Hershey, PA: IGI Global, 2011. https://doi.org/10.4018/978-1-60960-018-1.ch008


Abstract

This chapter describes a number of modifications to the learning algorithm and architecture of the self-organizing fuzzy neural network (SOFNN) to improve its computational efficiency and learning ability. To improve the SOFNN's computational efficiency, a new method of checking the network structure after it has been modified is proposed. Instead of testing the entire structure every time it is modified, a record is kept of each neuron's firing strength for all data previously clustered by the network. This record is updated as training progresses and is used to reduce the computational load of checking network structure changes while ensuring that no performance degradation occurs, resulting in significantly reduced training times. It is shown that the modified SOFNN compares favorably with other evolving fuzzy systems in terms of accuracy and structural complexity.

In addition, a new architecture of the SOFNN is proposed in which recurrent feedback connections are added to the neurons in layer three of the structure. Recurrent connections allow the network to learn temporal information from the data: in contrast to purely feedforward architectures, which exhibit static input-output behavior, recurrent models are able to store information from the past (e.g., past measurements of a time series) and are therefore better suited to analyzing dynamic systems. Each recurrent feedback connection includes a weight which must be learned. In this work, a learning approach is proposed in which each recurrent feedback weight is updated online (not iteratively), in proportion to the aggregate firing activity of the corresponding fuzzy neuron. It is shown that this modification can significantly improve the SOFNN's prediction capacity under certain constraints.
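The abstract states only that each layer-three feedback weight is updated online, in proportion to that neuron's aggregate firing activity. A minimal sketch of such an update rule is given below; the function name, the learning-rate parameter `eta`, and the normalization of firing strengths are illustrative assumptions, not the chapter's actual algorithm:

```python
import numpy as np

def update_recurrent_weights(weights, firing_strengths, eta=0.01):
    """Hypothetical online update for recurrent feedback weights.

    Each weight moves in proportion to its neuron's share of the total
    firing activity for the current sample (a single online step, not an
    iterative batch optimization). This only illustrates the idea of an
    activity-proportional update described in the abstract.
    """
    total = firing_strengths.sum()
    if total == 0.0:
        return weights  # no neuron fired; leave weights unchanged
    aggregate = firing_strengths / total  # normalized firing activity
    return weights + eta * aggregate

# Example: three fuzzy neurons, the third firing twice as strongly.
w = update_recurrent_weights(np.zeros(3), np.array([1.0, 1.0, 2.0]))
```

Because the update uses only the current sample's firing strengths, it adds negligible cost per training step, which is consistent with the abstract's emphasis on online (non-iterative) learning.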
