An Evolutionary Framework for Nonlinear Time-Series Prediction with Adaptive Gated Mixtures of Experts

André L.V. Coelho, Clodoaldo A.M. Lima, Fernando J. Von Zuben
DOI: 10.4018/978-1-59904-249-7.ch007

Abstract

A probabilistic learning technique, known as the gated mixture of experts (ME), is made more adaptive by employing a customized genetic algorithm based on the concepts of hierarchical mixed encoding and hybrid training. The objective of this effort is to promote the automatic design (i.e., structural configuration and parameter calibration) of whole gated ME instances that are better able to cope with the intricacies of difficult machine learning problems whose statistical properties are time-variant. In this chapter, we outline the main steps behind this novel hybrid intelligent system, focusing on its application to the nontrivial task of nonlinear time-series forecasting. Experimental results are reported for three benchmark time-series problems and confirm our expectation that the new integrated approach is capable of outperforming, in terms of both accuracy and generalization, conventional approaches such as single neural networks and non-adaptive, handcrafted gated MEs.
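
As an illustration of the gated ME architecture the abstract refers to, the following is a minimal sketch of a softmax-gated mixture of linear experts trained by gradient descent for one-step-ahead prediction from lagged inputs. It is not the authors' GA-designed model; the class name, the fixed number of experts, the learning rate, and the toy series are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    class GatedMixtureOfExperts:
        """Minimal softmax-gated mixture of linear experts (illustrative sketch,
        not the chapter's GA-configured model)."""

        def __init__(self, n_inputs, n_experts, lr=0.05):
            # Per-expert and gating-network weights, each with a bias column.
            self.W_experts = rng.normal(scale=0.1, size=(n_experts, n_inputs + 1))
            self.W_gate = rng.normal(scale=0.1, size=(n_experts, n_inputs + 1))
            self.lr = lr

        def _augment(self, x):
            return np.append(x, 1.0)  # append bias term

        def forward(self, x):
            xb = self._augment(x)
            expert_out = self.W_experts @ xb          # each expert's prediction
            logits = self.W_gate @ xb
            gate = np.exp(logits - logits.max())
            gate /= gate.sum()                        # softmax gating weights
            return gate @ expert_out, expert_out, gate

        def update(self, x, target):
            """One gradient step on the squared error of the gated mixture output."""
            xb = self._augment(x)
            y, expert_out, gate = self.forward(x)
            err = y - target
            # Expert update: each expert is pulled toward the target in proportion to its gate.
            self.W_experts -= self.lr * err * gate[:, None] * xb[None, :]
            # Gate update: softmax gradient of the mixture output w.r.t. the gating logits.
            grad_logits = gate * (expert_out - y)
            self.W_gate -= self.lr * err * grad_logits[:, None] * xb[None, :]
            return err ** 2

    # Usage: one-step-ahead forecasting of a toy nonlinear series from 4 lagged values.
    series = np.sin(0.3 * np.arange(600)) + 0.1 * rng.standard_normal(600)
    lags = 4
    model = GatedMixtureOfExperts(n_inputs=lags, n_experts=3)
    for epoch in range(20):
        for t in range(lags, 500):
            model.update(series[t - lags:t], series[t])
    pred, _, _ = model.forward(series[500 - lags:500])
    print("prediction:", pred, "actual:", series[500])

In the chapter's approach, the structural choices fixed by hand above (notably the number of experts and the expert and gating parameters) would instead be determined by the customized genetic algorithm through hierarchical mixed encoding and hybrid training.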
