
What is Bayesian Ying Yang Learning?

Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods, and Techniques
Named in tribute to the famous ancient Chinese Ying-Yang philosophy, it refers to a general statistical learning framework that formulates learning tasks in a two-pathway intelligent system via two complementary Bayesian representations of the joint distribution over the external observation X and its inner representation Y, with all unknowns in the system determined by the principle that the two Bayesian representations reach their best harmony. This system is called a Bayesian Ying Yang system, mathematically described by the Yang machine p(Y|X)p(X) and the Ying machine q(X|Y)q(Y). The best harmony is mathematically implemented by maximizing a harmony functional H(p||q), which is called Bayesian Ying Yang harmony learning. It follows from the decomposition of H(p||q) that this best Ying Yang harmony principle includes not only a best Ying Yang matching, obtained by minimizing the Ying-Yang divergence, but also minimizing the entropy of the Yang machine. In other words, a best Ying Yang harmony seeks a Yang machine as an inverse of the Ying machine such that it best matches the Ying machine while keeping the least complexity. Moreover, this best Ying Yang matching provides a general perspective that unifies a number of typical statistical learning approaches.
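As a sketch only, written in the notation common in Lei Xu's published BYY papers (X for the external observation, Y for its inner representation; the exact symbols are assumptions, since the chapter's own formulas are not reproduced on this page), the two Bayesian representations and the harmony functional can be stated as:

```latex
% Yang machine (forward/recognition pathway) and Ying machine (backward/generative pathway)
p(X, Y) = p(Y \mid X)\, p(X), \qquad q(X, Y) = q(X \mid Y)\, q(Y)

% Harmony functional maximized in BYY harmony learning
H(p \,\|\, q) = \int p(Y \mid X)\, p(X)\, \ln\!\bigl[\, q(X \mid Y)\, q(Y) \,\bigr]\, dX\, dY

% Decomposition: maximizing H minimizes both the Ying-Yang divergence
% and the entropy E_p of the Yang machine
H(p \,\|\, q) = -\,\mathrm{KL}\bigl(p(X,Y)\,\|\,q(X,Y)\bigr) - E_p,
\qquad E_p = -\int p(X,Y)\, \ln p(X,Y)\, dX\, dY
```

The decomposition makes the claim in the definition explicit: maximizing H(p||q) simultaneously drives the Kullback-Leibler divergence between the two machines toward zero (best matching) and keeps the Yang machine's entropy low (least complexity).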
Published in Chapter:
Learning Algorithms for RBF Functions and Subspace Based Functions
Lei Xu (Chinese University of Hong Kong and Beijing University, PR China)
DOI: 10.4018/978-1-60566-766-9.ch003
Abstract
Among the extensive studies on radial basis function (RBF) networks, one stream consists of those on normalized RBF (NRBF) networks and their extensions. Within a probability-theoretic framework, NRBF networks relate to decades of nonparametric studies in the statistics literature, and have since been advanced in the machine learning literature, not only to mixture-of-experts models and alternatives but also to subspace based functions (SBF) and temporal extensions. These studies are linked to theoretical results adopted from nonparametric statistics, and further to a general statistical learning framework called Bayesian Ying Yang harmony learning, yielding a unified perspective that covers maximum likelihood (ML) learning with the EM algorithm, RPCL learning, and BYY learning with automatic model selection, as well as their extensions for temporal modeling. This chapter outlines these advances, with a unified elaboration of their corresponding algorithms and a discussion of possible trends.
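For orientation only, and not as the chapter's own code, a normalized RBF network forms a softmax-like partition of unity over Gaussian basis functions and mixes per-basis outputs; the function name, array shapes, and isotropic Gaussian bases in the sketch below are illustrative assumptions:

```python
import numpy as np

def nrbf_forward(X, centers, widths, weights):
    """Minimal normalized RBF (NRBF) forward pass.

    X:       (n, d) inputs
    centers: (k, d) basis centers
    widths:  (k,)   isotropic Gaussian widths
    weights: (k, m) output weights per basis
    """
    # Squared distance from every input to every center -> (n, k)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / (2.0 * widths ** 2))
    # Normalizing across bases makes the activations sum to one per input,
    # so each output is a responsibility-weighted mixture of local models.
    post = phi / phi.sum(axis=1, keepdims=True)
    return post @ weights  # (n, m)


# Tiny usage example with random parameters
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 2))
    centers = rng.normal(size=(3, 2))
    widths = np.ones(3)
    weights = rng.normal(size=(3, 1))
    print(nrbf_forward(X, centers, widths, weights))
```

The normalization step is what links NRBF networks to the gating function of the mixture-of-experts models mentioned in the abstract: each basis acts as a local expert whose contribution is weighted by its responsibility for the input.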