A Free Service of IGI Global Publishing House
Below is a list of definitions for the selected term, drawn from multiple scholarly research resources.

What is a Bayesian Ying-Yang System?

Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods, and Techniques
A set of samples X and its inner representation R in an intelligent system are jointly considered via their joint distribution, factorized in two complementary Bayesian decompositions. In a complement to the famous ancient Ying-Yang philosophy, one decomposition coincides with the Yang concept, with the visible domain p(X) as the Yang space and the forward pathway p(R|X) as the Yang pathway; thus p(X, R) = p(R|X)p(X) is called the Yang machine. The other decomposition, q(X, R) = q(X|R)q(R), is called the Ying machine, with the invisible domain q(R) as the Ying space and the backward pathway q(X|R) as the Ying pathway. Such a Ying-Yang pair is called a Bayesian Ying-Yang (BYY) system. The input to the Ying-Yang system comes directly from a training sample set X_N = {x_t, t = 1, …, N}, while the inner representation R = {Y, Θ} describes both a long-term memory Θ, the collection of all unknown parameters in the system, and a short-term memory Y, with each y_t corresponding to one element x_t. To build up an entire system, appropriate structures must be designed for each component. Specifically, the structure of q(R) is designed subject to the nature of the learning task and a principle of least representation redundancy; the structure of q(X|R) is designed to suit the mapping R → X under a principle of divide and conquer, so that a complicated mapping is realized by a number of simple ones; and the structure of p(R|X) is designed for the inverse map X → R under a principle of uncertainty conservation between Ying and Yang, i.e., the Yang machine preserves a room or varying range appropriate to accommodate the uncertainty or information contained in the Ying machine.
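As an illustrative sketch (not code from the chapter), the Ying-Yang pair can be instantiated for a one-dimensional finite Gaussian mixture: the Ying machine is q(x|y = j) = N(x; mu_j, sigma_j^2) with q(y = j) = alpha_j, and the Yang pathway p(y = j|x) is taken here, as one common assumption, to be the Bayes posterior. The harmony measure used in BYY harmony learning then averages, over samples, the posterior-weighted log of the Ying machine:

```python
import numpy as np

def byy_harmony_gaussian_mixture(X, alphas, mus, sigmas):
    """Illustrative sketch of the BYY harmony measure for a 1-D Gaussian
    mixture. Ying machine: q(x|y=j) q(y=j); Yang pathway: Bayes posterior
    p(y=j|x) (an assumption for this sketch, not the chapter's code)."""
    X = np.asarray(X, dtype=float)
    alphas = np.asarray(alphas, dtype=float)
    mus = np.asarray(mus, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    # Ying machine: log q(x_t | y=j) + log q(y=j), shape (N, k)
    log_q = (-0.5 * np.log(2.0 * np.pi * sigmas**2)
             - 0.5 * (X[:, None] - mus[None, :])**2 / sigmas**2
             + np.log(alphas))
    # Yang pathway: posterior p(y=j | x_t) via a stable softmax
    log_norm = np.logaddexp.reduce(log_q, axis=1, keepdims=True)
    p_post = np.exp(log_q - log_norm)
    # Harmony: mean over samples of sum_j p(j|x_t) ln[q(x_t|j) q(j)]
    return float(np.mean(np.sum(p_post * log_q, axis=1)))
```

Maximizing this measure over the parameters Θ = {alpha_j, mu_j, sigma_j} tends to push the posterior toward a winner-take-all form, which is the mechanism behind the automatic model selection behavior discussed for BYY learning.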
Published in Chapter:
Learning Algorithms for RBF Functions and Subspace Based Functions
Lei Xu (Chinese University of Hong Kong and Beijing University, PR China)
DOI: 10.4018/978-1-60566-766-9.ch003
Abstract
Among extensive studies on radial basis functions (RBF), one stream consists of those on normalized RBF (NRBF) and its extensions. Within a probability-theoretic framework, NRBF networks relate to decades of nonparametric studies in the statistics literature, and have since advanced in machine learning studies not only to mixture-of-experts and alternatives but also to subspace based functions (SBF) and temporal extensions. These studies are linked to theoretical results adopted from nonparametric statistics, and further to a general statistical learning framework called Bayesian Ying Yang harmony learning, with a unified perspective that summarizes maximum likelihood (ML) learning with the EM algorithm, RPCL learning, and BYY learning with automatic model selection, as well as their extensions for temporal modeling. This chapter outlines these advances, with a unified elaboration of their corresponding algorithms and a discussion of possible trends.
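For context on the NRBF stream the abstract refers to, the standard (textbook) normalized RBF output divides a weighted sum of Gaussian basis activations by their plain sum, f(x) = Σ_j w_j φ_j(x) / Σ_j φ_j(x); the sketch below follows that definition and is not code taken from the chapter:

```python
import numpy as np

def nrbf(x, centers, widths, weights):
    """Standard normalized RBF (NRBF) network output: a softmax-like
    normalized combination of Gaussian basis functions. Textbook
    definition, used here only as an illustration."""
    x = np.atleast_2d(np.asarray(x, dtype=float))          # (N, d)
    diff2 = np.sum((x[:, None, :] - centers[None, :, :])**2, axis=2)  # (N, k)
    phi = np.exp(-0.5 * diff2 / widths**2)                 # basis activations
    return phi @ weights / np.sum(phi, axis=1)             # normalized output
```

With a single basis function the normalization makes the output constant at its weight, which illustrates why NRBF responses stay bounded between the component weights rather than decaying to zero far from the centers.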