Below is a definition of the term you selected, drawn from scholarly research resources.

What is Self-Consistent Approximation?

Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods, and Techniques
Approximation of a dataset by a set of vectors such that every point y in the set is the conditional mean of all points from the dataset that project onto y.
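This self-consistency condition can be checked directly: every vector must coincide with the mean of the data points that project onto it. Below is a minimal sketch in Python/NumPy; the function name, the tolerance, and the nearest-vector projection rule are illustrative assumptions, not part of the source definition.

import numpy as np

def is_self_consistent(data, vectors, tol=1e-8):
    """Return True if every vector equals the conditional mean of the data
    points that project onto it (nearest vector in squared Euclidean distance).
    Function name and tolerance are illustrative assumptions."""
    data = np.asarray(data, dtype=float)
    vectors = np.asarray(vectors, dtype=float)
    # Assign each data point to its nearest vector (its "projection").
    dists = ((data[:, None, :] - vectors[None, :, :]) ** 2).sum(axis=2)
    labels = dists.argmin(axis=1)
    for j, y in enumerate(vectors):
        members = data[labels == j]
        if len(members) == 0:
            continue  # no points project onto y; nothing to compare
        if not np.allclose(members.mean(axis=0), y, atol=tol):
            return False
    return True

# Example: the two cluster means of a toy dataset are self-consistent.
data = np.array([[0.0, 0.0], [0.0, 2.0], [10.0, 0.0], [10.0, 2.0]])
print(is_self_consistent(data, np.array([[0.0, 1.0], [10.0, 1.0]])))  # True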
Published in Chapter:
Principal Graphs and Manifolds
Alexander N. Gorban (University of Leicester, UK) and Andrei Y. Zinovyev (Institut Curie, France)
DOI: 10.4018/978-1-60566-766-9.ch002
Abstract
In many physical, statistical, biological and other investigations it is desirable to approximate a system of points by objects of lower dimension and/or complexity. For this purpose, Karl Pearson invented principal component analysis in 1901 and found ‘lines and planes of closest fit to systems of points’. The famous k-means algorithm solves the approximation problem too, but by finite sets instead of lines and planes. This chapter gives a brief practical introduction to the methods of construction of general principal objects (i.e., objects embedded in the ‘middle’ of the multidimensional data set). As a basis, the unifying framework of mean squared distance approximation of finite datasets is selected. Principal graphs and manifolds are constructed as generalisations of principal components and k-means principal points. For this purpose, the family of expectation/maximisation algorithms with nearest generalisations is presented. Construction of principal graphs with controlled complexity is based on the graph grammar approach.
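As a concrete illustration of the expectation/maximisation scheme mentioned in the abstract, the sketch below constructs k principal points (the k-means case) by alternating projection of each point onto its nearest vector with replacement of each vector by the conditional mean of its points, until the configuration is self-consistent. The function name, the random initialisation, and the stopping rule are assumptions made for illustration, not the authors' specific algorithm.

import numpy as np

def principal_points(data, k, n_iter=100, seed=0):
    """EM-style construction of k principal points (k-means case).
    E-step: project each point onto its nearest vector.
    M-step: replace each vector by the conditional mean of its points.
    Iterate until the set of vectors is self-consistent (a fixed point).
    Parameter names, initialisation, and stopping rule are illustrative."""
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(n_iter):
        # E-step: assign each data point to its nearest principal point.
        dists = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # M-step: move each principal point to the conditional mean of its points.
        new_centers = np.array([
            data[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break  # fixed point reached: the configuration is self-consistent
        centers = new_centers
    return centers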