InfoScipedia
A Free Service of IGI Global Publishing House
Below is a list of definitions for the selected term, drawn from multiple scholarly research resources.

# What is Principal Component Analysis (PCA)?

A qualitative econometric method whereby the relatively more important ones are selected from among many determinants of bank performance.
Published in Chapter:
Do Nonperforming Assets Alone Determine Bank Performance?
Rituparna Das (Faculty of Policy Science, National Law University, India)
DOI: 10.4018/978-1-4666-6551-4.ch024
Abstract
The post-crisis period in India witnessed an economic slowdown consequent upon economy-wide loan defaults in the infrastructure, real estate, and construction sectors. The asset quality problem of Indian commercial banks became so acute that many weak banks had to be merged with strong banks, in the interest of depositors, to arrest any contagion effect. The old-generation private sector banks in India have neither government patronage nor the continuing support of their founder communities. This chapter analyzes the key financial ratios of these banks and tries to determine whether nonperforming assets are the sole determinants of their performance.
More Results
PCA performs orthogonal linear transformation, which transforms the data to a new coordinate system to reduce multidimensional data sets to lower dimensions for analysis.
It is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original variables. This transformation is defined in such a way that the first principal component has the largest possible variance (that is, accounts for as much of the variability in the data as possible), and each succeeding component in turn has the highest variance possible under the constraint that it is orthogonal to the preceding components.
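The procedure described above can be sketched in a few lines of NumPy: centre the data, eigendecompose the sample covariance matrix, and sort the resulting orthogonal directions by variance. This is a minimal illustration, not from the source; the function and variable names are our own.

```python
import numpy as np

def pca(X):
    """Return the principal components (as columns) and their variances,
    sorted so the first component captures the largest variance."""
    Xc = X - X.mean(axis=0)                  # centre each variable
    cov = np.cov(Xc, rowvar=False)           # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigendecomposition (symmetric)
    order = np.argsort(eigvals)[::-1]        # largest variance first
    return eigvecs[:, order], eigvals[order]

# Toy data with one dominant direction of variance (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.array([[3.0, 0, 0], [1, 1.0, 0], [0, 0, 0.1]])
components, variances = pca(X)
```

Because `eigh` returns an orthonormal basis, the components are mutually orthogonal, matching the constraint in the definition that each component is orthogonal to the preceding ones.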
Mathematical method which transforms a number of variables (possibly correlated) into a number of uncorrelated variables called principal components.
An unsupervised method for identifying variables that preserve the maximum amount of variance in a dataset. PCA is useful for different applications such as dimensionality reduction, minimizing information loss, and increasing the explainability of high-dimensional datasets.
A method used in data analysis to reduce the size of the data and make the dataset more effective to work with.
Involves a mathematical procedure that transforms a number of possibly correlated variables into a number of uncorrelated variables called principal components, related to the original variables by an orthogonal transformation.
A method for achieving a dimensionality reduction. It represents a set of N-dimensional data by means of their projections onto a set of r optimally defined axes (principal components). As these axes form an orthogonal set, PCA yields a data linear transformation. Principal components represent sources of variance in the data. Thus the most significant principal components show those data features which vary the most.
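The projection step described in this definition can be sketched with NumPy's SVD, whose right singular vectors give the optimally defined axes; the code below is an illustrative sketch under that standard formulation, with names of our own choosing.

```python
import numpy as np

def project(X, r):
    """Represent N-dimensional data by its projections onto the r
    leading principal axes (an orthogonal set)."""
    Xc = X - X.mean(axis=0)                       # centre the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    axes = Vt[:r]                                  # r optimally defined axes
    return Xc @ axes.T, axes                       # projections and the axes

# Example: reduce 5-dimensional data to r = 2 coordinates per point
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
Z, axes = project(X, 2)
```

Because the axes form an orthogonal set, the mapping is the linear transformation the definition refers to, and each row of `Z` holds one data point's coordinates along the most significant sources of variance.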
Or, depending on the field of application, the Karhunen–Loève transformation (KLT) or Hotelling transformation, is a method from the family of data analysis, and more generally of multivariate statistics, which consists of transforming variables that are linked to one another (known as "correlated" in statistics) into new variables that are uncorrelated with each other. These new variables are called "principal components" or principal axes. It allows the statistician to summarize information by reducing the number of variables. This is an approach that is both geometric (the variables being represented in a new space, along directions of maximum inertia) and statistical (the search for independent axes best explaining the variability, i.e., the variance, of the data). When we want to compress a set of N random variables, the first n axes of the principal component analysis are a better choice from the point of view of inertia or variance. The mathematical tool is applied in fields other than statistics and is sometimes called orthogonal eigenvalue decomposition or proper orthogonal decomposition (POD).
An orthogonal linear transformation that transforms the data to a new coordinate system, preferably of lower dimension than the original.
A dimensionality reduction method that transforms a set of possibly correlated variables into a new set of uncorrelated variables called principal components, each of which is a linear combination of the original variables. The first principal component has the largest possible variance; the second principal component is orthogonal to the first one and has the second largest variance, etc.
A mathematical procedure that uses an orthogonal transformation (by computing the eigenvectors and eigenvalues of the variance/covariance matrix) to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.
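Since this definition names the eigenvalues of the variance/covariance matrix explicitly, it is worth noting that those eigenvalues are the variances of the principal components, so normalising them gives each component's share of the total variance. The helper below is an illustrative sketch of that computation (the function name is our own, not a standard API).

```python
import numpy as np

def explained_variance_ratio(X):
    """Fraction of total variance carried by each principal component,
    computed from the eigenvalues of the covariance matrix."""
    cov = np.cov(X - X.mean(axis=0), rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)[::-1]   # sort descending
    return eigvals / eigvals.sum()

rng = np.random.default_rng(2)
ratios = explained_variance_ratio(rng.normal(size=(50, 4)))
```

In practice this ratio is what analysts inspect (e.g., in a scree plot) to decide how many components to retain.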
It refers to a statistical technique that transforms a set of possibly correlated variables into a set of linearly uncorrelated variables called principal components.
An orthogonal linear transform used to transform data into a new coordinate system which maximises the variance captured by the first few base vectors (the principal components).
It is an algorithm to simplify the complexity of high-dimensional data while retaining trends and patterns, by transforming the data into fewer dimensions that act as summaries of features.
PCA is a popular tool for multivariate data analysis, feature extraction, and data compression. Given a set of multivariate measurements, the purpose of PCA is to find a set of variables with less redundancy. The redundancy is measured by correlations between data elements.
A data reduction technique used to reduce a data set down to its uncorrelated, orthogonal components.
The technique of computing the principal components and using them to perform a change of basis on the data.
A linear transformation for separating a multivariate signal into uncorrelated components in such a way that the greatest variance by any projection of the data lies on the first component, the second greatest variance on the second component, and so on.
A dimensionality reduction technique based on linear transformation for representing high-dimensionality, redundant, noisy data in a lower dimension while retaining the most variability.
A linear orthogonal transformation that transforms the data to a new coordinate system such that the new directions point to the maximal variance of multivariate data. The objectives of PCA are 1) to identify meaningful underlying variables and 2) to possibly reduce the dimensionality of the data set.
A statistical procedure that uses orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.