Search the World's Largest Database of Information Science & Technology Terms & Definitions
InfoScipedia
A Free Service of IGI Global Publishing House
Below is a list of definitions for the term you selected, drawn from multiple scholarly research resources.

What is Principal Component Analysis (PCA)?

Handbook of Research on Global Business Opportunities
A quantitative econometric method whereby the relatively more important determinants of bank performance are selected from among many.
Published in Chapter:
Do Nonperforming Assets Alone Determine Bank Performance?
Rituparna Das (Faculty of Policy Science, National Law University, India)
Copyright: © 2015 | Pages: 19
DOI: 10.4018/978-1-4666-6551-4.ch024
Abstract
The post-crisis period in India witnessed an economic slowdown following economy-wide loan defaults in the infrastructure, real estate, and construction sectors. The asset quality problem of Indian commercial banks became so acute that many weak banks had to be merged with strong banks, in the interest of depositors, to arrest any contagion effect. The old-generation private sector banks in India have neither government patronage nor the continuing support of their founder communities. This chapter analyzes the key financial ratios of these banks and examines whether nonperforming assets are the sole determinants of their performance.
More Results
Similarity Retrieval and Cluster Analysis Using R* Trees
PCA performs orthogonal linear transformation, which transforms the data to a new coordinate system to reduce multidimensional data sets to lower dimensions for analysis.
The Impact of Infrastructure on Growth in Developing Countries: Dynamic Panel Data Analysis
It is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original variables. This transformation is defined in such a way that the first principal component has the largest possible variance (that is, accounts for as much of the variability in the data as possible), and each succeeding component in turn has the highest variance possible under the constraint that it is orthogonal to the preceding components.
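As a concrete illustration of this variance-ordering property, here is a minimal NumPy sketch (invented data; the names X, Xc, and scores are illustrative, not from the chapter) that eigendecomposes the covariance matrix and orders the components by decreasing variance:

    # Illustrative PCA via eigendecomposition of the covariance matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # 200 observations, 5 correlated variables
    Xc = X - X.mean(axis=0)                 # centre each variable

    cov = np.cov(Xc, rowvar=False)          # 5 x 5 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: suited to symmetric matrices

    order = np.argsort(eigvals)[::-1]       # eigh returns ascending order; reverse it
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    scores = Xc @ eigvecs                   # principal components, uncorrelated by construction
    print(eigvals / eigvals.sum())          # fraction of variance per component, largest first

Because np.linalg.eigh returns eigenvalues in ascending order, the reordering step is what realizes the "first principal component has the largest possible variance" property stated in the definition.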
Mining Spatial Patterns of Distribution of Uranium in Surface and Ground Waters in Ukraine
A mathematical method which transforms a number of possibly correlated variables into a number of uncorrelated variables called principal components.
Explainable Safety Risk Management in Construction With Unsupervised Learning
An unsupervised method for identifying variables that preserve the maximum amount of variance in a dataset. PCA is useful for applications such as dimensionality reduction, minimizing information loss, and increasing the explainability of high-dimensional datasets.
Airbnb (Air Bed and Breakfast) Listing Analysis Through Machine Learning Techniques
A method used in data analysis to reduce the size of the data so that the dataset can be worked with effectively.
Criteria That Contribute to High Quality Teaching
Involves a mathematical procedure that transforms a number of possibly correlated variables into a number of uncorrelated variables called principal components, related to the original variables by an orthogonal transformation.
Automatic Classification of Impact-Echo Spectra I
A method for achieving a dimensionality reduction. It represents a set of N-dimensional data by means of their projections onto a set of r optimally defined axes (principal components). As these axes form an orthogonal set, PCA yields a data linear transformation. Principal components represent sources of variance in the data. Thus the most significant principal components show those data features which vary the most.
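A hedged sketch of this r-axis projection, using scikit-learn's PCA on synthetic data (the dataset and the choice r = 3 are invented for illustration): project onto the first r principal axes, back-project, and see how much of the data survives:

    # Illustrative dimensionality reduction and reconstruction with r axes.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 8)) @ rng.normal(size=(8, 8))  # correlated 8-D data

    r = 3
    pca = PCA(n_components=r)
    Z = pca.fit_transform(X)            # projections onto the r principal axes
    X_hat = pca.inverse_transform(Z)    # back-projection into the original 8-D space

    print(pca.explained_variance_ratio_)  # the most significant axes carry most variance
    print("relative reconstruction error:",
          np.linalg.norm(X - X_hat) / np.linalg.norm(X))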
Principal Components Analysis: Impact on Alternative Crypto-Currencies
Also known, depending on the field of application, as the Karhunen–Loève transform (KLT) or the Hotelling transform, it is a method from the family of data analysis and, more generally, of multivariate statistics, which consists of transforming variables linked to one another (termed "correlated" in statistics) into new variables that are uncorrelated with each other. These new variables are called "principal components" or principal axes. PCA allows the statistician to summarize information by reducing the number of variables. The approach is both geometric (the variables are represented in a new space, along directions of maximum inertia) and statistical (a search for the independent axes that best explain the variability, i.e. the variance, of the data). When compressing a set of N random variables, the first n axes of the principal component analysis are a better choice from the point of view of inertia or variance. The mathematical tool is applied in fields other than statistics and is sometimes called orthogonal eigenvalue decomposition or proper orthogonal decomposition (POD).
Recent Advancements in Gabor Wavelet-Based Face Recognition
An orthogonal linear transformation that transforms the data to a new coordinate system, preferably of lower dimension than the original.
Class-Dependent Principal Component Analysis
A dimensionality reduction method that transforms a set of possibly correlated variables into a new set of uncorrelated variables called principal components, each of which is a linear combination of the original variables. The first principal component has the largest possible variance; the second principal component is orthogonal to the first one and has the second largest variance, etc.
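In standard notation (a reconstruction, not taken from the chapter), with X the centred data matrix and w_k the k-th loading vector, this ordering is the solution of a sequence of constrained variance maximizations:

    w_1 = \arg\max_{\|w\| = 1} \operatorname{Var}(X w),
    w_k = \arg\max_{\|w\| = 1,\; w \perp w_1, \dots, w_{k-1}} \operatorname{Var}(X w), \qquad k = 2, \dots, p,

and the k-th principal component is the linear combination z_k = X w_k of the original variables.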
An Overview for Non-Negative Matrix Factorization
A mathematical procedure that uses an orthogonal transformation (by computing the eigenvectors and eigenvalues of the variance/covariance matrix) to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.
Hybrid Intrusion Detection System for Smart Home Applications
It refers to a statistical technique that transforms a set of possibly correlated variables into a set of linearly uncorrelated variables called principal components.
Visualisation of Large Image Databases
An orthogonal linear transform used to transform data into a new coordinate system which maximises the variance captured by the first few basis vectors (the principal components).
Continuous User Authentication on Touchscreen Using Behavioral Biometrics Utilizing Machine Learning Approaches
It is an algorithm that simplifies the complexity of high-dimensional data while retaining trends and patterns, by transforming the data into fewer dimensions that act as summaries of features.
Improving the Naïve Bayes Classifier
PCA is a popular tool for multivariate data analysis, feature extraction, and data compression. Given a set of multivariate measurements, the purpose of PCA is to find a set of variables with less redundancy, where redundancy is measured by correlations between data elements.
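That redundancy claim is easy to check numerically; in this small sketch on synthetic data (all names and the noise level are illustrative), the raw variables are strongly correlated while the principal component scores are not:

    # Illustrative check: PCA removes correlation-measured redundancy.
    import numpy as np

    rng = np.random.default_rng(2)
    t = rng.normal(size=500)                      # one underlying signal
    X = np.column_stack(                          # three noisy, redundant copies of it
        [t + 0.1 * rng.normal(size=500) for _ in range(3)])

    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt.T                            # principal component scores

    print(np.round(np.corrcoef(X, rowvar=False), 2))       # large off-diagonal correlations
    print(np.round(np.corrcoef(scores, rowvar=False), 2))  # approximately the identity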
Measuring Relative Efficiency and Effectiveness
A data reduction technique used to reduce a data set down to its uncorrelated, orthogonal components.
Applying Independent Component Analysis to the Artifact Detection Problem in Magnetoencephalogram Background Recordings
A linear transformation that separates a multivariate signal into component signals in such a way that the greatest variance of any projection of the data lies on the first component, the second greatest variance on the second component, and so on.
Strain Field Pattern Recognition for Structural Health Monitoring Applications
A dimensionality reduction technique based on a linear transformation, for representing high-dimensional, redundant, noisy data in a lower dimension while retaining most of the variability.
Significance Estimation in fMRI from Random Matrices
A linear orthogonal transformation that transforms the data to a new coordinate system whose axes point in the directions of maximal variance of the multivariate data. The objectives of PCA are (1) to identify meaningful underlying variables and (2) possibly to reduce the dimensionality of the data set.
Biometric: Authentication and Service to Cloud
A statistical procedure that uses orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.
Use of STEM Intervention Teaching Scenarios to Investigate Students' Attitudes Toward STEM Professions and Their Self-Evaluation of STEM Subjects
A statistical procedure that is used to identify which items or factors in a questionnaire are highly correlated with each other.
Effect of Artificial Intelligence Awareness on Job Performance with Employee Experience as a Mediating Variable
It is a statistical method that finds a new collection of variables by reducing the dimensionality of data.
Auto Associative Extreme Learning Machine Based Hybrids for Data Imputation
A very popular dimensionality reduction technique. It converts correlated variables into linearly uncorrelated variables, which are orthogonal to each other. Each principal component is a linear combination of the original (correlated) variables, so PCA is not a feature selection technique but a dimensionality reduction technique.
Biometric Template Security and Biometric Encryption Using Fuzzy Frameworks
A statistical procedure that uses orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original variables.
Adaptive Neural Algorithms for PCA and ICA
An orthogonal linear transform based on singular value decomposition that projects data to a subspace that preserves maximum variance.
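A minimal NumPy sketch of that SVD route (variable names are illustrative; the convention assumed here, not from the chapter, is that for centred data Xc = U S Vt, the rows of Vt are the principal axes and U*S the projections):

    # Illustrative PCA via singular value decomposition.
    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(60, 4)) @ rng.normal(size=(4, 4))  # correlated 4-D data
    Xc = X - X.mean(axis=0)

    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U * S                        # identical to projecting: Xc @ Vt.T
    var = S**2 / (len(Xc) - 1)            # per-component variance, descending

    print(np.allclose(scores, Xc @ Vt.T))  # True: both give the same projection
    print(var / var.sum())                 # maximum variance preserved by the first axes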