Dimension Reduction of Local Manifold Learning Algorithm for Hyperspectral Image Classification

Sheng Ding, Li Chen, Jun Li
DOI: 10.4018/978-1-4666-3958-4.ch009

Abstract

This chapter addresses hyperspectral image classification using local manifold learning methods. A manifold is a nonlinear, low-dimensional subspace supported by data samples; manifolds can be exploited to develop robust feature extraction and classification methods. Manifold coordinates are derived from local manifold learning methods (LLE and LE) for multiple data sets. With proper parameter selection and a sufficient number of features, the manifold learning methods, evaluated with k-nearest neighbor classification, produce an efficient and accurate data representation that yields higher classification accuracies than linear dimension reduction (PCA) methods for hyperspectral imagery.

1. Introduction

Airborne hyperspectral sensors can provide detailed measurements of the earth's surface at very high spectral resolution. This makes them powerful for applications requiring discrimination between subtle differences in ground cover (e.g., plant type differentiation, material quantification, and target detection). However, the large dimensional data spaces (up to several hundred bands) generated by these sensors can degrade classification accuracy. This is due to the curse of dimensionality (the Hughes effect) that characterizes this type of data (Bazi & Melgani, 2006; Campbell & Wynne, 2011; Melgani & Bruzzone, 2004; Ritter & Urcid, 2011). Over the past years, various solutions have been proposed to improve the classification of hyperspectral remote sensing imagery (Bazi & Melgani, 2006; Melgani & Bruzzone, 2004). Dimension reduction plays a key role in hyperspectral image classification: it mitigates the curse of dimensionality and other undesired properties of high-dimensional spaces (Jimenez & Landgrebe, 1997), and thereby facilitates classification, visualization, and compression of high-dimensional data. Traditionally, dimensionality reduction was performed using linear techniques such as principal component analysis (PCA) (Pearson, 1901), factor analysis (Spearman, 1904), and classical scaling (Torgerson, 1952). However, these linear techniques cannot adequately handle complex nonlinear data.

Motivated by the lack of a systematic comparison of dimensionality reduction techniques, this chapter presents a comparative study of the most important linear dimensionality reduction technique (PCA) and two nonlinear dimensionality reduction techniques. The aims of the chapter are (1) to investigate to what extent novel nonlinear dimensionality reduction techniques outperform the traditional PCA on real-world datasets and (2) to identify the inherent weaknesses of the nonlinear techniques. The investigation combines a theoretical and an empirical evaluation of the dimensionality reduction techniques; the identification rests on a careful analysis of the empirical results on a selection of real-world hyperspectral datasets.
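As a concrete point of reference for the linear baseline discussed above, the following sketch shows how PCA-based dimension reduction of a hyperspectral cube might look in practice. It is a minimal illustration using scikit-learn and randomly generated data; the image size, band count, and variance threshold are assumptions for illustration, not values from this chapter's experiments.

import numpy as np
from sklearn.decomposition import PCA

# Hypothetical hyperspectral cube: rows x cols x bands (shapes are illustrative only).
rows, cols, bands = 100, 100, 220
cube = np.random.rand(rows, cols, bands)  # stand-in for real reflectance data

# Flatten the spatial dimensions so each pixel becomes one sample with `bands` features.
X = cube.reshape(-1, bands)

# Linear dimension reduction: retain enough components to explain ~99% of the variance.
pca = PCA(n_components=0.99)
X_reduced = pca.fit_transform(X)

print(f"Reduced {bands} bands to {X_reduced.shape[1]} principal components")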

In this chapter, two local manifold learning methods that are widely used in the machine learning community are investigated in a comparative study for the classification of hyperspectral data. These two methods, locally linear embedding (LLE) and Laplacian eigenmaps (LE), are applied to two hyperspectral data sets that differ in acquisition characteristics (airborne vs. spaceborne), ground resolution, number of spectral bands, and scene characteristics. The manifold coordinates are compared with the coordinates of the original data and with linear feature extraction methods using k-nearest neighbor classification results.
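A minimal sketch of this workflow is given below, under stated assumptions: it uses scikit-learn's LocallyLinearEmbedding for LLE and SpectralEmbedding for a Laplacian-eigenmaps-style embedding, followed by k-nearest neighbor classification. The neighborhood size, embedding dimension, and synthetic labels are illustrative assumptions and do not reproduce the chapter's actual parameter choices or data.

import numpy as np
from sklearn.manifold import LocallyLinearEmbedding, SpectralEmbedding
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for labeled hyperspectral pixels: n samples x d bands.
rng = np.random.default_rng(0)
X = rng.random((1000, 200))           # hypothetical spectra
y = rng.integers(0, 5, size=1000)     # hypothetical class labels

embeddings = {
    "LLE": LocallyLinearEmbedding(n_neighbors=12, n_components=10),
    "LE": SpectralEmbedding(n_neighbors=12, n_components=10),  # Laplacian eigenmaps
}

for name, emb in embeddings.items():
    # Compute manifold coordinates for all samples, then evaluate with k-NN.
    Z = emb.fit_transform(X)
    Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, test_size=0.3, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=5).fit(Z_tr, y_tr)
    print(f"{name}: k-NN accuracy = {knn.score(Z_te, y_te):.3f}")

Because LLE and LE have no native out-of-sample extension, the embedding here is computed over all samples before the train/test split, mirroring the transductive evaluation common in manifold learning studies.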
