A Classification Algorithm Based on Improved Locally Linear Embedding


Hui Wang, Tie Cai, Dongsheng Cheng, Kangshun Li, Ying Zhou
DOI: 10.4018/IJCINI.344020

Abstract

Current classification algorithms struggle with high-dimensional data, so we design a dimensionality-reduction method. In locally linear embedding (LLE), local optima gradually approach the global optimum; in particular, the complicated manifold-learning problems that arise in big-data dimensionality reduction require an optimization method to tune the number of nearest neighbors k and the embedding dimension. We therefore use orthogonal mapping to find the optimal neighbor count k, and we design a particle-swarm LLE with constraint handling based on the Lebesgue measure to improve the accuracy of manifold-learning algorithms. On this basis, we propose a classification algorithm built on the improved locally linear embedding. Experimental results show that the proposed classification algorithm outperforms the compared algorithms.
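The particle-swarm component described above can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' algorithm: a plain 1-D particle swarm searches for the neighbor count k that minimizes a cost function, where a toy quadratic stands in for the LLE reconstruction error that the paper actually optimizes.

```python
import numpy as np

def pso_minimize(cost, lo, hi, n_particles=20, n_iters=50, seed=0):
    """Minimal particle swarm over a 1-D search interval [lo, hi]."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lo, hi, n_particles)         # particle positions
    vel = np.zeros(n_particles)                    # particle velocities
    pbest = pos.copy()                             # personal best positions
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()]             # global best position
    for _ in range(n_iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        # inertia + cognitive pull toward pbest + social pull toward gbest
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        c = np.array([cost(p) for p in pos])
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
        gbest = pbest[pbest_cost.argmin()]
    return gbest

# Toy cost: pretend the reconstruction error is minimized at k = 12.
best_k = round(pso_minimize(lambda k: (k - 12.0) ** 2, 1, 30))
```

In the paper's setting the cost would instead evaluate the LLE embedding quality for a candidate k, with the Lebesgue-measure constraint handling applied to infeasible particles.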

The Improved Locally Linear Embedding

The most basic object of study in topology is the topological space: a set equipped with a topological structure, between which continuous mappings can be defined. Classical manifold-learning algorithms include isometric feature mapping (ISOMAP), locally linear embedding (LLE), and Laplacian eigenmaps (LEM).
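As a minimal sketch of the classic LLE algorithm that the improved method builds on (pure NumPy, not the paper's improved variant), each point is reconstructed from its k nearest neighbors, and low-dimensional coordinates are then chosen to preserve those reconstruction weights:

```python
import numpy as np

def lle(X, k=10, d=2, reg=1e-3):
    """Classic locally linear embedding of the rows of X into d dimensions."""
    n = X.shape[0]
    # pairwise distances -> k nearest neighbours of each point (excluding itself)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbours = np.argsort(dist, axis=1)[:, 1:k + 1]
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[neighbours[i]] - X[i]            # centred neighbourhood
        G = Z @ Z.T                            # local Gram matrix
        G += reg * np.trace(G) * np.eye(k)     # regularise for numerical stability
        w = np.linalg.solve(G, np.ones(k))     # reconstruction weights
        W[i, neighbours[i]] = w / w.sum()      # normalise to sum to one
    # embedding = bottom eigenvectors of (I - W)^T (I - W),
    # skipping the constant eigenvector with eigenvalue ~0
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:d + 1]

# toy example: noisy points on a 1-D curve in 3-D, reduced to 1-D
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 3, 80))
X = np.c_[np.cos(t), np.sin(t), t] + 0.01 * rng.standard_normal((80, 3))
Y = lle(X, k=8, d=1)
```

The improved algorithm in this article replaces the fixed choice of k with a particle-swarm search, which is the part sketched separately above; here k is simply passed in by hand.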
