Research Article

Extensions and Analysis of Local Non-linear Techniques

by Rashmi Gupta, Rajiv Kapoor
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 51 - Number 13
Year of Publication: 2012
Authors: Rashmi Gupta, Rajiv Kapoor
DOI: 10.5120/8099-1687

Rashmi Gupta and Rajiv Kapoor. Extensions and Analysis of Local Non-linear Techniques. International Journal of Computer Applications 51, 13 (August 2012), 1-6. DOI=10.5120/8099-1687

@article{ 10.5120/8099-1687,
author = { Rashmi Gupta, Rajiv Kapoor },
title = { Extensions and Analysis of Local Non-linear Techniques },
journal = { International Journal of Computer Applications },
issue_date = { August 2012 },
volume = { 51 },
number = { 13 },
month = { August },
year = { 2012 },
issn = { 0975-8887 },
pages = { 1-6 },
numpages = { 6 },
url = { https://ijcaonline.org/archives/volume51/number13/8099-1687/ },
doi = { 10.5120/8099-1687 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Rashmi Gupta
%A Rajiv Kapoor
%T Extensions and Analysis of Local Non-linear Techniques
%J International Journal of Computer Applications
%@ 0975-8887
%V 51
%N 13
%P 1-6
%D 2012
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The techniques Conformal Eigenmap and Neighborhood Preserving Embedding (NPE) are presented as extensions of local non-linear dimensionality reduction techniques. Many commonly used non-linear dimensionality reduction methods, such as Local Linear Embedding (LLE) and the Laplacian Eigenmap, are not explicitly designed to preserve local features such as distances or angles. In the first proposed technique, Conformal Eigenmap, a low-dimensional embedding is constructed that maximally preserves the angles between nearby data points; the embedding is derived from the bottom eigenvectors of LLE by solving an additional Semidefinite Programming (SDP) problem. In the second proposed method, NPE minimizes the cost function of a local non-linear dimensionality reduction technique under the constraint that the mapping from the high-dimensional to the low-dimensional representation is linear; the idea is to modify LLE by introducing a linear transformation matrix. Experimental results on several synthetic datasets demonstrate the effectiveness and merits of the proposed techniques.
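As a concrete illustration of the NPE idea summarized above (the LLE reconstruction cost minimized under a linear-map constraint), the following Python sketch follows the standard three-step formulation of He et al. [20]: build a k-nearest-neighbour graph, compute LLE reconstruction weights, and solve a generalized eigenproblem for the linear projection. This is an illustrative sketch rather than the authors' implementation; the function name npe and the parameters n_neighbors, n_components and reg are chosen here for exposition, and the SDP step of the Conformal Eigenmap technique is not shown.

    import numpy as np
    from scipy.linalg import eigh, solve
    from sklearn.datasets import make_s_curve
    from sklearn.neighbors import NearestNeighbors

    def npe(X, n_neighbors=12, n_components=2, reg=1e-3):
        # X: (n_samples, n_features) data matrix.
        n, d = X.shape

        # Step 1: k-nearest-neighbour graph (column 0 is the point itself, so drop it).
        nbrs = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
        idx = nbrs.kneighbors(X, return_distance=False)[:, 1:]

        # Step 2: LLE-style reconstruction weights; each row of W sums to one.
        W = np.zeros((n, n))
        for i in range(n):
            Z = X[idx[i]] - X[i]                            # neighbours centred at x_i
            C = Z @ Z.T                                     # local Gram matrix
            C += reg * np.trace(C) * np.eye(n_neighbors)    # regularise if near-singular
            w = solve(C, np.ones(n_neighbors))
            W[i, idx[i]] = w / w.sum()

        # Step 3: linear map minimising the LLE cost. With M = (I - W)^T (I - W),
        # solve the generalised eigenproblem  X^T M X a = lambda X^T X a  and keep
        # the eigenvectors of the smallest eigenvalues as the projection matrix A.
        M = (np.eye(n) - W).T @ (np.eye(n) - W)
        lhs = X.T @ M @ X
        rhs = X.T @ X + reg * np.eye(d)     # small ridge keeps the right-hand side positive definite
        _, vecs = eigh(lhs, rhs)            # eigenvalues returned in ascending order
        A = vecs[:, :n_components]
        return A, X @ A                     # projection matrix and embedded points

    # Illustrative run on a synthetic S-curve manifold.
    X, _ = make_s_curve(n_samples=800, random_state=0)
    A, Y = npe(X, n_neighbors=12, n_components=2)
    print(Y.shape)   # (800, 2)

Because the mapping is the explicit linear matrix A rather than an eigenvector per training point, such a projection also applies directly to new, out-of-sample points; the conformal refinement would instead take the bottom LLE eigenvectors and fit a mixing matrix with a general-purpose SDP solver.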

References
  1. Jain, A. K., Duin, R. P. W., and Mao, J. 2000. Statistical Pattern Recognition: A Review, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 1, 4–37.
  2. Jimenez, L. O. and Landgrebe, D. A. 1997. Supervised classification in high-dimensional space: geometrical, statistical, and asymptotical properties of multivariate data, IEEE Transactions on Systems, Man and Cybernetics, vol. 28, no. 1, 39–54.
  3. Belhumeur, P. N., Hespanha, J. P., and Kriegman, D. J. 1997. Eigenfaces vs. Fisherfaces: recognition using class specific linear projection, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, 711–720.
  4. Torgerson, W. S. 1952. Multidimensional scaling I: Theory and method, Psychometrika, vol. 17, 401–419.
  5. Burges, C. J. C. 2005. Data Mining and Knowledge Discovery Handbook: A Complete Guide for Practitioners and Researchers, chapter Geometric Methods for Feature Selection and Dimensional Reduction: A Guided Tour. Kluwer Academic Publishers.
  6. Saul, L. K., Weinberger, K. Q., Ham, J. H., Sha, F., and Lee, D. D. 2006. Spectral methods for dimensionality reduction, In Semisupervised Learning, Cambridge, MA: The MIT Press.
  7. Maaten, L. J. P., Postma, E. O., and Herik, H. J. 2008. Dimensionality reduction: A comparative review.
  8. Scholkopf, B., Smola, A., and Muller, K. R. 1998. Nonlinear Component Analysis as a Kernel Eigenvalue Problem, Neural Computation, vol. 10, no. 5, 1299–1319.
  9. Baudat, G. and Anouar, F. 2000. Generalized discriminant analysis using a kernel approach, Neural Computation, vol. 12, 2385–2404.
  10. He, X. and Niyogi, P. 2003. Locality Preserving Projections, Advances in Neural Information Processing Systems 16, Vancouver, British Columbia, Canada.
  11. Roweis, S. T. and Saul, L. K. 2000. Nonlinear dimensionality reduction by Locally Linear Embedding, Science, vol. 290, no. 5500, 2323–2326.
  12. Tenenbaum, J. B., Silva, V. de, and Langford, J. C. 2000. A global geometric framework for nonlinear dimensionality reduction, Science, vol. 290, no. 5500, 2319–2323.
  13. Belkin, M. and Niyogi, P. 2000. Laplacian Eigenmaps and spectral techniques for embedding and clustering, In Advances in Neural Information Processing Systems, vol. 14, 585–591, Cambridge, MA: The MIT Press.
  14. Bengio, Y., Paiement, J.-F., Vincent, P., Delalleau, O., Le Roux, N., and Ouimet, M. 2003. Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering, Neural Information Processing Systems.
  15. Silva, V. de and Tenenbaum, J. B. 2003. Global versus local methods in nonlinear dimensionality reduction, Advances in Neural Information Processing Systems 15, 721–728, Cambridge, MA: The MIT Press.
  16. Donoho, D. L. and Grimes, C. E. 2002. When does Isomap recover the natural parameterization of families of articulated images? Technical Report 2002-27, Department of Statistics, Stanford University.
  17. Donoho, D. L. and Grimes, C. E. 2003. Hessian eigenmaps: locally linear embedding techniques for high-dimensional data, Proceedings of the National Academy of Sciences, vol. 100, 5591–5596.
  18. Sha, F. and Saul, L. K. 2005. Analysis and Extensions of Spectral Methods for Nonlinear Dimensionality Reduction, Proceedings of the 22nd International Conference on Machine Learning, Germany.
  19. Vandenberghe, L. and Boyd, S. 1996. Semidefinite programming, SIAM Review, vol. 38, no. 1, 49–95.
  20. He, X., Cai, D., Yan, S., and Zhang, H.-J. 2005. Neighborhood preserving embedding, Proceedings of the Tenth IEEE International Conference on Computer Vision, Piscataway: IEEE Press, 1208–1213.
Index Terms

Computer Science
Information Sciences

Keywords

Dimension reduction, Manifold Learning, Conformal Eigenmap, Neighborhood Preserving Embedding, Local Linear Embedding, Laplacian Eigenmap