Frontiers of Electrical and Electronic Engineering

ISSN 2095-2732

ISSN 2095-2740(Online)

CN 10-1028/TM

Front Elect Electr Eng Chin    2011, Vol. 6 Issue (1) : 27-42    https://doi.org/10.1007/s11460-011-0124-4
RESEARCH ARTICLE
Tangent space learning and generalization
Xiaofei HE(), Binbin LIN
State Key Lab of CAD & CG, College of Computer Science, Zhejiang University, Hangzhou 310058, China
Abstract

Manifold learning has attracted considerable attention over the last decade; exploring the geometry and topology of the manifold is its central problem. The tangent space is a fundamental tool for discovering the geometry of the manifold. In this paper, we first review canonical manifold learning techniques and then discuss two fundamental problems in tangent space learning: how to estimate the tangent space from random samples, and how to generalize the tangent space to the ambient space. Previous studies in tangent space learning have mainly focused on how to fit the tangent space, and one has to solve a global equation to obtain the tangent spaces. Unlike these approaches, we introduce a novel method, called persistent tangent space learning (PTSL), which estimates the tangent space at each local neighborhood while ensuring that the tangent spaces vary smoothly on the manifold. A tangent space can be viewed as a point on the Grassmann manifold. Inspired by statistics on the Grassmann manifold, we use the intrinsic sample total variance to measure the variation of the estimated tangent spaces at a single point; thus, the generalization problem can be solved by estimating the intrinsic sample mean on the Grassmann manifold. We validate our methods with various experiments on both synthetic and real data.
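The two problems the abstract names can be illustrated concretely. A minimal sketch, not the paper's PTSL algorithm: the tangent space at a point is estimated by local PCA (the top-d right singular vectors of the centered neighborhood span the best-fit d-plane), and several such subspaces are averaged by summing their projection matrices, a simple extrinsic surrogate for the intrinsic (Karcher) mean on the Grassmann manifold. The function names and the toy data are illustrative assumptions.

```python
import numpy as np

def estimate_tangent_space(neighbors, center, d):
    # Local PCA: the top-d right singular vectors of the centered
    # neighborhood span the best-fit d-dimensional tangent plane.
    X = neighbors - center
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:d].T  # (D, d) orthonormal basis

def subspace_mean(bases, d):
    # Average the projection matrices U U^T of several subspaces and
    # re-extract a rank-d subspace: an extrinsic surrogate for the
    # intrinsic sample mean on the Grassmann manifold.
    P = sum(U @ U.T for U in bases) / len(bases)
    w, V = np.linalg.eigh(P)
    return V[:, np.argsort(w)[::-1][:d]]  # top-d eigenvectors

rng = np.random.default_rng(0)
# Noisy samples near the xy-plane in R^3 (true tangent space = xy-plane).
pts = np.c_[rng.normal(size=(200, 2)), 0.01 * rng.normal(size=200)]
U = estimate_tangent_space(pts, pts.mean(axis=0), d=2)
# Largest principal angle between the estimate and the true xy-plane:
cos_angles = np.linalg.svd(U.T @ np.eye(3)[:, :2], compute_uv=False)
angle_deg = np.degrees(np.arccos(np.clip(cos_angles.min(), -1.0, 1.0)))
print(angle_deg < 5.0)
```

With noise two orders of magnitude below the in-plane spread, the recovered subspace deviates from the true tangent plane by well under five degrees.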

Keywords: tangent space learning; machine learning; manifold learning
Corresponding Author(s): HE Xiaofei, Email: xiaofeihe@cad.zju.edu.cn
Issue Date: 05 March 2011
 Cite this article:   
Xiaofei HE, Binbin LIN. Tangent space learning and generalization[J]. Front Elect Electr Eng Chin, 2011, 6(1): 27-42.
 URL:  
https://academic.hep.com.cn/fee/EN/10.1007/s11460-011-0124-4
https://academic.hep.com.cn/fee/EN/Y2011/V6/I1/27