Dimensionality reduction with adaptive graph

Lishan QIAO (1), Limei ZHANG (1), Songcan CHEN (2)

1. Department of Mathematics Science, Liaocheng University, Liaocheng 252000, China
2. Department of Computer Science and Engineering, Nanjing University of Aeronautics & Astronautics, Nanjing 210016, China

Abstract Graph-based dimensionality reduction (DR) methods have been applied successfully to many practical problems, such as face recognition, where the graph plays a crucial role in modeling the data distribution or structure. In practice, however, the ideal graph is difficult to discover: one usually has to construct a graph empirically according to various motivations, priors, or assumptions, independently of the subsequent computation of the DR mapping. In contrast to previous work, in this paper we attempt to learn a graph that is closely linked with the DR process, and propose an algorithm called dimensionality reduction with adaptive graph (DRAG), whose idea is to learn, while seeking the projection matrix, a graph in the neighborhood of a pre-specified one. The pre-specified graph is treated as a noisy observation of the ideal graph, and the squared Frobenius divergence is used to measure their difference in the objective function. As a result, we obtain an elegant graph-update formula that naturally fuses information from the original and transformed data. In particular, the optimal graph is shown to be a weighted sum of the pre-defined graph in the original space and a new graph depending on the transformed space. Empirical results on several face datasets demonstrate the effectiveness of the proposed algorithm.
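To make the alternating idea in the abstract concrete, the following is a minimal sketch under assumed simplifications: it pairs an LPP-style projection step with a graph update that convexly mixes a pre-specified heat-kernel k-NN graph with one built in the projected space. The mixing weight alpha, the kernel width sigma, the neighborhood size k, and the helper names (knn_heat_graph, drag_like) are illustrative assumptions, not the paper's exact formulation, which derives the graph update from the squared Frobenius divergence term in the objective.

```python
# A minimal sketch of the alternating "adaptive graph" idea described above.
# NOT the paper's exact DRAG update: we assume an LPP-style projection step
# and a simple convex mixing (weight alpha) of the pre-specified graph with
# a heat-kernel graph built in the projected space.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def knn_heat_graph(Z, k=5, sigma=1.0):
    """k-NN graph with heat-kernel weights on the rows of Z."""
    D2 = cdist(Z, Z, "sqeuclidean")
    W = np.exp(-D2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # keep only the k largest weights in each row, then symmetrize
    drop = np.argsort(W, axis=1)[:, :-k]
    np.put_along_axis(W, drop, 0.0, axis=1)
    return np.maximum(W, W.T)


def drag_like(X, dim=10, alpha=0.5, n_iter=5, k=5, sigma=1.0, reg=1e-6):
    """Alternate between a projection step and a graph-update step."""
    W0 = knn_heat_graph(X, k, sigma)      # pre-specified graph (original space)
    W = W0.copy()
    for _ in range(n_iter):
        # 1) fix the graph, solve the LPP-style generalized eigenproblem
        #    X^T L X p = lambda X^T D X p for the projection matrix P
        D = np.diag(W.sum(axis=1))
        L = D - W
        A = X.T @ L @ X
        B = X.T @ D @ X + reg * np.eye(X.shape[1])
        _, vecs = eigh(A, B)
        P = vecs[:, :dim]                 # smallest eigenvalues preserve locality
        # 2) fix P, update the graph as a weighted sum of the pre-defined
        #    graph and a new graph built in the transformed space
        W = (1.0 - alpha) * W0 + alpha * knn_heat_graph(X @ P, k, sigma)
    return P, W
```

For example, P, W = drag_like(X, dim=30) returns a 30-dimensional projection together with the learned graph; in DRAG itself the balance between the two graphs follows from the objective rather than being fixed by hand as alpha is here.
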
Keywords: dimensionality reduction, graph construction, face recognition

Corresponding author: Songcan CHEN, Email: s.chen@nuaa.edu.cn

Issue Date: 01 October 2013