Frontiers of Computer Science

Front. Comput. Sci., 2014, Vol. 8, Issue 5: 793-806. https://doi.org/10.1007/s11704-014-3161-3
RESEARCH ARTICLE
Training SVMs on a bound vectors set based on Fisher projection
Xu YU 1,*, Jing YANG 2, Zhiqiang XIE 3
1. School of Information Science and Technology, Qingdao University of Science and Technology, Qingdao 266061, China
2. College of Computer Science and Technology, Harbin Engineering University, Harbin 150001, China
3. College of Computer Science and Technology, Harbin University of Science and Technology, Harbin 150080, China
Abstract

Standard support vector machine (SVM) training algorithms have O(l³) computational and O(l²) space complexity, where l is the training set size, which makes them computationally infeasible on very large data sets. To alleviate the computational burden of SVM training, we propose an algorithm that trains SVMs on a bound vectors set extracted by Fisher projection. For linearly separable problems, we compute the projection line with the linear Fisher discriminant; for nonlinearly separable problems, we compute it with the kernel Fisher discriminant. In each case, we select as bound vectors a fixed ratio of the samples whose projections are adjacent to those of the other class. Theoretical analysis shows that the proposed algorithm has low computational and space complexity, and extensive experiments on several classification benchmarks demonstrate the effectiveness of our approach.
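
The bound-vector idea described in the abstract can be illustrated with a minimal sketch. The Python code below is not the authors' implementation: the helpers fisher_direction and bound_vectors, the 20% selection ratio, the ridge term added to the scatter matrix, the synthetic Gaussian data, and the RBF-kernel SVM are all illustrative assumptions, and the kernel Fisher discriminant case treated in the paper is omitted here.

```python
# Illustrative sketch: extract "bound vectors" via a linear Fisher projection,
# then train an SVM on that reduced set. Not the authors' implementation.
import numpy as np
from sklearn.svm import SVC

def fisher_direction(X, y):
    """Linear Fisher discriminant direction w = Sw^{-1} (mu1 - mu0)."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix, with a small ridge for numerical stability.
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    return np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), mu1 - mu0)

def bound_vectors(X, y, ratio=0.2):
    """Indices of the `ratio` fraction of each class whose Fisher projections
    lie closest to the other class (the assumed bound vectors)."""
    w = fisher_direction(X, y)
    p = X @ w
    if p[y == 0].mean() > p[y == 1].mean():
        p = -p                                   # orient so class 1 projects higher
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    k0 = max(1, int(ratio * len(idx0)))
    k1 = max(1, int(ratio * len(idx1)))
    keep0 = idx0[np.argsort(p[idx0])[-k0:]]      # class-0 samples nearest class 1
    keep1 = idx1[np.argsort(p[idx1])[:k1]]       # class-1 samples nearest class 0
    return np.concatenate([keep0, keep1])

# Toy usage on synthetic Gaussian data (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (500, 5)), rng.normal(2.0, 1.0, (500, 5))])
y = np.array([0] * 500 + [1] * 500)
bv = bound_vectors(X, y, ratio=0.2)
clf = SVC(kernel="rbf").fit(X[bv], y[bv])        # SVM trained on bound vectors only
print(clf.score(X, y))
```

Training on roughly 20% of the samples in this sketch keeps the points near the class boundary, where support vectors are expected to lie, which is the intuition behind reducing the quadratic-programming problem to the bound vectors set.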

Keywords: support vector machines; bound vectors set; Fisher discriminant; sequential minimal optimization
Corresponding Author(s): Xu YU   
Issue Date: 11 October 2014
 Cite this article:   
Xu YU, Jing YANG, Zhiqiang XIE. Training SVMs on a bound vectors set based on Fisher projection[J]. Front. Comput. Sci., 2014, 8(5): 793-806.
 URL:  
https://academic.hep.com.cn/fcs/EN/10.1007/s11704-014-3161-3
https://academic.hep.com.cn/fcs/EN/Y2014/V8/I5/793