Frontiers of Computer Science


Front Comput Sci    2012, Vol. 6 Issue (2) : 143-153    https://doi.org/10.1007/s11704-012-2857-5
RESEARCH ARTICLE
Active improvement of hierarchical object features under budget constraints
Nicolas CEBRON
Multimedia Computing Lab, University of Augsburg, 86159 Augsburg, Germany
Abstract

When we think of an object in a supervised learning setting, we usually perceive it as a collection of fixed attribute values. Although this setting is well suited to many classification tasks, we propose a new object representation, and with it a new challenge in data mining: an object is no longer described by a single set of attributes but is represented by a hierarchy of attribute sets at different levels of quality. Obtaining a more detailed representation of an object comes at a cost, which raises the question of which objects should be enhanced under a given budget and cost model. This setting is useful whenever resources such as computing power, memory, or time are limited. We propose a new active adaptive algorithm (AAA) that improves objects in an iterative fashion. We demonstrate how to create hierarchical object representations and show the effectiveness of our selection algorithm on the resulting datasets.
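The paper itself provides no code; the following is a minimal Python sketch of the budget-constrained refinement loop the abstract describes. All names (HObject, refine_cost, uncertainty, improve_objects) are hypothetical, the exponential cost model and the uncertainty-based selection criterion are illustrative assumptions, and the classifier is assumed to expose a scikit-learn-style predict_proba method.

```python
# Sketch (not the paper's algorithm) of iteratively improving hierarchically
# represented objects under a budget: each object has feature sets at several
# quality levels, and acquiring the next level consumes part of the budget.

from dataclasses import dataclass
from typing import List


@dataclass
class HObject:
    """An object whose features exist at increasingly detailed levels."""
    feature_levels: List[List[float]]  # level 0 = coarsest representation
    level: int = 0                     # currently available level

    @property
    def features(self) -> List[float]:
        return self.feature_levels[self.level]


def refine_cost(obj: HObject) -> float:
    # Assumed cost model: obtaining the next level costs more the deeper we go.
    return 2.0 ** (obj.level + 1)


def uncertainty(obj: HObject, classifier) -> float:
    # Placeholder selection criterion: closeness of the predicted class
    # probability to the decision boundary (0.5 for two classes).
    p = classifier.predict_proba([obj.features])[0][1]
    return 1.0 - 2.0 * abs(p - 0.5)


def improve_objects(objects: List[HObject], classifier, budget: float) -> None:
    """Iteratively spend the budget on the most uncertain refinable objects."""
    while budget > 0:
        # Objects that still have a more detailed level and are affordable.
        candidates = [o for o in objects
                      if o.level + 1 < len(o.feature_levels)
                      and refine_cost(o) <= budget]
        if not candidates:
            break
        best = max(candidates, key=lambda o: uncertainty(o, classifier))
        budget -= refine_cost(best)
        best.level += 1  # acquire the more detailed feature set
        # In the setting described by the abstract, the classifier would be
        # retrained here on the improved representations before the next step.
```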

Keywords: object hierarchy, machine learning, active learning
Corresponding Author(s): Nicolas CEBRON, Email: cebron@informatik.uni-augsburg.de
Issue Date: 01 April 2012
 Cite this article:   
Nicolas CEBRON. Active improvement of hierarchical object features under budget constraints[J]. Front Comput Sci, 2012, 6(2): 143-153.
 URL:  
https://academic.hep.com.cn/fcs/EN/10.1007/s11704-012-2857-5
https://academic.hep.com.cn/fcs/EN/Y2012/V6/I2/143