Frontiers of Computer Science

ISSN 2095-2228

ISSN 2095-2236(Online)

CN 10-1014/TP


Front. Comput. Sci.    2025, Vol. 19 Issue (2) : 192307    https://doi.org/10.1007/s11704-023-3228-0
Artificial Intelligence
Nonconvex and discriminative transfer subspace learning for unsupervised domain adaptation
Yueying LIU, Tingjin LUO
College of Science, National University of Defense Technology, Changsha 410073, China
Abstract

Unsupervised transfer subspace learning is an important and challenging topic in domain adaptation, which aims to classify unlabeled target data by using source domain information. Traditional transfer subspace learning methods often impose low-rank constraints, i.e., the trace norm, to preserve the structural information of data from different domains. However, the trace norm is only a convex surrogate of the ideal low-rank constraint, and minimizing it may make the solutions deviate seriously from the original optima. In addition, traditional methods directly use the strict labels of the source domain, which makes it difficult to deal with label noise. To solve these problems, we propose a novel nonconvex and discriminative transfer subspace learning method, named NDTSL, by incorporating the Schatten-p norm and a soft label matrix. Specifically, the Schatten-p norm is imposed to approximate the low-rank constraint and obtain a better low-rank representation. Then, we design and adopt a soft label matrix in the source domain to learn a more flexible classifier and enhance the discriminative ability on the target data. Besides, due to the nonconvexity of the Schatten-p norm, we design an efficient alternating algorithm based on the inexact augmented Lagrange multiplier (IALM) method to solve it. Finally, experimental results on several public transfer tasks demonstrate the effectiveness of NDTSL compared with several state-of-the-art methods.
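The flavor of the low-rank step that IALM-style solvers alternate over can be illustrated with the classic singular value thresholding (SVT) operator, which is the proximal operator of the trace norm (the convex p = 1 case); the paper's nonconvex Schatten-p step replaces this soft shrinkage of singular values with a p-dependent shrinkage. A minimal NumPy sketch for intuition only, not the paper's solver:

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the proximal operator of tau * ||X||_*.
    This is the trace-norm (p = 1) special case; a Schatten-p step for
    p < 1 would apply a nonconvex shrinkage to the singular values instead."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

X = np.random.default_rng(0).standard_normal((5, 4))
Z = svt(X, tau=0.5)
# SVT shrinks every singular value by tau (clipping at zero), so the
# singular values of Z are exactly max(sigma_i(X) - tau, 0), which
# lowers the rank whenever some sigma_i(X) <= tau.
```

Because small singular values are zeroed out exactly, repeated SVT steps drive the representation matrix toward low rank, which is why this operator appears inside low-rank subspace solvers.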

Keywords transfer subspace learning      unsupervised domain adaptation      low-rank modeling      nonconvex optimization     
Corresponding Author(s): Tingjin LUO   
Just Accepted Date: 19 December 2023   Issue Date: 22 April 2024
 Cite this article:   
Yueying LIU, Tingjin LUO. Nonconvex and discriminative transfer subspace learning for unsupervised domain adaptation[J]. Front. Comput. Sci., 2025, 19(2): 192307.
 URL:  
https://academic.hep.com.cn/fcs/EN/10.1007/s11704-023-3228-0
https://academic.hep.com.cn/fcs/EN/Y2025/V19/I2/192307
Fig.1  The framework of the NDTSL method. The Schatten-p norm imposes a low-rank constraint on Z to preserve the original data structure; the l1 norm makes the noise matrix E sparse and ensures the robustness of our model
Description        Formula
Frobenius norm     ‖A‖_F = (Σ_i σ_i(A)²)^{1/2}
l1 norm            ‖A‖_1 = Σ_{i,j} |A_{ij}|
Trace norm         ‖A‖_* = Σ_i σ_i(A)
Schatten-p norm    ‖A‖_{S_p}^p = Σ_{i=1}^{min{m,n}} σ_i^p(A)
Tab.1  Formulas of the various norms, where σ_i(A) denotes the i-th singular value of A ∈ ℝ^{m×n}
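All of the norms in Tab.1 except the element-wise l1 norm are functions of the singular values, so they can be evaluated from a single SVD. A small NumPy sketch (illustrative only, not the paper's code):

```python
import numpy as np

def schatten_p_norm(A, p):
    """Schatten-p norm raised to the p-th power, ||A||_{S_p}^p:
    the sum of the p-th powers of the singular values of A."""
    s = np.linalg.svd(A, compute_uv=False)
    return np.sum(s ** p)

A = np.random.default_rng(0).standard_normal((4, 3))
s = np.linalg.svd(A, compute_uv=False)

fro   = np.sqrt(np.sum(s ** 2))    # Frobenius norm
l1    = np.sum(np.abs(A))          # element-wise l1 norm
trace = np.sum(s)                  # trace (nuclear) norm
sp    = schatten_p_norm(A, 0.5)    # Schatten-p with p = 0.5

# Sanity checks: p = 1 recovers the trace norm, and the square root of
# the p = 2 case recovers the Frobenius norm.
assert np.isclose(schatten_p_norm(A, 1.0), trace)
assert np.isclose(np.sqrt(schatten_p_norm(A, 2.0)), fro)
```

As p decreases below 1, σ_i^p penalizes large singular values less and less, so the Schatten-p norm approximates the rank function (which counts nonzero singular values) more closely than the trace norm does.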
  
Dataset      #Samples  #Features  #Classes  Subspace(s)
Office       1410      800        10        A, W, D
Caltech-256  1123      800        10        C
VOC2007      3376      4096       5         V
LabelMe      2656      4096       5         L
SUN09        3282      4096       5         S
Tab.2  Basic description of the datasets
Dataset Measures NN TCA JDA W-BDA JPDA GFK DTSL FSSP JLRFS PAS LRRCA NDTSL
Accuracy .3523± .0180? .3752± .0297? .3747± .0290? .4200± .0302? .3911± .0302? .3833± .0235? .5189± .0204? .2660± .0207? .4894± .0178? .4512± .0251? .5193± .0205? .5278± .0190
CA Precision .3465± .0246? .3862± .0313? .3792± .0282? .4139± .0301? .3971± .0317? .4040± .0208? .5307± .0235? .2872± .0240? .4945± .0197? .4923± .0280? .5302± .0246? .5404± .0233
Recall .3563± .0183? .3814± .0300? .3817± .0229? .4250± .0270? .3970± .0285? .3912± .0248? .5211± .0222? .2656± .0233? .4905± .0172? .4512± .0226? .5217± .0228? .5302± .0219
Macro-F1 .3382± .0209? .3764± .0298? .3724± .0254? .4137± .0283? .3867± .0287? .3882± .0218? .5114± .0212? .2633± .0219? .4791± .0176? .4363± .0247? .5113± .0217? .5200± .0207
Accuracy .3022± .0432? .2639± .0464? .3250± .0391? .3117± .0424? .2361± .0239? .2539± .0323? .3689± .0406? .2467± .0439? .3856± .0408 .3728± .0350 .3661± .0412? .3717± .0435
CW Precision .2882± .0422? .2736± .0556? .3269± .0444? .3157± .0446? .2263± .0359? .2990± .0407? .3910± .0477? .2546± .0491? .4157± .0495 .4431± .0345 .3895± .0481? .3926± .0497
Recall .3125± .0362? .2501± .0549? .3242± .0413? .3238± .0469? .2352± .0379? .2450± .0320? .3766± .0392? .2626± .0434? .3957± .0421 .3780± .0359 .3738± .0389? .3791± .0416
Macro-F1 .2790± .0337? .2501± .0515? .3079± .0415? .3055± .0427? .2187± .0316? .2537± .0327? .3613± .0366? .2456± .0428? .3896± .0416 .3907± .0318 .3594± .0366? .3632± .0388
Accuracy .3448± .0714? .2833± .0571? .2708± .0487? .3510± .0660? .3510± .0649? .2958± .0403? .4719± .0602 .2896± .0589? .4479± .0739 .3865± .0646? .4729± .0597 .4719± .0602
CD Precision .2899± .0656? .2935± .0613? .2688± .0498? .3251± .0672? .3312± .0838? .2858± .0464? .4224± .0801 .2784± .0514? .4415± .0776 .3989± .0741 .4215± .0802 .4224± .0801
Recall .3161± .0782? .2701± .0621? .2646± .0743? .3400± .0827? .3228± .0766? .2878± .0581? .4713± .0685 .2592± .0672? .4481± .0686 .3743± .0647? .4742± .0694 .4713± .0685
Macro-F1 .2791± .0590? .2528± .0563? .2439± .0534? .3113± .0630? .3071± .0710? .2640± .0448? .4180± .0649 .2485± .0517? .4142± .0607 .3459± .0545? .4192± .0648 .4180± .0649
Accuracy .3201± .0229? .3345± .0205? .3249± .0187? .3643± .0164? .3808± .0192? .3407± .0232? .4269± .0191 .2353± .0185? .4157± .0215? .3725± .0205? .4392± .0170 .4327± .0202
AC Precision .3290± .0246? .3534± .0233? .3319± .0194? .3569± .0172? .3827± .0169? .3589± .0230? .4451± .0193? .2511± .0235? .4226± .0228? .4084± .0349? .4550± .0199 .4565± .0254
Recall .3167± .0225? .3258± .0188? .3238± .0228? .3615± .0187? .3714± .0180? .3326± .0223? .4171± .0191? .2300± .0195? .4091± .0225? .3648± .0228? .4311± .0174 .4274± .0213
Macro-F1 .3144± .0223? .3239± .0189? .3175± .0192? .3532± .0165? .3669± .0157? .3313± .0210? .4133± .0163? .2263± .0188? .4081± .0205? .3414± .0206? .4271± .0157 .4232± .0207
Accuracy .3072± .0323? .2806± .0396? .2367± .0344? .2811± .0455? .2222± .0484? .2606± .0401? .3656± .0459 .2022± .0394? .3517± .0359 .3700± .0434 .3633± .0398 .3511± .0435
AW Precision .2871± .0354? .2954± .0463? .2468± .0349? .2892± .0447? .2023± .0339? .3316± .0430? .3954± .0400 .2201± .0511? .3588± .0425? .4533± .0464 .4043± .0501 .3835± .0408
Recall .3137± .0390? .2906± .0479? .2691± .0296? .3145± .0342? .2180± .0415? .2757± .0365? .3909± .0497 .1986± .0332? .3686± .0463 .3954± .0464 .3812± .0483 .3728± .0513
Macro-F1 .2879± .0315? .2774± .0422? .2496± .0282? .2814± .0349? .2010± .0360? .2914± .0378? .3682± .0415 .1887± .0353? .3443± .0394 .3969± .0413 .3670± .0408 .3533± .0445
Accuracy .2938± .0615? .3021± .0499? .3115± .0551? .4094± .0631 .3719± .0617 .3312± .0573? .3865± .0539 .3177± .0651? .4229± .0638 .3823± .0613 .3958± .0483 .3906± .0553
AD Precision .3067± .0716? .2918± .0478? .2692± .0524? .3625± .0433 .2985± .0522? .3456± .0481? .3699± .0573 .3280± .0776? .4445± .0753 .4043± .0642 .4073± .0775 .3872± .0651
Recall .3143± .0653? .3193± .0517? .3121± .0663? .3955± .0604? .3604± .0418? .3277± .0517? .4162± .0697 .3621± .0732? .4399± .0775 .3869± .0553? .4337± .0549 .4287± .0632
Macro-F1 .2804± .0613? .2799± .0349? .2703± .0483? .3499± .0456 .3001± .0373? .3167± .0410? .3614± .0486 .3130± .0699? .4069± .0686 .3628± .0519 .3785± .0534 .3687± .0531
Accuracy .2493± .0142? .2678± .0210? .2692± .0192? .2997± .0175? .3012± .0210? .2695± .0227? .3081± .0205? .2509± .0204? .3027± .0186? .2200± .0168? .3127± .0187? .3172± .0190
WC Precision .2376± .0131? .2412± .0194? .2441± .0183? .2778± .0191? .2788± .0207? .2566± .0232? .2971± .0207 .2268± .0200? .2743± .0192? .2663± .0188? .2937± .0174? .2995± .0188
Recall .2510± .0134? .2580± .0168? .2589± .0146? .2917± .0166? .2937± .0186? .2621± .0220? .3021± .0192? .2374± .0166? .2948± .0155? .2179± .0157? .3061± .0162? .3110± .0162
Macro-F1 .2261± .0122? .2423± .0178? .2453± .0163? .2778± .0169? .2777± .0184? .2533± .0226? .2894± .0197? .2249± .0178? .2742± .0165? .1981± .0118? .2893± .0163? .2942± .0171
Accuracy .3106± .0223? .2873± .0202? .2924± .0216? .3382± .0239? .3469± .0222 .3148± .0271? .3465± .0232? .3040± .0241? .3493± .0215? .3328± .0193? .3399± .0207? .3531± .0217
WA Precision .2908± .0227? .2808± .0152? .2820± .0166? .3241± .0260? .3312± .0211? .3262± .0267? .3490± .0258 .2719± .0193? .3390± .0227? .3666± .0198 .3539± .0166 .3446± .0278
Recall .3140± .0221? .2920± .0157? .2980± .0153? .3486± .0190? .3552± .0190 .3206± .0210? .3516± .0204? .3109± .0210? .3540± .0173? .3315± .0199? .3427± .0183? .3583± .0199
Macro-F1 .2864± .0203? .2781± .0155? .2798± .0161? .3285± .0207? .3379± .0183 .3134± .0216? .3393± .0192 .2820± .0184? .3361± .0172 .3128± .0187? .3354± .0179 .3394± .0191
Accuracy .8271± .0478? .6823± .0363? .7010± .0559? .7094± .0466? .7156± .0516? .6823± .0405? .8260± .0396? .6906± .0628? .8438± .0391? .8063± .0444? .8396± .0371? .8719± .0434
WD Precision .8082± .0680? .6755± .0627? .6884± .0828? .6763± .0806? .7022± .0686? .6542± .0716? .8267± .0521? .7034± .0767? .8419± .0519? .8166± .0682? .8515± .0420? .8875± .0533
Recall .8316± .0597? .6690± .0377? .7113± .0578? .7128± .0488? .7199± .0561? .6736± .0544? .8239± .0585? .7121± .0787? .8474± .0594? .8261± .0481? .8456± .0484? .8825± .0660
Macro-F1 .7982± .0683? .6394± .0431? .6792± .0669? .6712± .0596? .6824± .0614? .6427± .0637? .8104± .0534? .6896± .0733? .8264± .0558? .8038± .0614? .8270± .0443? .8707± .0597
Accuracy .2839± .0208? .2775± .0164? .2769± .0176? .2831± .0157? .2947± .0155? .2799± .0222? .3250± .0216? .2581± .0221? .3226± .0225? .2456± .0160? .3272± .0219? .3315± .0222
DC Precision .2512± .0205? .2710± .0130? .2723± .0130? .2716± .0146? .2650± .0190? .2820± .0243? .3160± .0298? .2407± .0259? .3165± .0299? .2654± .0289? .3206± .0288? .3299± .0309
Recall .2711± .0150? .2666± .0155? .2663± .0164? .2717± .0139? .2838± .0122? .2746± .0174? .3103± .0212? .2408± .0193? .3054± .0199? .2316± .0137? .3132± .0213? .3192± .0227
Macro-F1 .2501± .0174? .2614± .0138? .2614± .0145? .2636± .0131? .2647± .0142? .2637± .0176? .2867± .0237? .2322± .0177? .2878± .0222? .1879± .0150? .2911± .0235? .2948± .0246
Accuracy .3120± .0243? .3031± .0172? .2983± .0187? .3155± .0248? .3297± .0251? .2955± .0201? .3455± .0235? .2925± .0196? .3715± .0212 .2826± .0182? .3519± .0214? .3608± .0228
DA Precision .2946± .0255? .2769± .0108? .2620± .0118? .2907± .0176? .3105± .0313? .3013± .0237? .3467± .0273 .2824± .0215? .3582± .0279 .2917± .0157? .3549± .0306 .3508± .0283
Recall .3181± .0183? .3067± .0145? .3017± .0144? .3198± .0176? .3354± .0205? .2999± .0154? .3471± .0218? .2963± .0196? .3725± .0203 .2850± .0121? .3528± .0200? .3620± .0212
Macro-F1 .2912± .0209? .2843± .0114? .2736± .0115? .2955± .0164? .3133± .0221? .2846± .0152? .3286± .0215? .2786± .0174? .3476± .0201 .2546± .0096? .3302± .0198? .3391± .0207
Accuracy .6394± .0490? .6111± .0390? .7539± .0365 .6961± .0473? .6650± .0366? .6256± .0387? .7600± .0435 .6389± .0561? .7672± .0365 .7078± .0312? .7461± .0387? .7667± .0390
DW Precision .7202± .0487? .6644± .0503? .7905± .0358 .7256± .0500? .7209± .0444? .6972± .0366? .7870± .0422? .6618± .0558? .8115± .0372 .8024± .0361 .7773± .0381? .8012± .0373
Recall .6504± .0533? .6147± .0427? .7543± .0395? .6912± .0552? .6616± .0448? .6256± .0499? .7634± .0457 .6286± .0637? .7695± .0413 .7085± .0329? .7513± .0390? .7746± .0348
Macro-F1 .6432± .0523? .6109± .0436? .7601± .0362 .6936± .0528? .6637± .0440? .6264± .0461? .7607± .0438 .6288± .0619? .7632± .0414 .7156± .0348? .7498± .0381? .7715± .0344
win/tie/loss 48/0/0 48/0/0 45/3/0 45/3/0 44/4/0 48/0/0 28/17/3 48/0/0 24/16/8 34/8/6 29/15/4 —
Tab.3  Four evaluation metrics (Accuracy, Precision, Recall, and Macro-F1) of different methods on Office+Caltech10 with the NN classifier. The best result in each row is in bold; a marker after a score indicates that NDTSL is significantly better than / tied with / worse than that method (pairwise one-tailed t-test at the 95% confidence level)
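The win/tie/loss rows summarize pairwise one-tailed t-tests at the 95% confidence level over repeated runs. A hypothetical SciPy sketch of one such comparison (the per-split scores below are simulated, not the paper's numbers):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical per-split accuracies of NDTSL and one baseline over 20 splits
ndtsl_acc    = rng.normal(0.53, 0.02, size=20)
baseline_acc = rng.normal(0.49, 0.02, size=20)

# One-tailed paired t-test: H1 = "NDTSL's mean score is higher"
t, p = stats.ttest_rel(ndtsl_acc, baseline_acc, alternative="greater")

if p < 0.05:
    outcome = "win"    # NDTSL significantly better
else:
    # Test the opposite direction to separate ties from losses
    _, p_loss = stats.ttest_rel(baseline_acc, ndtsl_acc, alternative="greater")
    outcome = "loss" if p_loss < 0.05 else "tie"
```

Pairing the scores split by split (rather than using an unpaired test) removes the shared variance due to the random train/test partition, which makes the comparison more sensitive.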
Dataset Measures NN TCA JDA W-BDA JPDA GFK DTSL FSSP JLRFS PAS LRRCA NDTSL
Accuracy .6297± .0144? .5314± .0143? .5591± .0170? .5728± .0113? .5530± .0095? .5814± .0092? .6740± .0094 .5743± .0146? .6699± .0121? .5615± .0151? .6746± .0125? .6769± .0113
VS Precision .4131± .0125? .3831± .0128? .3856± .0130? .3836± .0087? .3801± .0098? .4004± .0085? .4774± .0194 .3804± .0130? .4882± .0411 .4547± .0176? .4812± .0271? .4837± .0270
Recall .4584± .0303? .4078± .0450? .4064± .0442? .3937± .0272? .4189± .0408? .4681± .0361? .5908± .0496 .4198± .0296? .5332± .0619? .5890± .0347 .5808± .0634? .5830± .0636
Macro-F1 .4154± .0158? .3629± .0131? .3743± .0159? .3759± .0117? .3741± .0143? .3995± .0121? .4954± .0230 .3825± .0170? .4914± .0374 .3920± .0215? .4957± .0312? .4985± .0310
Accuracy .5469± .0193? .4446± .0158? .5086± .0177? .5171± .0139? .5528± .0130? .5297± .0156? .5716± .0143 .4810± .0191? .5698± .0164 .4063± .0185? .5709± .0143 .5735± .0142
VL Precision .4915± .0280 .2741± .0096? .3281± .0298? .3920± .0409? .3742± .0279? .3996± .0232? .4874± .0268 .3313± .0344? .5042± .0238 .4039± .0466? .4869± .0268 .4891± .0216
Recall .4314± .0318? .3168± .0175? .3581± .0361? .3734± .0387? .3863± .0311? .3993± .0386? .4762± .0294? .3338± .0363? .4789± .0267? .4361± .0327? .4759± .0294? .4861± .0307
Macro-F1 .4231± .0261? .2646± .0109? .3296± .0295? .3575± .0358? .3673± .0279? .3766± .0267? .4477± .0246 .3036± .0333? .4615± .0206 .2961± .0247? .4473± .0245 .4564± .0221
Accuracy .5341± .0133? .5219± .0142? .5422± .0159? .5414± .0148? .5339± .0157? .5498± .0156? .5541± .0154? .6321± .0167 .5470± .0119? .3681± .0114? .5575± .0153 .5614± .0143
SV Precision .5382± .0340 .4276± .0498? .4269± .0360? .4076± .0482? .4442± .0455? .5502± .0345 .5915± .0531 .6527± .0177 .5148± .1309 .4335± .0116? .5884± .0537 .5543± .0295
Recall .4387± .0139? .3967± .0098? .4171± .0111? .4011± .0112? .4020± .0121? .4313± .0152? .4237± .0113? .5629± .0188 .4052± .0076? .4448± .0123 .4269± .0108? .4474± .0135
Macro-F1 .4226± .0195? .3560± .0112? .3731± .0114? .3667± .0109? .3723± .0145? .4226± .0180? .3990± .0143? .5833± .0189 .3597± .0094? .3917± .0109? .4019± .0143? .4362± .0166
Accuracy .5128± .0144? .5044± .0213? .5169± .0165? .5367± .0157? .5361± .0180? .5214± .0147? .5724± .0156 .5052± .0167? .5652± .0208? .2672± .0105? .5719± .0164 .5739± .0169
SL Precision .3446± .0834? .2781± .0353? .2966± .0411? .3114± .0630? .3754± .0810 .3839± .0985 .4385± .1490 .2469± .0283? .4184± .1566 .2979± .0121? .4387± .1493 .4136± .1350
Recall .2705± .0201? .2800± .0275 .2597± .0167? .2619± .0159? .2909± .0175 .2912± .0279 .2842± .0133? .2524± .0185? .2910± .0209 .1894± .0203? .2840± .0135 .2907± .0153
Macro-F1 .2466± .0255? .2594± .0227? .2491± .0154? .2467± .0123? .2642± .0158? .2767± .0304 .2764± .0198 .2296± .0160? .2793± .0268 .2007± .0103? .2764± .0201 .2820± .0222
Accuracy .5478± .0126? .4606± .0147? .5118± .0134? .5348± .0153? .4733± .0148? .5452± .0147? .5479± .0141? .5247± .0124? .5355± .0162? .1850± .0109? .5448± .0145? .5568± .0141
LV Precision .5286± .0147? .3793± .0260? .4347± .0157? .4444± .0178? .4508± .0228? .4841± .0140? .5366± .0178? .4778± .0167? .3546± .0151? .3143± .0119? .5380± .0177? .5444± .0176
Recall .4947± .0100 .3302± .0118? .4365± .0144? .4085± .0087? .3988± .0124? .4806± .0134 .4740± .0132? .4791± .0156 .4281± .0106? .2866± .0080? .4647± .0135? .4813± .0159
Macro-F1 .4824± .0104 .3109± .0136? .4306± .0148? .4022± .0104? .3861± .0145? .4787± .0134 .4820± .0135 .4773± .0154 .3788± .0107? .2093± .0093? .4707± .0139? .4815± .0158
Accuracy .4402± .0118? .3980± .0151? .4512± .0186? .4700± .0131? .4499± .0122? .4684± .0134? .4899± .0152? .4672± .0154? .4337± .0121? .3864± .0150? .4902± .0144? .5034± .0136
LS Precision .3320± .0175? .2755± .0146? .3232± .0147? .3318± .0079? .3332± .0116? .3477± .0129? .3500± .0226? .3345± .0129? .3805± .0267 .3342± .0170? .3505± .0231? .3599± .0219
Recall .3553± .0483? .2590± .0293? .3213± .0353? .3383± .0266? .3642± .0609? .4003± .0412 .3659± .0399? .3281± .0328? .3389± .0487? .4401± .0280 .3661± .0393? .3963± .0357
Macro-F1 .3000± .0197 .2322± .0141? .2986± .0172? .3106± .0097? .2986± .0122? .3231± .0143? .3325± .0196? .3071± .0164? .2606± .0190? .2417± .0107? .3328± .0200? .3509± .0211
win/tie/loss 20/3/1 23/1/0 24/0/0 24/0/0 22/2/0 18/6/0 12/11/1 18/2/4 14/8/2 21/2/1 15/8/1 —
Tab.4  Four evaluation metrics (Accuracy, Precision, Recall, and Macro-F1) of different methods on VSL with the NN classifier. The best result in each row is in bold; a marker after a score indicates that NDTSL is significantly better than / tied with / worse than that method (pairwise one-tailed t-test at the 95% confidence level)
Dataset Measures SVM TCA JDA W-BDA JPDA GFK DTSL FSSP JLRFS PAS LRRCA NDTSL
Accuracy .5033± .0157? .4453± .0210? .4347± .0223? .4668± .0179? .4627± .0211? .4545± .0216? .5161± .0204? .3203± .0178? .5128± .0264? .4512± .0215? .5224± .0241? .5403± .0223
CA Precision .5302± .0226? .4382± .0304? .4298± .0287? .4757± .0258? .4742± .0310? .4640± .0238? .5435± .0204? .3718± .0263? .5262± .0271? .4923± .0280? .5408± .0249? .5560± .0240
Recall .5065± .0186? .4486± .0218? .4363± .0223? .4715± .0202? .4676± .0228? .4608± .0232? .5184± .0206? .3218± .0186? .5150± .0258? .4512± .0226? .5255± .0227? .5434± .0223
Macro-F1 .4953± .0183? .4186± .0220? .4105± .0224? .4531± .0216? .4480± .0242? .4456± .0225? .5070± .0196? .3064± .0180? .5030± .0256? .4363± .0247? .5146± .0224? .5303± .0243
Accuracy .3361± .0362? .3472± .0381? .3672± .0377? .3694± .0391? .2944± .0352? .2672± .0379? .3917± .0474? .3350± .0462? .3850± .0387? .3728± .0350? .4239± .0462? .4411± .0364
CW Precision .3803± .0318? .3162± .0542? .3126± .0458? .3487± .0510? .3124± .0268? .2980± .0443? .4414± .0400? .3862± .0498? .4136± .0508? .4431± .0345 .3952± .0404? .4746± .0579
Recall .3494± .0453? .3613± .0281? .3815± .0229? .3683± .0334? .2986± .0225? .2759± .0396? .3943± .0418? .3399± .0349? .3929± .0408? .3780± .0359? .4204± .0402? .4407± .0330
Macro-F1 .3458± .0352? .2987± .0313? .3090± .0265? .3191± .0313? .2891± .0217? .2757± .0402? .4001± .0407? .3347± .0394? .3849± .0405? .3907± .0318? .3900± .0371? .4347± .0377
Accuracy .3948± .0461? .3865± .0606? .3510± .0575? .4750± .0664? .4281± .0636? .3813± .0478? .4719± .0617? .4365± .0681? .4688± .0543? .3865± .0646? .5312± .0751 .5146± .0612
CD Precision .3509± .0592? .2755± .0516? .2453± .0453? .4079± .0742 .3662± .0739 .3139± .0531? .4324± .0836 .3503± .0566? .3969± .0525 .3989± .0741 .4404± .0695 .3824± .0497
Recall .3811± .0469? .3288± .0546? .3004± .0564? .4532± .0829 .4035± .0647? .3560± .0498? .4626± .0640 .3869± .0658? .4137± .0308? .3743± .0647? .5069± .0526 .4838± .0514
Macro-F1 .3421± .0470? .2763± .0474? .2540± .0461? .4010± .0691 .3600± .0617? .3141± .0470? .4193± .0678 .3418± .0577? .3902± .0400? .3459± .0545? .4428± .0552 .4075± .0462
Accuracy .4382± .0154? .3654± .0176? .3849± .0222? .3981± .0184? .4093± .0139? .3754± .0186? .4365± .0137? .2994± .0232? .4357± .0189? .3725± .0205? .4388± .0151? .4490± .0148
AC Precision .4812± .0172 .4094± .0190? .4028± .0204? .4186± .0186? .4471± .0142? .4225± .0211? .4494± .0172? .3310± .0295? .4742± .0237? .4084± .0349? .4530± .0181? .4844± .0235
Recall .4295± .0154? .3592± .0200? .3822± .0242? .3913± .0208? .3988± .0152? .3675± .0203? .4272± .0141? .2852± .0241? .4246± .0188? .3648± .0228? .4292± .0141? .4399± .0159
Macro-F1 .4288± .0148 .3559± .0191? .3806± .0213? .3925± .0182? .4086± .0132? .3685± .0185? .4200± .0142? .2802± .0229? .4213± .0177? .3414± .0206? .4223± .0152? .4350± .0150
Accuracy .3433± .0388? .2683± .0405? .3061± .0381? .3044± .0377? .2200± .0376? .3200± .0442? .3756± .0430? .2583± .0481? .3489± .0446? .3700± .0434? .3956± .0439? .4089± .0401
AW Precision .4072± .0475 .3081± .0624? .3136± .0345? .2823± .0359? .2651± .0468? .3804± .0523 .3790± .0520? .3347± .0459? .3509± .0472? .4533± .0464 .4039± .0494 .4088± .0468
Recall .3596± .0387? .2837± .0443? .3423± .0379? .3518± .0337? .2350± .0406? .3529± .0503? .4061± .0463? .2905± .0489? .3832± .0518? .3954± .0464? .4271± .0468? .4433± .0416
Macro-F1 .3592± .0399? .2658± .0454? .3177± .0324? .3003± .0342? .2308± .0404? .3324± .0558? .3763± .0373? .2670± .0492? .3542± .0438? .3969± .0413 .3983± .0388? .4074± .0369
Accuracy .3979± .0585? .3135± .0530? .3625± .0723? .4427± .0630 .3750± .0561? .3260± .0621? .3875± .0639? .3458± .0538? .4083± .0657 .3823± .0613? .3760± .0567? .4240± .0493
AD Precision .3959± .0930 .3181± .0503? .3180± .0582? .4368± .0856 .3152± .0359? .3409± .0626? .3635± .0785? .4329± .0413 .4075± .0567 .4043± .0642 .3996± .0738 .4055± .0651
Recall .4208± .0828? .3285± .0569? .3704± .0589? .4434± .0552 .3705± .0432? .3523± .0611? .4260± .0649? .3818± .0664? .4328± .0699? .3869± .0553? .4174± .0634? .4618± .0490
Macro-F1 .3737± .0703 .2859± .0474? .3192± .0526? .4101± .0623 .3208± .0333? .3176± .0548? .3477± .0499? .3689± .0522 .3838± .0612 .3628± .0519? .3698± .0589? .3906± .0476
Accuracy .3061± .0199? .2533± .0200? .2672± .0184? .3135± .0208? .3194± .0217? .2865± .0218? .2999± .0170? .2845± .0231? .3157± .0219? .2200± .0168? .3037± .0171? .3268± .0189
WC Precision .2851± .0243? .2554± .0247? .2625± .0208? .2763± .0361? .3029± .0217 .2983± .0396 .2917± .0169? .2549± .0226? .2887± .0196? .2663± .0188? .2871± .0162? .3053± .0163
Recall .3024± .0195? .2509± .0191? .2666± .0171? .3086± .0190? .3117± .0191? .2844± .0208? .2955± .0156? .2709± .0230? .3076± .0177? .2179± .0157? .2975± .0142? .3196± .0142
Macro-F1 .2801± .0187? .2361± .0191? .2542± .0171? .2698± .0189? .2994± .0197 .2597± .0197? .2856± .0145? .2568± .0217? .2861± .0187? .1981± .0118? .2825± .0144? .3004± .0144
Accuracy .3106± .0215? .2944± .0229? .2509± .0218? .3701± .0222 .3903± .0213 .2988± .0219? .3405± .0200? .3148± .0272? .3411± .0219? .3328± .0193? .3505± .0195? .3665± .0208
WA Precision .3345± .0194? .2940± .0187? .2460± .0202? .3455± .0189? .3888± .0231 .3177± .0223? .3501± .0225? .3025± .0272? .3294± .0229? .3666± .0198 .3583± .0250? .3772± .0264
Recall .3140± .0157? .2991± .0169? .2496± .0200? .3816± .0195 .3991± .0167 .3053± .0140? .3441± .0154? .3192± .0196? .3469± .0176? .3315± .0199? .3563± .0150? .3703± .0159
Macro-F1 .3096± .0163? .2908± .0161? .2414± .0174? .3572± .0180 .3851± .0180 .2986± .0140? .3369± .0160? .3013± .0213? .3291± .0173? .3128± .0187? .3450± .0156? .3596± .0159
Accuracy .8167± .0392? .6625± .0535? .6229± .0510? .7021± .0483? .7292± .0388? .6365± .0575? .8427± .0534? .7083± .0627? .8542± .0473? .8063± .0444? .8625± .0460 .8635± .0499
WD Precision .8203± .0688? .6582± .0684? .6297± .0740? .6819± .0892? .7280± .0543? .6350± .0970? .8480± .0729? .6890± .0848? .8664± .0645 .8166± .0682? .8731± .0562 .8718± .0637
Recall .8024± .0697? .6515± .0673? .6576± .0545? .6866± .0734? .7338± .0446? .6356± .0587? .8483± .0840? .6971± .0679? .8587± .0696? .8261± .0481? .8782± .0579 .8684± .0724
Macro-F1 .7925± .0707? .6197± .0627? .6102± .0589? .6563± .0783? .6952± .0458? .5984± .0715? .8307± .0805? .6690± .0755? .8441± .0691? .8038± .0614? .8588± .0602 .8543± .0698
Accuracy .3041± .0197? .2757± .0199? .2839± .0200? .3084± .0149? .3070± .0154? .2794± .0231? .3018± .0222? .2811± .0196? .3203± .0225? .2456± .0160? .3050± .0225? .3274± .0205
DC Precision .2931± .0276? .2457± .0198? .2443± .0182? .2530± .0161? .2559± .0197? .2815± .0407? .2940± .0262? .2813± .0279? .2970± .0251? .2654± .0289? .2941± .0259? .3161± .0270
Recall .2888± .0178? .2693± .0167? .2772± .0159? .2975± .0116? .2958± .0133? .2631± .0176? .2875± .0178? .2601± .0185? .3051± .0196? .2316± .0137? .2909± .0195? .3139± .0167
Macro-F1 .2653± .0185? .2414± .0169? .2405± .0155? .2518± .0117? .2514± .0140? .2427± .0187? .2706± .0211? .2504± .0187? .2876± .0212 .1879± .0150? .2726± .0215? .2850± .0196
Accuracy .3378± .0223? .3128± .0245? .2760± .0196? .3358± .0243? .3512± .0188? .2983± .0261? .3436± .0213? .3201± .0228? .3642± .0251 .2826± .0182? .3549± .0224? .3653± .0231
DA Precision .3561± .0326 .3163± .0408? .2359± .0111? .3129± .0201? .3220± .0153? .3215± .0377? .3412± .0314? .3092± .0248? .3494± .0284 .2917± .0157? .3631± .0302 .3552± .0279
Recall .3373± .0183? .3188± .0168? .2823± .0145? .3405± .0155? .3562± .0178 .3022± .0201? .3438± .0199? .3240± .0211? .3654± .0219 .2850± .0121? .3549± .0207? .3658± .0220
Macro-F1 .3120± .0194? .2781± .0165? .2477± .0105? .3111± .0166? .3333± .0153? .2803± .0201? .3219± .0210? .3076± .0215? .3418± .0227 .2546± .0096? .3354± .0202? .3430± .0208
Accuracy .6711± .0492? .5172± .0432? .5783± .0455? .6511± .0446? .6339± .0372? .4700± .0486? .7344± .0419? .6928± .0439? .7639± .0371 .7078± .0312? .7417± .0395? .7606± .0384
DW Precision .7521± .0426? .5582± .0894? .6184± .0451? .7133± .0513? .6811± .0379? .5669± .0661? .7720± .0378? .6913± .0539? .8090± .0376 .8024± .0361 .7761± .0379? .7977± .0368
Recall .6738± .0489? .5181± .0368? .5757± .0423? .6489± .0455? .6563± .0417? .4780± .0462? .7383± .0449? .6933± .0423? .7669± .0411 .7085± .0329? .7429± .0412? .7669± .0371
Macro-F1 .6720± .0519? .4632± .0363? .5617± .0385? .6524± .0473? .6430± .0417? .4557± .0473? .7367± .0405? .6773± .0473? .7602± .0411 .7156± .0348? .7452± .0378? .7641± .0357
win/tie/loss 42/6/0 48/0/0 48/0/0 38/9/1 40/5/3 46/2/0 45/2/1 46/2/0 34/13/1 41/6/1 37/7/4 —
Tab.5  Four evaluation metrics (Accuracy, Precision, Recall, and Macro-F1) of different methods on Office+Caltech10 with the SVM classifier. The best result in each row is in bold; a marker after a score indicates that NDTSL is significantly better than / tied with / worse than that method (pairwise one-tailed t-test at the 95% confidence level)
Dataset Measures NN TCA JDA W-BDA JPDA GFK DTSL FSSP JLRFS PAS LRRCA NDTSL
Accuracy .6888± .0117 .6426± .0131? .7013± .0133? .6804± .0126? .6781± .0148? .6936± .0127? .7140± .0111? .6822± .0132? .7199± .0132 .5615± .0151? .7170± .0096? .7209± .0105
VS Precision .5022± .0392? .4533± .0096? .4612± .0143? .4507± .0150? .4644± .0205? .4907± .0250? .5696± .0616 .5036± .0948? .5718± .0866 .4547± .0176? .5441± .0505? .5714± .0566
Recall .5410± .0680? .5638± .0260 .5336± .0324? .4543± .0232? .5580± .0478 .5881± .0356 .5896± .0561 .5614± .0479 .5402± .0652? .5890± .0347 .5982± .0552 .5785± .0623
Macro-F1 .4980± .0355? .4286± .0121? .4661± .0191? .4408± .0155? .4675± .0223? .4812± .0216? .5355± .0385? .4692± .0308? .5305± .0499 .3920± .0215? .5323± .0351? .5440± .0349
Accuracy .5825± .0136? .5595± .0140? .5857± .0200? .6109± .0108? .6471± .0129 .5937± .0148? .6209± .0162? .6074± .0157? .6207± .0109? .4063± .0185? .6335± .0139? .6382± .0134
VL Precision .5613± .0352? .2974± .0068? .3157± .0175? .3442± .0149? .3918± .0309? .4168± .0388? .2512± .0062? .4970± .1010? .6971± .1001 .4039± .0466? .5623± .0955? .6005± .0717
Recall .4781± .0305 .3931± .0147? .3717± .0207? .4159± .0184 .4138± .0201 .4207± .0249 .2706± .0061? .3979± .0243? .4056± .0243 .4361± .0327 .3121± .0146? .4158± .0331
Macro-F1 .4719± .0272 .2920± .0089? .3241± .0184? .3661± .0154? .3978± .0200? .3975± .0210? .2591± .0062? .4110± .0255? .4509± .0287 .2961± .0247? .3284± .0236? .4526± .0371
Accuracy .5650± .0133? .6118± .0135? .6227± .0147 .6071± .0129? .6138± .0150? .5848± .0147? .5845± .0142? .6366± .0152 .5906± .0117? .3681± .0114? .5846± .0143? .6230± .0153
SV Precision .5058± .1321? .3782± .0082? .3919± .0086? .3720± .0084? .3978± .0081? .4067± .0073? .3969± .0079? .4287± .0071? .6595± .1049 .4335± .0116? .3965± .0083? .6723± .0340
Recall .4176± .0084? .4551± .0084? .4594± .0094? .4413± .0086? .4427± .0091? .4062± .0090? .4104± .0083? .4505± .0074? .4517± .0104? .4448± .0123? .4107± .0084? .4858± .0157
Macro-F1 .3791± .0105? .4109± .0074? .4205± .0083? .4009± .0074? .4138± .0080? .3883± .0082? .3869± .0081? .4299± .0071? .4321± .0132? .3917± .0109? .3871± .0083? .4955± .0191
Accuracy .5774± .0146? .5435± .0157? .5765± .0156? .5951± .0154 .4606± .0152? .5558± .0166? .5744± .0142? .5231± .0150? .5764± .0142? .2672± .0105? .5805± .0154 .5821± .0128
SL Precision .3839± .1248? .2637± .0065? .2797± .0210? .2762± .0186? .2437± .0073? .2520± .0132? .3889± .1056 .2623± .0212? .4315± .1334 .2979± .0121? .3925± .1089 .4101± .1553
Recall .2831± .0145 .2826± .0123 .2678± .0106? .2737± .0100 .2330± .0112? .2568± .0104? .2846± .0141 .2447± .0085? .2838± .0153 .1894± .0203? .2864± .0155 .2818± .0238
Macro-F1 .2738± .0202? .2598± .0064? .2540± .0141? .2614± .0125? .2251± .0063? .2403± .0125? .2755± .0208 .2207± .0120? .2767± .0233 .2007± .0103? .2801± .0230 .2784± .0336
Accuracy .5677± .0120? .4375± .0121? .5376± .0159? .5874± .0141? .5599± .0136? .5683± .0160? .6196± .0138? .6669± .0148 .6000± .0124? .1850± .0109? .6206± .0144? .6497± .0150
LV Precision .5673± .0237? .1734± .0050? .2145± .0055? .3555± .0112? .3194± .0102? .3189± .0093? .6705± .0928 .5738± .0110? .5718± .0164? .3143± .0119? .6534± .0915 .6507± .0303
Recall .4635± .0118? .2776± .0058? .3293± .0042? .4286± .0107? .4320± .0086? .4585± .0096? .4706± .0096? .5625± .0133 .5533± .0137 .2866± .0080? .4726± .0099? .5438± .0205
Macro-F1 .4437± .0143? .2077± .0046? .2597± .0046? .3812± .0100? .3617± .0084? .3720± .0094? .4298± .0098? .5570± .0121 .5490± .0137 .2093± .0093? .4355± .0113? .5513± .0237
Accuracy .4709± .0149? .4656± .0147? .4736± .0143? .4638± .0145? .4855± .0144? .4565± .0150? .4872± .0107? .4568± .0154? .5065± .0133? .3864± .0150? .4877± .0112? .5279± .0113
LS Precision .4751± .0959? .1857± .0063? .1882± .0056? .2217± .0367? .1984± .0061? .2056± .0173? .3900± .0243? .2420± .0074? .4267± .0534? .3342± .0170? .3898± .0241? .4862± .0889
Recall .3752± .0445? .2677± .0059? .2835± .0058? .2970± .0320? .2824± .0070? .3477± .0523? .4530± .0475? .2539± .0044? .4349± .0423 .4401± .0280 .4531± .0474 .4301± .0430
Macro-F1 .3238± .0338? .2153± .0060? .2260± .0056? .2441± .0283? .2294± .0063? .2509± .0203? .3828± .0247? .2082± .0063? .3890± .0265 .2417± .0107? .3830± .0246? .4040± .0321
win/tie/loss 20/2/2 22/2/0 23/1/0 21/2/1 22/2/0 22/2/0 18/6/0 19/2/3 10/12/2 21/2/1 17/5/2 —
Tab.6  Four evaluation metrics (Accuracy, Precision, Recall, and Macro-F1) of different methods on VSL with the SVM classifier. The best result in each row is in bold; a marker after a score indicates that NDTSL is significantly better than / tied with / worse than that method (pairwise one-tailed t-test at the 95% confidence level)
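The four metrics in Tabs. 3−6 extend to multi-class tasks by macro-averaging over classes. A small self-contained sketch of one common definition (per-class precision/recall averaged; macro-F1 taken as the mean of per-class F1 scores):

```python
import numpy as np

def macro_metrics(y_true, y_pred, n_classes):
    """Accuracy plus macro-averaged precision, recall, and F1."""
    precisions, recalls, f1s = [], [], []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        f = 2 * p * r / (p + r) if p + r else 0.0
        precisions.append(p); recalls.append(r); f1s.append(f)
    acc = np.mean(y_true == y_pred)
    return acc, np.mean(precisions), np.mean(recalls), np.mean(f1s)

y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])
acc, prec, rec, f1 = macro_metrics(y_true, y_pred, n_classes=3)
# acc = 4/6; per-class F1 = (0.5, 0.8, 2/3), so macro-F1 ≈ 0.6556
```

Macro-averaging weights every class equally, so it is more informative than plain accuracy when the class distribution across domains is imbalanced.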
Fig.2  The curves of four evaluation metrics for different methods on the transfer tasks A→W, A→D, and V→L with the SVM classifier. (a)−(d) A→W; (e)−(h) A→D; (i)−(l) V→L
Fig.3  The curves of four evaluation metrics for different methods on the transfer task V→L with the NN classifier. (a) accuracy; (b) precision; (c) recall; (d) macro-F1
Fig.4  The convergence of objective function values and classification accuracy curves on four transfer tasks. (a) A→C; (b) D→A; (c) S→V; (d) L→S
Dataset FSSP DTSL JLRFS PAS LRRCA NDTSL
A→C 1.98 2.47 5.01 15.96 3.27 3.07
D→A 0.55 0.39 0.73 11.15 0.70 0.52
S→V 79.79 116.02 240.07 719.39 206.59 146.27
L→S 58.63 83.20 166.24 617.51 167.34 109.68
Tab.7  The computation cost of transfer subspace learning methods on four transfer tasks by NN classifier
Fig.5  The influence of the hyperparameters α and β on accuracy for four transfer tasks with the NN classifier. (a) A→C; (b) D→A; (c) S→V; (d) L→S
Fig.6  Accuracy curves as functions of the hyperparameters λ and p on four transfer tasks with the NN classifier. (a) λ; (b) p
Dataset CA CW CD AC AW AD
Low-rank .5280±.0226 .3961±.0463 .4719±.0617 .4380±.0140 .3867±.0438 .3688±.0570
Strict label .5229±.0221 .4139±.0332 .4594±.0571 .4280±.0153 .3694±.0401 .3854±.0543
NDTSL .5403±.0223 .4411±.0364 .5146±.0612 .4490±.0148 .4089±.0401 .4240±.0493
Dataset WC WA WD DC DA DW
Low-rank .3111±.0200 .3497±.0192 .8583±.0486 .3077±.0216 .3564±.0227 .7383±.0448
Strict label .3034±.0198 .3589±.0199 .8240±.0490 .3155±.0227 .3608±.0207 .7456±.0395
NDTSL .3268±.0189 .3665±.0208 .8635±.0499 .3274±.0205 .3653±.0231 .7606±.0384
Tab.8  Accuracy of NDTSL and its ablation variants on Office+Caltech10 with the SVM classifier
  
  
1 Margolis A. A literature review of domain adaptation with unlabeled data. Washington: University of Washington, 2011, 1−42
2 You K, Long M, Cao Z, Wang J, Jordan M I. Universal domain adaptation. In: Proceedings of IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2019, 2715−2724
3 Kouw W M, Loog M. A review of domain adaptation without target labels. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 43(3): 766−785
4 Farahani A, Voghoei S, Rasheed K, Arabnia H R. A brief review of domain adaptation. In: Stahlbock R, Weiss G M, Abou-Nasr M, Yang C Y, Arabnia H R, Deligiannidis L, eds. Advances in Data Science and Information Engineering. Cham: Springer, 2021, 877−894
5 Patel V M, Gopalan R, Li R, Chellappa R. Visual domain adaptation: a survey of recent advances. IEEE Signal Processing Magazine, 2015, 32(3): 53−69
6 Csurka G. Domain Adaptation in Computer Vision Applications. Cham: Springer, 2017
7 Jiang J. Domain adaptation in natural language processing. Dissertation, University of Illinois at Urbana-Champaign, 2008
8 Perone C S, Ballester P, Barros R C, Cohen-Adad J. Unsupervised domain adaptation for medical imaging segmentation with self-ensembling. NeuroImage, 2019, 194: 1−11
9 Zhang Y, Wei Y, Wu Q, Zhao P, Niu S, Huang J, Tan M. Collaborative unsupervised domain adaptation for medical image diagnosis. IEEE Transactions on Image Processing, 2020, 29: 7834−7844
10 Guan H, Liu M. Domain adaptation for medical image analysis: a survey. IEEE Transactions on Biomedical Engineering, 2022, 69(3): 1173−1185
11 Pan S J, Tsang I W, Kwok J T, Yang Q. Domain adaptation via transfer component analysis. IEEE Transactions on Neural Networks, 2011, 22(2): 199−210
12 Long M, Wang J, Ding G, Sun J, Yu P S. Transfer feature learning with joint distribution adaptation. In: Proceedings of the IEEE International Conference on Computer Vision. 2013, 2200−2207
13 Wang J, Chen Y, Hao S, Feng W, Shen Z. Balanced distribution adaptation for transfer learning. In: Proceedings of the IEEE International Conference on Data Mining. 2017, 1129−1134
14 Zhang W, Wu D. Discriminative joint probability maximum mean discrepancy (DJP-MMD) for domain adaptation. In: Proceedings of IEEE International Joint Conference on Neural Networks. 2020, 1−8
15 Wang W, Li H, Ding Z, Nie F, Chen J, Dong X, Wang Z. Rethinking maximum mean discrepancy for visual domain adaptation. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(1): 264−277
16 Gretton A, Borgwardt K M, Rasch M J, Schölkopf B, Smola A. A kernel two-sample test. The Journal of Machine Learning Research, 2012, 13: 723−773
17 Fernando B, Habrard A, Sebban M, Tuytelaars T. Unsupervised visual domain adaptation using subspace alignment. In: Proceedings of IEEE International Conference on Computer Vision. 2013, 2960−2967
18 Sun B, Saenko K. Subspace distribution alignment for unsupervised domain adaptation. In: Proceedings of the British Machine Vision Conference. 2015, 24.1−24.10
19 Sun B, Feng J, Saenko K. Return of frustratingly easy domain adaptation. In: Proceedings of the 30th AAAI Conference on Artificial Intelligence. 2016, 2058−2065
20 Gopalan R, Li R, Chellappa R. Domain adaptation for object recognition: an unsupervised approach. In: Proceedings of IEEE International Conference on Computer Vision. 2011, 999−1006
21 Gong B, Shi Y, Sha F, Grauman K. Geodesic flow kernel for unsupervised domain adaptation. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition. 2012, 2066−2073
22 Shao M, Kit D, Fu Y. Generalized transfer subspace learning through low-rank constraint. International Journal of Computer Vision, 2014, 109(1−2): 74−93
23 Xu Y, Fang X, Wu J, Li X, Zhang D. Discriminative transfer subspace learning via low-rank and sparse representation. IEEE Transactions on Image Processing, 2016, 25(2): 850−863
24 Li J, Zhao J, Lu K. Joint feature selection and structure preservation for domain adaptation. In: Proceedings of the 25th International Joint Conference on Artificial Intelligence. 2016, 1697−1703
25 Lin Z, Zhao Z, Luo T, Yang W, Zhang Y, Tang Y. Non-convex transfer subspace learning for unsupervised domain adaptation. In: Proceedings of the IEEE International Conference on Multimedia and Expo. 2019, 1468−1473
26 Yang L, Zhou Q. Transfer subspace learning joint low-rank representation and feature selection. Multimedia Tools and Applications, 2022, 81(27): 38353−38373
27 Li W, Chen S. Unsupervised domain adaptation with progressive adaptation of subspaces. Pattern Recognition, 2022, 132: 108918
28 Razzaghi P, Razzaghi P, Abbasi K. Transfer subspace learning via low-rank and discriminative reconstruction matrix. Knowledge-Based Systems, 2019, 163: 174−185
29 Xiao T, Liu P, Zhao W, Liu H, Tang X. Structure preservation and distribution alignment in discriminative transfer subspace learning. Neurocomputing, 2019, 337: 218−234
30 Xia H, Jing T, Ding Z. Maximum structural generation discrepancy for unsupervised domain adaptation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(3): 3434−3445
31 Madadi Y, Seydi V, Hosseini R. Multi-source domain adaptation-based low-rank representation and correlation alignment. International Journal of Computers and Applications, 2022, 44(7): 670−677
32 Yang L, Lu B, Zhou Q, Su P. Unsupervised domain adaptation via re-weighted transfer subspace learning with inter-class sparsity. Knowledge-Based Systems, 2023, 263: 110277
33 Liu G, Lin Z, Yan S, Sun J, Yu Y, Ma Y. Robust recovery of subspace structures by low-rank representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(1): 171−184
34 Fazel M, Hindi H, Boyd S P. A rank minimization heuristic with application to minimum order system approximation. In: Proceedings of IEEE American Control Conference. 2001, 4734−4739
35 Fang X, Xu Y, Li X, Lai Z, Wong W K, Fang B. Regularized label relaxation linear regression. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(4): 1006−1018
36 Wang Y, Yin W, Zeng J. Global convergence of ADMM in nonconvex nonsmooth optimization. Journal of Scientific Computing, 2019, 78(1): 29−63
37 Nie F, Wang H, Cai X, Huang H, Ding C. Robust matrix completion via joint Schatten p-norm and lp-norm minimization. In: Proceedings of the 12th IEEE International Conference on Data Mining. 2012, 566−574
38 Lin Z, Chen M, Wu L, Ma Y. The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices. Urbana: Coordinated Science Laboratory, 2009
39 Saenko K, Kulis B, Fritz M, Darrell T. Adapting visual category models to new domains. In: Proceedings of the 11th European Conference on Computer Vision. 2010, 213−226
40 Griffin G, Holub A, Perona P. Caltech-256 object category dataset. Pasadena: California Institute of Technology, 2007
41 Everingham M, Van Gool L, Williams C K I, Winn J, Zisserman A. The PASCAL visual object classes (VOC) challenge. International Journal of Computer Vision, 2010, 88(2): 303−338
42 Russell B C, Torralba A, Murphy K P, Freeman W T. LabelMe: a database and web-based tool for image annotation. International Journal of Computer Vision, 2008, 77(1−3): 157−173
43 Choi M J, Lim J J, Torralba A, Willsky A S. Exploiting hierarchical context on a large database of object categories. In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition. 2010, 129−136
44 Bay H, Tuytelaars T, Van Gool L. SURF: speeded up robust features. In: Proceedings of the 9th European Conference on Computer Vision. 2006, 404−417
45 Donahue J, Jia Y, Vinyals O, Hoffman J, Zhang N, Tzeng E, Darrell T. DeCAF: a deep convolutional activation feature for generic visual recognition. In: Proceedings of the 31st International Conference on Machine Learning. 2014, I-647−I-655