Frontiers of Computer Science  2022, Vol. 16 Issue (6): 166349   https://doi.org/10.1007/s11704-022-1765-6
Non-salient region erasure for time series augmentation
Pin LIU1,2,3, Xiaohui GUO1,2,3(), Bin SHI4, Rui WANG1,2,3, Tianyu WO1,2,3, Xudong LIU1,2,3
1. State Key Lab of Software Development Environment (SKLSDE), Beihang University, Beijing 100191, China
2. Beijing Advanced Innovation Center for Big Data and Brain Computing (BDBC), Beijing 100191, China
3. Hangzhou Innovation Institute, Beihang University, Hangzhou 310052, China
4. School of Computer Science and Technology, Xi’an Jiaotong University, Xi’an 710049, China
Received: 2021-12-20      Published online: 2022-04-27
Corresponding Author(s): Xiaohui GUO   
Cite this article:
Pin LIU, Xiaohui GUO, Bin SHI, Rui WANG, Tianyu WO, Xudong LIU. Non-salient region erasure for time series augmentation. Front. Comput. Sci., 2022, 16(6): 166349.
Link to this article:
https://academic.hep.com.cn/fcs/CN/10.1007/s11704-022-1765-6
https://academic.hep.com.cn/fcs/CN/Y2022/V16/I6/166349
Tab.1
Dataset       None           wDBA           RGW-sD         SeaM
Adiac         84.78 (0.38)   84.53 (1.92)   75.06 (0.38)   86.13 (0.83)
ArrowHead     84.30 (1.50)   88.00 (2.00)   86.00 (0.29)   89.57 (0.71)
DiatomSR.     31.30 (3.60)   76.14 (0.00)   91.01 (2.45)   97.30 (0.41)
PowerCons     85.06 (0.24)   86.39 (2.50)   86.67 (0.28)   89.03 (0.97)
FordA         90.40 (0.20)   90.00 (1.03)   87.12 (2.00)   91.99 (0.28)
Worms         76.50 (2.20)   80.58 (3.50)   79.06 (0.30)   81.17 (0.65)
Lightning7    82.70 (2.30)   89.65 (5.00)   83.56 (0.00)   90.07 (3.08)
Earthquakes   72.70 (1.70)   72.66 (1.28)   74.10 (3.33)   74.10 (1.44)
ElectricD.    70.02 (1.20)   71.66 (1.04)   71.97 (2.06)   74.49 (2.26)
CROP          69.74 (3.11)   71.51 (1.52)   70.55 (1.71)   72.55 (3.24)
Average       74.75 (1.64)   81.11 (2.00)   80.51 (1.28)   84.64 (1.39)
Tab.2
Dataset       None           wDBA           RGW-sD         SeaM
Adiac         82.90 (0.60)   83.63 (1.53)   75.70 (0.77)   84.14 (1.02)
ArrowHead     84.50 (1.20)   86.00 (0.86)   88.57 (1.14)   89.43 (0.86)
DiatomSR.     30.10 (0.20)   80.07 (0.33)   97.55 (0.49)   98.12 (0.25)
PowerCons     88.12 (1.60)   86.67 (1.67)   90.00 (1.67)   90.00 (0.56)
FordA         92.00 (0.40)   93.45 (0.05)   92.83 (1.20)   93.54 (0.32)
Worms         79.10 (2.50)   78.06 (6.75)   71.43 (3.75)   80.84 (2.27)
Lightning7    84.50 (2.00)   86.47 (2.00)   84.93 (2.74)   86.99 (0.68)
Earthquakes   71.20 (2.00)   74.82 (0.25)   73.38 (0.10)   71.76 (3.06)
ElectricD.    72.90 (0.90)   73.56 (0.62)   73.17 (1.03)   75.62 (0.85)
CROP          68.52 (2.33)   69.15 (2.15)   71.74 (1.62)   74.23 (2.03)
Average       75.38 (1.37)   81.19 (1.62)   81.93 (1.45)   84.47 (1.19)
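
The erasure idea named in the title can be illustrated with a minimal sketch: given per-timestep saliency scores from some external model (an assumption made here purely for illustration; the paper's own saliency mechanism is not described on this page), erase the contiguous window with the lowest total saliency so the augmented copy keeps its class-discriminative regions. The helper name, window length, and fill strategy below are hypothetical choices and do not reproduce the SeaM method reported in the tables above.

# Illustrative sketch only; NOT the paper's SeaM algorithm.
# Assumes per-timestep saliency scores are supplied by an external model
# (e.g., some attribution method); here they are just an input array.
import numpy as np

def erase_non_salient(series: np.ndarray,
                      saliency: np.ndarray,
                      window: int = 16,
                      fill: str = "mean") -> np.ndarray:
    """Return a copy of `series` with its least-salient window erased.

    series   : 1-D array of shape (T,), the time series to augment
    saliency : 1-D array of shape (T,), higher = more important
    window   : length of the segment to erase (hypothetical default)
    fill     : "mean" -> replace with the series mean, otherwise zeros
    """
    T = len(series)
    window = min(window, T)
    # Sliding-window sum of saliency; the window with the smallest total
    # saliency is treated as the "non-salient region" to be erased.
    sums = np.convolve(saliency, np.ones(window), mode="valid")  # shape (T - window + 1,)
    start = int(np.argmin(sums))
    out = series.copy()
    out[start:start + window] = out.mean() if fill == "mean" else 0.0
    return out

if __name__ == "__main__":
    # Toy usage: augment a noisy sine wave, using local variation as a
    # stand-in saliency signal (again, an assumption for illustration).
    rng = np.random.default_rng(0)
    x = np.sin(np.linspace(0, 6 * np.pi, 128)) + 0.1 * rng.standard_normal(128)
    s = np.abs(np.gradient(x))
    x_aug = erase_non_salient(x, s, window=16)
    print(x.shape, x_aug.shape)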