Front. Comput. Sci.    2024, Vol. 18 Issue (6) : 186352    https://doi.org/10.1007/s11704-024-40110-9
Artificial Intelligence
ISM: intra-class similarity mixing for time series augmentation
Pin LIU1,2, Rui WANG2(), Yongqiang HE1, Yuzhu WANG1
1. School of Information Engineering, China University of Geosciences, Beijing 100083, China
2. State Key Lab of Software Development Environment, Beihang University, Beijing 100191, China
Corresponding Author(s): Rui WANG   
Just Accepted Date: 13 May 2024   Issue Date: 07 June 2024
Fig.1  The overall pipeline of ISM: two original time series are mixed to produce an augmented time series
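Fig.1 suggests that ISM builds each augmented sample by mixing two original series of the same class. The snippet below is a minimal, hypothetical sketch of that idea rather than the paper's exact algorithm: it pairs each series with its most similar same-class neighbour (plain Euclidean distance here; ISM's similarity measure, pairing rule, and mixing weights may differ) and takes a convex combination of the pair.

```python
import numpy as np

def ism_augment(X, y, lam=0.5):
    """Hypothetical sketch of intra-class similarity mixing.

    X: array of shape (n_samples, series_length); y: class labels.
    Each series is mixed with its most similar same-class neighbour
    (Euclidean distance here; ISM's similarity measure may differ).
    """
    X_aug, y_aug = [], []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Xc = X[idx]
        # pairwise distances within the class, ignoring self-pairs
        d = np.linalg.norm(Xc[:, None, :] - Xc[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        nearest = d.argmin(axis=1)
        # convex combination of each series and its nearest same-class neighbour
        X_aug.append(lam * Xc + (1.0 - lam) * Xc[nearest])
        y_aug.append(np.full(len(idx), c))
    return np.concatenate(X_aug), np.concatenate(y_aug)
```

The augmented set produced this way would typically be concatenated with the original training data before fitting the FCN or ResNet classifier.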
Dataset       None          DTWmerge      wDBA          DGW-sD        ISM
              F     R       F     R       F     R       F     R       F     R
Wine          58.7  74.0    81.5  77.8    84.0  92.6    73.2  88.1    88.9  94.9
Car           90.5  92.5    67.5  78.3    90.7  92.7    92.3  93.8    95.0  95.3
DistalPT.     69.0  66.5    69.8  68.4    73.4  69.3    72.0  69.0    76.3  71.9
Lightning7    82.7  84.5    78.8  80.8    89.7  86.5    83.6  84.9    89.9  89.0
Yoga          82.7  87.0    71.4  82.7    79.0  87.4    81.8  88.0    88.2  89.1
WormsTC.      62.5  74.7    79.2  79.2    80.1  76.4    80.4  80.3    83.1  87.8
WordS.        46.4  52.2    37.5  52.0    47.4  54.0    49.1  57.4    51.4  60.7
Computers     72.6  81.5    80.6  79.6    84.0  72.4    71.6  84.0    86.4  86.6
Mallat        96.7  96.6    75.8  91.6    94.8  92.4    95.7  94.6    98.9  97.6
FordA         90.4  93.4    90.3  93.1    90.0  93.5    87.1  92.8    91.6  94.7
Average       75.2  80.3    73.2  78.4    81.3  81.7    78.7  83.3    85.0  86.8
Tab.1  Classification performance (mean accuracy, %). F and R denote the FCN and ResNet backbones, respectively
Backbone   Method    WordS.    FordA      Mallat    Average
FCN        None      2.1       9.2        3.8       5.0
           wDBA      1819.0    12542.5    2781.2    5714.2
           DGW-sD    105.2     969.6      204.0     426.3
           ISM       61.1      467.9      131.8     220.2
ResNet     None      37.2      120.5      56.5      71.4
           wDBA      1832.6    12530.2    2786.8    5716.5
           DGW-sD    121.1     935.4      221.6     426.0
           ISM       62.3      478.6      134.2     225.0
Tab.2  Time consumption (in minutes) on representative datasets
Fig.2  The influence of batch size on ISM’s performance. fcn_sm and resnet_sm denote the FCN and ResNet classifiers trained with ISM data augmentation, respectively. (a) FordA; (b) WormsTC.
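Fig.2 varies only the training batch size while keeping the ISM-augmented data fixed. The sketch below is purely illustrative of such a sweep: the tiny 1D-CNN is a stand-in for the FCN/ResNet backbones, every hyperparameter is a placeholder rather than the paper's setup, and labels are assumed to be integer-encoded from 0.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def train_with_batch_size(X_aug, y_aug, batch_size, epochs=5):
    """Train a toy 1D-CNN on ISM-augmented series for one batch size."""
    n_classes = int(y_aug.max()) + 1
    model = nn.Sequential(                      # stand-in for FCN/ResNet backbones
        nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
        nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, n_classes),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    ds = TensorDataset(torch.as_tensor(X_aug, dtype=torch.float32).unsqueeze(1),
                       torch.as_tensor(y_aug, dtype=torch.long))
    for _ in range(epochs):
        for xb, yb in DataLoader(ds, batch_size=batch_size, shuffle=True):
            opt.zero_grad()
            loss_fn(model(xb), yb).backward()
            opt.step()
    return model

# Sweep batch sizes as in Fig.2 (values illustrative):
# for bs in (16, 32, 64, 128):
#     train_with_batch_size(X_aug, y_aug, batch_size=bs)
```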