Frontiers of Computer Science

ISSN 2095-2228

ISSN 2095-2236(Online)

CN 10-1014/TP

Postal Subscription Code 80-970

2018 Impact Factor: 1.129

Front. Comput. Sci.    2023, Vol. 17 Issue (6) : 176340    https://doi.org/10.1007/s11704-022-2256-5
Artificial Intelligence
Aspect-level sentiment analysis based on semantic heterogeneous graph convolutional network
Yufei ZENG1, Zhixin LI1(), Zhenbin CHEN1, Huifang MA2
1. Guangxi Key Lab of Multi-source Information Mining and Security, Guangxi Normal University, Guilin 541004, China
2. College of Computer Science and Engineering, Northwest Normal University, Lanzhou 730070, China
Abstract

Deep learning methods based on the syntactic dependency tree have achieved great success in aspect-based sentiment analysis (ABSA). However, the accuracy of the dependency parser cannot be guaranteed, which may leave aspect words far from their related opinion words in the dependency tree. Moreover, few models incorporate external affective knowledge for ABSA. To address these two limitations, we propose a novel architecture that also fills the gap in applying heterogeneous graph convolutional networks to ABSA. Specifically, we employ affective knowledge as a sentiment node to augment the representation of words. Then, each sentiment node, which has different attributes from the word nodes, is linked to word nodes through a specific type of edge to form a heterogeneous graph based on the dependency tree. Finally, we design a multi-level semantic heterogeneous graph convolutional network (Semantic-HGCN) to encode the heterogeneous graph for sentiment prediction. Extensive experiments are conducted on the SemEval 2014 Task 4, SemEval 2015 Task 12, SemEval 2016 Task 5, and ACL 14 Twitter datasets. The experimental results show that our method achieves state-of-the-art performance.
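The graph construction described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the toy dependency arcs and the one-entry affective lexicon are assumptions, and the `SENT::` node-naming convention is invented here purely to keep the two node types distinct.

```python
# Sketch: word nodes are connected along dependency-tree arcs, and each
# word covered by an affective lexicon gains an extra sentiment node
# linked by a bidirectional edge, yielding a heterogeneous graph.

def build_heterogeneous_graph(words, dep_arcs, lexicon):
    """Return {node: set(neighbours)} with bidirectional edges."""
    graph = {w: set() for w in words}
    # word <-> word edges from the dependency tree
    for head, dep in dep_arcs:
        graph[head].add(dep)
        graph[dep].add(head)
    # word <-> sentiment edges for words covered by the lexicon
    for w in words:
        if w in lexicon:
            s = f"SENT::{w}"          # sentiment node: a distinct node type
            graph.setdefault(s, set())
            graph[w].add(s)
            graph[s].add(w)
    return graph

words = ["the", "food", "is", "horrible"]
arcs = [("food", "the"), ("is", "food"), ("is", "horrible")]  # toy parse
lexicon = {"horrible": -0.93}   # polarity value in the style of SenticNet
g = build_heterogeneous_graph(words, arcs, lexicon)
```

In the paper the sentiment nodes carry their own embeddings and edge types; here the dictionary only records the graph topology.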

Keywords: heterogeneous graph convolution network; multi-head attention network; aspect-based sentiment analysis; deep learning; affective knowledge
Corresponding Author(s): Zhixin LI   
Just Accepted Date: 12 October 2022   Issue Date: 17 January 2023
 Cite this article:   
Yufei ZENG, Zhixin LI, Zhenbin CHEN, et al. Aspect-level sentiment analysis based on semantic heterogeneous graph convolutional network[J]. Front. Comput. Sci., 2023, 17(6): 176340.
 URL:  
https://academic.hep.com.cn/fcs/EN/10.1007/s11704-022-2256-5
https://academic.hep.com.cn/fcs/EN/Y2023/V17/I6/176340
Fig.1  Overall framework, where Att represents the computation of attention, HGCN represents several layers of the heterogeneous graph convolutional network, and h^asp represents the aspect words extracted from the text sequence
Fig.2  A sketch of the SenticNet 5 semantic network with its three-level knowledge representation
| Concepts | Polarity_label | Polarity_value | Semantics |
| --- | --- | --- | --- |
| Abandonment | Negative | −0.82 | Weary, need_break... |
| Bleed | Negative | −0.75 | Hurt, suffer... |
| Declared | Positive | 0.726 | Stated, explicit... |
| Horrible | Negative | −0.93 | Dread, distressed... |

Tab.1  Example of SenticNet 5 commonsense concepts with affective information
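Entries like those in Tab.1 can back a simple polarity lookup. The dictionary below reproduces the four sample concepts from the table; the lowercased keys and the neutral fallback for unknown concepts are assumptions of this sketch, not behaviour documented for SenticNet 5.

```python
# Toy SenticNet-style lookup: concept -> (polarity label, polarity value).
SENTICNET_SAMPLE = {
    "abandonment": ("negative", -0.82),
    "bleed":       ("negative", -0.75),
    "declared":    ("positive",  0.726),
    "horrible":    ("negative", -0.93),
}

def polarity(concept, default=("neutral", 0.0)):
    """Return the (label, value) pair for a concept, or a neutral default."""
    return SENTICNET_SAMPLE.get(concept.lower(), default)
```

In the model, such values would seed the sentiment-node embeddings attached to the matching word nodes.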
Fig.3  Construction of the semantic heterogeneous graph, in which the edges between word nodes, and between word nodes and sentiment nodes, are bidirectional in both the isomorphic and the heterogeneous graph
| Symbol | Specific meaning of the symbol |
| --- | --- |
| O | Random matrix used when SenticNet 5 performs mapping |
| X | Random matrix provided by SenticNet 5 |
| S ∈ R^{n×d_e} | Embedded text matrix |
| S_s ∈ R^{n×d_s} | Sentiment matrix corresponding to the text |
| e ∈ R^{d_e} | Embedding vector of a word |
| e_s ∈ R^{d_s} | Sentiment embedding vector of a word |
| a, d | Relevant dimensions |
| w | Words in sentences |
| n, t | Length of the sentence; length of the aspect term |
| b | Constant associated with the loss function |
| h | Hidden states in neural networks |
| h_a | Vector after aspect-word aggregation |
| h^asp | Global representation of aspect-specific words |
| g | Representation of the heterogeneous graph |
| g^asp | Graph representation after interaction with aspect words |
| α, β | Normalized attention coefficients |
| W, b | Parameter matrix and bias |
| φ, σ | Activation functions |
| ⊙ | Element-wise product |
| L | Position markers of words in sentences |

Tab.2  Symbols and their corresponding meanings
Fig.4  Example of bipartite graph
Fig.5  We use the aspect masking layer to obtain specific aspect words. The right side of the figure is the source code that implements this layer
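The masking layer of Fig.5 can be sketched as follows. This is a hedged illustration rather than the authors' source code, under the assumption that the layer zeroes the hidden states of all non-aspect positions so that only the aspect tokens contribute to h^asp; plain Python lists stand in for tensors.

```python
# Aspect masking sketch: keep the hidden states of the aspect span,
# replace every other position with a zero vector of the same dimension.
def aspect_mask(hidden_states, aspect_start, aspect_len):
    """Keep rows in [aspect_start, aspect_start + aspect_len); zero the rest."""
    dim = len(hidden_states[0])
    return [
        row if aspect_start <= i < aspect_start + aspect_len else [0.0] * dim
        for i, row in enumerate(hidden_states)
    ]

h = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]      # 3 tokens, hidden size 2
masked = aspect_mask(h, aspect_start=1, aspect_len=1)
```

A framework implementation would express the same idea as an element-wise product with a 0/1 mask broadcast over the hidden dimension.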
| Dataset | Training Pos | Training Neg | Training Neu | Testing Pos | Testing Neg | Testing Neu | All |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Twitter | 1561 | 3127 | 1560 | 173 | 346 | 173 | 6940 |
| Laptop | 994 | 464 | 870 | 341 | 169 | 128 | 2966 |
| Rest14 | 2164 | 637 | 807 | 728 | 196 | 196 | 4728 |
| Rest15 | 912 | 256 | 36 | 326 | 182 | 34 | 1746 |
| Rest16 | 1657 | 1528 | 3016 | 172 | 169 | 336 | 6728 |

Tab.3  Statistics of samples by class labels on the benchmark datasets
| Model | Rest14 Acc | Rest14 Mac-F1 | Laptop Acc | Laptop Mac-F1 | Twitter Acc | Twitter Mac-F1 | Rest15 Acc | Rest15 Mac-F1 | Rest16 Acc | Rest16 Mac-F1 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LSTM+SynATT [50] | 80.45 | 71.26 | 72.57 | 69.13 | – | – | – | – | – | – |
| ASGCN [7] | 80.77 | 72.02 | 75.55 | 71.05 | 72.15 | 70.40 | 79.89 | 61.89 | 86.24 | 67.62 |
| CDT [23] | 82.30 | 74.02 | 77.19 | 72.99 | 74.66 | 73.66 | 70.92 | 61.68 | 86.24 | 67.62 |
| GAT [24] | 78.21 | 67.17 | 73.04 | 68.11 | 71.67 | 70.13 | – | – | – | – |
| TD-GAT [25] | 80.35 | 76.13 | 74.13 | 72.01 | 72.68 | 71.15 | 80.38 | 60.50 | 87.71 | 67.87 |
| BiGCN [10] | 81.97 | 73.48 | 74.59 | 71.84 | 74.16 | 73.35 | 81.16 | 64.79 | 88.96 | 70.84 |
| KGCapsAN [35] | 82.05 | 74.04 | 76.96 | 72.89 | 74.13 | 72.52 | 81.86 | 65.60 | 88.47 | 70.72 |
| ATAE-LSTM [3] | 77.20 | 68.70 | – | – | – | – | 75.2 | 64.1 | 82.1 | 64.4 |
| IAN [12] | 78.60 | 72.10 | – | – | – | – | 75.5 | 63.9 | 83.6 | 65.2 |
| RAM [5] | 80.23 | 70.80 | 74.49 | 71.35 | 69.36 | 67.30 | 76.7 | 64.5 | 83.9 | 66.1 |
| MGAN [13] | 81.25 | 71.94 | 75.39 | 72.47 | 72.54 | 70.81 | – | – | – | – |
| LSTM [51] | 79.10 | 69.00 | 71.22 | 65.75 | 69.51 | 67.98 | 77.37 | 55.17 | 86.80 | 63.88 |
| SIOT-Bi-GRU [52] | 82.05 | 72.53 | 77.11 | 73.28 | 74.56 | 73.52 | – | – | – | – |
| R-GAT [26] | 83.30 | 76.08 | 77.42 | 73.76 | 75.57 | 73.82 | – | – | – | – |
| TM [53] | 78.02 | 67.85 | 73.51 | 70.80 | – | – | – | – | – | – |
| MCRF-SA [19] | 82.86 | 73.78 | 77.64 | 74.23 | – | – | 80.82 | 61.59 | 89.51 | 75.72 |
| TNET [54] | 80.69 | 71.27 | 76.54 | 71.75 | 74.90 | 73.60 | – | – | – | – |
| PWCN [17] | 80.89 | 72.16 | 75.86 | 71.94 | 72.10 | 70.75 | – | – | – | – |
| RGAT-FT-RoBERTa [55] | 82.76 | 75.25 | 77.43 | 74.21 | 75.43 | 74.04 | – | – | – | – |
| Semantic-HGCN | 84.27 | 77.16 | 78.29 | 74.78 | 75.78 | 73.68 | 82.31 | 66.46 | 90.12 | 72.6 |

Tab.4  Comparison results for all methods on the five datasets, where Acc means Accuracy, Mac-F1 means Macro-F1, and "–" marks results not reported
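The Acc and Mac-F1 columns in Tab.4 follow the usual definitions: accuracy is the fraction of correct predictions, and macro-F1 averages the per-class F1 scores so that the small Neg/Neu classes weigh as much as Pos. The following is a generic sketch of those metrics, not the authors' evaluation script.

```python
# Accuracy and macro-averaged F1 over integer class labels.
def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    labels = set(y_true) | set(y_pred)
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)   # unweighted mean over classes

y_true = [0, 0, 1, 1, 2]   # toy labels: 0=Neg, 1=Neu, 2=Pos
y_pred = [0, 1, 1, 1, 2]
acc = accuracy(y_true, y_pred)
mf1 = macro_f1(y_true, y_pred)
```

On the toy labels, one neutral error lowers accuracy to 0.8 but drags macro-F1 further down, which is why the two columns in Tab.4 can diverge sharply on imbalanced datasets such as Rest15.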
| Fold | Rest14 Acc | Rest14 Mac-F1 | Laptop Acc | Laptop Mac-F1 | Twitter Acc | Twitter Mac-F1 |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 84.456 | 77.204 | 78.308 | 75.077 | 76.08 | 74.003 |
| 2 | 84.095 | 76.914 | 77.901 | 74.835 | 75.415 | 73.434 |
| 3 | 84.175 | 77.109 | 78.112 | 74.924 | 75.764 | 73.727 |
| 4 | 84.318 | 77.379 | 78.376 | 75.098 | 76.195 | 74.122 |
| 5 | 84.126 | 77.034 | 78.263 | 74.648 | 75.751 | 73.51 |
| ave | 84.23 | 77.13 | 78.19 | 74.92 | 75.84 | 73.76 |
| std | 0.151 | 0.176 | 0.19 | 0.185 | 0.307 | 0.29 |
Tab.5  Robustness evaluation of Semantic-HGCN on three datasets, where ave denotes the mean of each index over the 5 experiments, and std denotes the standard deviation
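The ave and std rows of Tab.5 can be reproduced from the five fold scores; e.g., the Rest14 accuracy column matches the sample standard deviation (dividing by n − 1), which is assumed here since the paper does not state which estimator was used.

```python
# Reproduce the ave/std summary of Tab.5 for the Rest14 Acc column.
import statistics

rest14_acc = [84.456, 84.095, 84.175, 84.318, 84.126]  # the 5 fold scores
ave = statistics.mean(rest14_acc)     # table reports 84.23
std = statistics.stdev(rest14_acc)    # sample std; table reports 0.151
```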
Fig.6  Visualization of the robustness evaluation of Semantic-HGCN. Note that, to combine multiple series into one line chart, the Acc-Rest14 curve is plotted against the vertical axis on the right side of the chart
| Model | Rest14 Acc | Rest14 Mac-F1 | Laptop Acc | Laptop Mac-F1 | Twitter Acc | Twitter Mac-F1 |
| --- | --- | --- | --- | --- | --- | --- |
| Semantic-HGCN | 84.27 | 77.16 | 78.29 | 74.78 | 75.78 | 73.68 |
| no: atten | 83.24 (↓1.03) | 76.83 (↓0.33) | 77.39 (↓0.90) | 73.08 (↓1.70) | 73.76 (↓2.02) | 71.57 (↓2.11) |
| no: sentiment | 82.02 (↓2.25) | 73.33 (↓3.83) | 73.63 (↓4.66) | 70.25 (↓4.53) | 72.17 (↓3.61) | 70.52 (↓3.16) |
| Dot_attention | 80.19 (↓4.08) | 68.50 (↓8.66) | 73.27 (↓5.02) | 70.21 (↓4.57) | 72.21 (↓3.57) | 70.46 (↓3.22) |
Tab.6  Ablation study on the three datasets, where ↓ represents the degradation of performance compared with Semantic-HGCN
Fig.7  Examples of dependency trees with emotional information. (a) aspect: garlic knots, label: positive; (b) aspect: roasted chickens, label: positive; (c) aspect: atmosphere, label: positive
| Parser | UAS | LAS | Rest14 Acc | Rest14 Mac-F1 | Laptop Acc | Laptop Mac-F1 | Twitter Acc | Twitter Mac-F1 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Stanford | 94.24 | 91.37 | 83.99 | 76.83 | 77.64 | 74.12 | 75.30 | 73.06 |
| Biaffine | 95.41 | 93.64 | 84.27 | 77.16 | 78.29 | 74.78 | 75.78 | 73.68 |

Tab.7  Results of Semantic-HGCN based on two parsers (dependency tree generators), where UAS and LAS are metrics for evaluating the parsers and higher scores mean better performance. The experimental setup is identical except for the parser
Fig.8  Impacts of the layer number l. (a) Acc value changes with increasing of layer; (b) F1 value changes with increasing of layer
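As a hedged intuition for why the layer number l in Fig.8 matters: one graph-convolution layer in the style of Kipf and Welling [41] aggregates each node's immediate neighbourhood, so stacking l layers lets information travel l hops along the heterogeneous graph. The unweighted mean-aggregation layer below is a simplification of the paper's HGCN (no parameter matrices, edge types, or nonlinearity).

```python
# One simplified graph-convolution layer: each node's new feature is the
# mean of its own feature and its neighbours' features (self-loop included).
def gcn_layer(adj, feats):
    """adj: {node: set(neighbours)}; feats: {node: [float, ...]}."""
    out = {}
    for node, nbrs in adj.items():
        group = [node, *nbrs]
        dim = len(feats[node])
        out[node] = [
            sum(feats[g][d] for g in group) / len(group) for d in range(dim)
        ]
    return out

adj = {"a": {"b"}, "b": {"a"}}
feats = {"a": [0.0], "b": [2.0]}
out = gcn_layer(adj, feats)   # after one layer, a and b meet in the middle
```

Stacking too many such layers over-smooths the features toward a common value, consistent with the performance drop at large l in Fig.8.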
| Parameter | Value | Rest14 Acc | Rest14 Mac-F1 | Laptop Acc | Laptop Mac-F1 | Twitter Acc | Twitter Mac-F1 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Dropout | 0.8 | 84.27 | 77.16 | 78.29 | 74.78 | 75.78 | 73.68 |
| Dropout | 0.5 | 84.11 | 77.03 | 78.08 | 74.39 | 75.76 | 73.57 |
| Dropout | 0.2 | 84.09 | 76.99 | 78.07 | 74.19 | 75.63 | 73.42 |
| Batch_size | 8 | 83.81 | 76.84 | 77.21 | 74.08 | 75.04 | 73.18 |
| Batch_size | 16 | 84.27 | 77.16 | 78.29 | 74.78 | 75.78 | 73.68 |
| Batch_size | 32 | 84.03 | 76.98 | 78.11 | 74.59 | 75.58 | 73.25 |

Tab.8  Validation study of the parameters Batch_size and Dropout
1 Nguyen T H, Shirai K. PhraseRNN: phrase recursive neural network for aspect-based sentiment analysis. In: Proceedings of 2015 Conference on Empirical Methods in Natural Language Processing. 2015, 2509−2514
2 Tang D, Qin B, Liu T. Aspect level sentiment classification with deep memory network. In: Proceedings of 2016 Conference on Empirical Methods in Natural Language Processing. 2016, 214−224
3 Wang Y, Huang M, Zhu X, Zhao L. Attention-based LSTM for aspect-level sentiment classification. In: Proceedings of 2016 Conference on Empirical Methods in Natural Language Processing. 2016, 606−615
4 Tang D, Qin B, Feng X, Liu T. Effective LSTMs for target-dependent sentiment classification. In: Proceedings of the 26th International Conference on Computational Linguistics. 2016, 3298−3307
5 Chen P, Sun Z, Bing L, Yang W. Recurrent attention network on memory for aspect sentiment analysis. In: Proceedings of 2017 Conference on Empirical Methods in Natural Language Processing. 2017, 452−461
6 Zhang Y, Qi P, Manning C D. Graph convolution over pruned dependency trees improves relation extraction. In: Proceedings of 2018 Conference on Empirical Methods in Natural Language Processing. 2018, 2205−2215
7 Zhang C, Li Q, Song D. Aspect-based sentiment classification with aspect-specific graph convolutional networks. In: Proceedings of 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. 2019, 4568−4578
8 Li Z, Sun Y, Zhu J, Tang S, Zhang C, Ma H. Improve relation extraction with dual attention-guided graph convolutional networks. Neural Computing and Applications, 2021, 33(6): 1773−1784
9 Chen S, Li Z, Huang F, Zhang C, Ma H. Improving object detection with relation mining network. In: Proceedings of 2020 IEEE International Conference on Data Mining. 2020, 52−61
10 Zhang M, Qian T. Convolution over hierarchical syntactic and lexical graphs for aspect level sentiment analysis. In: Proceedings of 2020 Conference on Empirical Methods in Natural Language Processing. 2020, 3540−3549
11 Cambria E, Poria S, Hazarika D, Kwok K. SenticNet 5: discovering conceptual primitives for sentiment analysis by means of context embeddings. In: Proceedings of the 32nd AAAI Conference on Artificial Intelligence and 30th Innovative Applications of Artificial Intelligence Conference and 8th AAAI Symposium on Educational Advances in Artificial Intelligence. 2018, 219
12 Ma D, Li S, Zhang X, Wang H. Interactive attention networks for aspect-level sentiment classification. In: Proceedings of the 26th International Joint Conference on Artificial Intelligence. 2017, 4068−4074
13 Fan F, Feng Y, Zhao D. Multi-grained attention network for aspect-level sentiment classification. In: Proceedings of 2018 Conference on Empirical Methods in Natural Language Processing. 2018, 3433−3442
14 Xue W, Li T. Aspect based sentiment analysis with gated convolutional networks. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018, 2514−2523
15 Tay Y, Tuan L A, Hui S C. Learning to attend via word-aspect associative fusion for aspect-based sentiment analysis. In: Proceedings of the 32nd AAAI Conference on Artificial Intelligence and 30th Innovative Applications of Artificial Intelligence Conference and 8th AAAI Symposium on Educational Advances in Artificial Intelligence. 2018, 731
16 Yao L, Mao C, Luo Y. Graph convolutional networks for text classification. In: Proceedings of the 33rd AAAI Conference on Artificial Intelligence and 31st Innovative Applications of Artificial Intelligence Conference and 9th AAAI Symposium on Educational Advances in Artificial Intelligence. 2019, 905
17 Zhang C, Li Q, Song D. Syntax-aware aspect-level sentiment classification with proximity-weighted convolution network. In: Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval. 2019, 1145−1148
18 Hu M, Zhao S, Guo H, Cheng R, Su Z. Learning to detect opinion snippet for aspect-based sentiment analysis. In: Proceedings of the 23rd Conference on Computational Natural Language Learning. 2019, 970−979
19 Xu L, Bing L, Lu W, Huang F. Aspect sentiment classification with aspect-specific opinion spans. In: Proceedings of 2020 Conference on Empirical Methods in Natural Language Processing. 2020, 3561−3567
20 Wang Y, Chen Q, Shen J, Hou B, Ahmed M, Li Z. Aspect-level sentiment analysis based on gradual machine learning. Knowledge-Based Systems, 2021, 212: 106509
21 Zhang Z, Hang C W, Singh M P. Octa: omissions and conflicts in target-aspect sentiment analysis. In: Findings of the Association for Computational Linguistics. 2020, 1651−1662
22 Cai H, Zheng V W, Chang K C C. A comprehensive survey of graph embedding: problems, techniques, and applications. IEEE Transactions on Knowledge and Data Engineering, 2018, 30(9): 1616−1637
23 Sun K, Zhang R, Mensah S, Mao Y, Liu X. Aspect-level sentiment analysis via convolution over dependency tree. In: Proceedings of 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. 2019, 5679−5688
24 Veličković P, Cucurull G, Casanova A, Romero A, Liò P, Bengio Y. Graph attention networks. In: Proceedings of the 6th International Conference on Learning Representations. 2018
25 Huang B, Carley K. Syntax-aware aspect level sentiment classification with graph attention networks. In: Proceedings of 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. 2019, 5469−5477
26 Wang K, Shen W, Yang Y, Quan X, Wang R. Relational graph attention network for aspect-based sentiment analysis. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020, 3229−3238
27 Ratinov L, Roth D. Design challenges and misconceptions in named entity recognition. In: Proceedings of the 13th Conference on Computational Natural Language Learning. 2009, 147−155
28 Rahman A, Ng V. Coreference resolution with world knowledge. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. 2011, 814−824
29 Nakashole N, Mitchell T M. A knowledge-intensive model for prepositional phrase attachment. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. 2015, 365−375
30 Xu Z, Liu B, Wang B, Sun C, Wang X. Incorporating loose-structured knowledge into LSTM with recall gate for conversation modeling. arXiv preprint arXiv:1605.05110, 2016
31 Zhang B, Xu X, Yang M, Chen X, Ye Y. Cross-domain sentiment classification by capsule network with semantic rules. IEEE Access, 2018, 6: 58284−58294
32 Zhang J, Lertvittayakumjorn P, Guo Y. Integrating semantic knowledge to tackle zero-shot text classification. In: Proceedings of NAACL-HLT 2019. 2019, 1031−1040
33 Hu Z, Ma X, Liu Z, Hovy E, Xing E P. Harnessing deep neural networks with logic rules. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016, 2410−2420
34 Dragoni M, Petrucci G. A fuzzy-based strategy for multi-domain sentiment analysis. International Journal of Approximate Reasoning, 2018, 93: 59−73
35 Zhang B, Li X, Xu X, Leung K C, Chen Z, Ye Y. Knowledge guided capsule attention network for aspect-based sentiment analysis. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2020, 28: 2538−2551
36 Ma Y, Peng H, Cambria E. Targeted aspect-based sentiment analysis via embedding commonsense knowledge into an attentive LSTM. In: Proceedings of the 32nd AAAI Conference on Artificial Intelligence and 30th Innovative Applications of Artificial Intelligence Conference and 8th AAAI Symposium on Educational Advances in Artificial Intelligence. 2018, 721
37 Cambria E, Poria S, Bajpai R, Schuller B. SenticNet 4: a semantic resource for sentiment analysis based on conceptual primitives. In: Proceedings of the 26th International Conference on Computational Linguistics. 2016, 2666−2677
38 Zeng B, Yang H, Xu R, Zhou W, Han X. LCF: a local context focus mechanism for aspect-based sentiment classification. Applied Sciences, 2019, 9(16): 3389
39 Cambria E, Fu J, Bisio F, Poria S. AffectiveSpace 2: enabling affective intuition for concept-level sentiment analysis. In: Proceedings of the 29th AAAI Conference on Artificial Intelligence. 2015, 508−514
40 Bingham E, Mannila H. Random projection in dimensionality reduction: applications to image and text data. In: Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 2001, 245−250
41 Kipf T N, Welling M. Semi-supervised classification with graph convolutional networks. In: Proceedings of the 5th International Conference on Learning Representations. 2017
42 Bengio Y, Ducharme R, Vincent P, Janvin C. A neural probabilistic language model. The Journal of Machine Learning Research, 2003, 3: 1137−1155
43 Dong L, Wei F, Tan C, Tang D, Zhou M, Xu K. Adaptive recursive neural network for target-dependent Twitter sentiment classification. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. 2014, 49−54
44 Kirange D, Deshmukh R R, Kirange M. Aspect based sentiment analysis SemEval-2014 Task 4. Asian Journal of Computer Science and Information Technology, 2014, 4(8): 72−75
45 Pontiki M, Galanis D, Papageorgiou H, Manandhar S, Androutsopoulos I. SemEval-2015 task 12: aspect based sentiment analysis. In: Proceedings of the 9th International Workshop on Semantic Evaluation. 2015, 486−495
46 Pontiki M, Galanis D, Papageorgiou H, Androutsopoulos I, Manandhar S, Al-Smadi M, Al-Ayyoub M, Zhao Y, Qin B, De Clercq O, Hoste V, Apidianaki M, Tannier X, Loukachevitch N, Kotelnikov E, Bel N, Jiménez-Zafra S M, Eryiğit G. SemEval-2016 task 5: aspect based sentiment analysis. In: Proceedings of the 10th International Workshop on Semantic Evaluation. 2016, 19−30
47 Dozat T, Manning C D. Deep biaffine attention for neural dependency parsing. In: Proceedings of the 5th International Conference on Learning Representations. 2017
48 Pennington J, Socher R, Manning C. GloVe: global vectors for word representation. In: Proceedings of 2014 Conference on Empirical Methods in Natural Language Processing. 2014, 1532−1543
49 Kingma D P, Ba J. Adam: a method for stochastic optimization. In: Proceedings of the 3rd International Conference on Learning Representations. 2015
50 He R, Lee W S, Ng H T, Dahlmeier D. Effective attention modeling for aspect-level sentiment classification. In: Proceedings of the 27th International Conference on Computational Linguistics. 2018, 1121−1131
51 Hochreiter S, Schmidhuber J. Long short-term memory. Neural Computation, 1997, 9(8): 1735−1780
52 Ali W, Yang Y, Qiu X, Ke Y, Wang Y. Aspect-level sentiment analysis based on bidirectional-GRU in SIoT. IEEE Access, 2021, 9: 69938−69950
53 Yadav R K, Jiao L, Granmo O C, Goodwin M. Human-level interpretable learning for aspect-based sentiment analysis. In: Proceedings of the 35th AAAI Conference on Artificial Intelligence. 2021, 14203−14212
54 Li X, Bing L, Lam W, Shi B. Transformation networks for target-oriented sentiment classification. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018, 946−956
55 Dai J, Yan H, Sun T, Liu P, Qiu X. Does syntax matter? A strong baseline for aspect-based sentiment analysis with RoBERTa. In: Proceedings of 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021, 1816−1829
56 Chen D, Manning C D. A fast and accurate dependency parser using neural networks. In: Proceedings of 2014 Conference on Empirical Methods in Natural Language Processing. 2014, 740−750
57 Xu K, Li C, Tian Y, Sonobe T, Kawarabayashi K I, Jegelka S. Representation learning on graphs with jumping knowledge networks. In: Proceedings of the 35th International Conference on Machine Learning. 2018, 5449−5458