Frontiers of Computer Science

ISSN 2095-2228

ISSN 2095-2236(Online)

CN 10-1014/TP

Postal Subscription Code 80-970

2018 Impact Factor: 1.129

Front. Comput. Sci.    2020, Vol. 14 Issue (6) : 146323    https://doi.org/10.1007/s11704-020-9370-z
RESEARCH ARTICLE
Estimating posterior inference quality of the relational infinite latent feature model for overlapping community detection
Qianchen YU1,2(), Zhiwen YU1,2, Zhu WANG1,2, Xiaofeng WANG3, Yongzhi WANG4
1. School of Computer Science and Engineering, Northwestern Polytechnical University, Xi’an 710072, China
2. Shaanxi Provincial Key Laboratory for Embedded System, Xi’an 710072, China
3. College of Computer Science and Engineering, North Minzu University, Yinchuan 750021, China
4. College of Earth Exploration Science and Technology, Jilin University, Changchun 130025, China
Abstract

Overlapping community detection has become a very active research topic in recent decades, and a plethora of methods have been proposed. However, a common challenge in many existing overlapping community detection approaches is that the number of communities K must be predefined manually. We propose a flexible nonparametric Bayesian generative model for count-valued networks, which allows K to grow as more data are encountered instead of being fixed in advance. The Indian buffet process is used to model the community assignment matrix Z, and an uncollapsed Gibbs sampler is derived. However, because the community assignment matrix Z is a structured multivariate parameter, how to summarize the posterior inference results and estimate the inference quality of Z remains a considerable challenge in the literature. In this paper, a graph convolutional neural network based graph classifier is utilized to help summarize the results and estimate the inference quality of Z. We conduct extensive experiments on synthetic and real data, and find that, empirically, the traditional posterior summarization strategy is reliable.

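The nonparametric prior described above (a binary assignment matrix Z whose number of columns K is not fixed in advance) can be illustrated with a minimal sketch of the standard Indian buffet process generative procedure. This is not the paper's actual model or sampler; the function name `sample_ibp` and its parameters are hypothetical, chosen only to show how K grows with the data.

```python
import numpy as np

def sample_ibp(n_nodes, alpha, seed=None):
    """Draw a binary community-assignment matrix Z from the Indian buffet process.

    Rows are nodes, columns are communities; Z[i, k] = 1 means node i belongs
    to community k. The number of columns K is random and grows with the data
    rather than being fixed in advance.
    """
    rng = np.random.default_rng(seed)
    columns = []  # one list of member node indices per community
    for i in range(n_nodes):
        # node i joins each existing community k with probability m_k / (i + 1),
        # where m_k is the number of nodes already in community k
        for members in columns:
            if rng.random() < len(members) / (i + 1):
                members.append(i)
        # node i also opens Poisson(alpha / (i + 1)) brand-new communities
        for _ in range(rng.poisson(alpha / (i + 1))):
            columns.append([i])
    Z = np.zeros((n_nodes, len(columns)), dtype=int)
    for k, members in enumerate(columns):
        Z[members, k] = 1
    return Z

Z = sample_ibp(50, alpha=2.0, seed=0)
print(Z.shape)  # (50, K), where K is random; nodes may belong to several communities
```

Because rows of Z can have more than one nonzero entry, the same construction naturally encodes overlapping community memberships.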
Keywords graph convolutional neural network      graph classification      overlapping community detection      nonparametric Bayesian generative model      relational infinite latent feature model      Indian buffet process      uncollapsed Gibbs sampler      posterior inference quality estimation     
Corresponding Author(s): Qianchen YU   
Just Accepted Date: 19 April 2020   Issue Date: 20 July 2020
 Cite this article:   
Qianchen YU, Zhiwen YU, Zhu WANG, et al. Estimating posterior inference quality of the relational infinite latent feature model for overlapping community detection[J]. Front. Comput. Sci., 2020, 14(6): 146323.
 URL:  
https://academic.hep.com.cn/fcs/EN/10.1007/s11704-020-9370-z
https://academic.hep.com.cn/fcs/EN/Y2020/V14/I6/146323