Consistent Adjacency-Spectral Partitioning for the Stochastic Block Model When the Model Parameters Are Unknown

2013 ◽  
Vol 34 (1) ◽  
pp. 23-39 ◽  
Author(s):  
Donniell E. Fishkind ◽  
Daniel L. Sussman ◽  
Minh Tang ◽  
Joshua T. Vogelstein ◽  
Carey E. Priebe
2020 ◽  
Vol 34 (04) ◽  
pp. 3641-3648 ◽  
Author(s):  
Eli Chien ◽  
Antonia Tulino ◽  
Jaime Llorca

The geometric block model is a recently proposed generative model for random graphs that captures the inherent geometric properties of many community detection problems, providing more accurate characterizations of practical community structures than the popular stochastic block model. Galhotra et al. recently proposed a motif-counting algorithm for unsupervised community detection in the geometric block model that is provably near-optimal. They also characterized the regimes of the model parameters for which the proposed algorithm can achieve exact recovery. In this work, we initiate the study of active learning in the geometric block model. That is, we are interested in exactly recovering the community structure of random graphs following the geometric block model under arbitrary model parameters, by querying the labels of a limited number of chosen nodes. We propose two active learning algorithms that combine motif counting with two different label query policies. Our main contribution is to show that sampling the labels of a vanishingly small fraction of nodes (sub-linear in the total number of nodes) is sufficient to achieve exact recovery in the regimes where the state-of-the-art unsupervised method fails. We validate the superior performance of our algorithms via numerical simulations on both real and synthetic datasets.
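As a loose illustration of the motif-counting-plus-queries idea described above (not the authors' algorithm or their query policies), the following Python sketch counts common neighbors as a simple motif statistic, seeds two communities, and spends a small query budget, here assumed to be about sqrt(n), on the least confident nodes; the `oracle` function is a hypothetical stand-in for a label query.

```python
import numpy as np
import networkx as nx

def motif_scores(G):
    """Common-neighbor counts for every node pair (a simple motif statistic)."""
    A = nx.to_numpy_array(G)
    return A @ A  # entry (i, j) = number of common neighbors of i and j

def active_recovery(G, oracle, budget=None):
    """Toy two-community recovery: motif-based guesses plus a few label queries.

    `oracle(v)` returns the true community of node v (a hypothetical stand-in
    for a label query); `budget` defaults to sqrt(n), an assumed sub-linear
    query budget in the spirit of the abstract, not the paper's policy.
    """
    n = G.number_of_nodes()
    budget = budget or int(np.sqrt(n))
    S = motif_scores(G)
    # Seed the two communities with the node pair sharing the fewest neighbors.
    i, j = np.unravel_index(np.argmin(S + np.eye(n) * S.max()), S.shape)
    i, j = int(i), int(j)
    labels = {i: oracle(i), j: oracle(j)}
    # Query the oracle for the nodes whose motif evidence is most ambiguous.
    margin = {v: abs(S[v, i] - S[v, j]) for v in range(n) if v not in labels}
    for v in sorted(margin, key=margin.get)[: max(budget - 2, 0)]:
        labels[v] = oracle(v)
    # Assign everyone else to the seed they share more neighbors with.
    for v in range(n):
        labels.setdefault(v, labels[i] if S[v, i] >= S[v, j] else labels[j])
    return labels
```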


Entropy ◽  
2021 ◽  
Vol 23 (1) ◽  
pp. 65
Author(s):  
Feng Zhao ◽  
Min Ye ◽  
Shao-Lun Huang

In this paper, we study the phase transition property of an Ising model defined on a special random graph, the stochastic block model (SBM). Based on the Ising model, we propose a stochastic estimator that achieves exact recovery for the SBM. The stochastic algorithm can be cast as an optimization problem that includes maximum likelihood and maximum modularity as special cases. Additionally, we give an unbiased, convergent estimator for the model parameters of the SBM that can be computed in constant time. Finally, we use Metropolis sampling to realize the stochastic estimator and verify the phase transition phenomenon through experiments.
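A minimal sketch of the Metropolis step for an Ising model on an SBM graph, assuming a plain ferromagnetic coupling on edges; the inverse temperature `beta`, the sweep count, and the two-block example parameters are illustrative choices, not the paper's estimator or settings.

```python
import numpy as np
import networkx as nx

def metropolis_labels(A, beta=1.0, sweeps=200, rng=None):
    """Metropolis sampling of +/-1 community labels on adjacency matrix A.

    The energy -sum_{(u,v) in E} s_u * s_v rewards agreeing labels on edges,
    so large beta pushes the chain toward an assortative two-block partition.
    """
    rng = rng or np.random.default_rng(0)
    n = A.shape[0]
    s = rng.choice([-1, 1], size=n)          # random initial labels
    for _ in range(sweeps):
        for v in rng.permutation(n):
            dE = 2.0 * s[v] * (A[v] @ s)     # energy change if s[v] is flipped
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[v] = -s[v]
    return s

# Example on a small two-block SBM (p_in = 0.8, p_out = 0.1, assumed values).
G = nx.stochastic_block_model([50, 50], [[0.8, 0.1], [0.1, 0.8]], seed=1)
print(metropolis_labels(nx.to_numpy_array(G))[:10])
```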


2016 ◽  
Vol 114 (1) ◽  
pp. 33-38 ◽  
Author(s):  
Isabel M. Kloumann ◽  
Johan Ugander ◽  
Jon Kleinberg

Methods for ranking the importance of nodes in a network have a rich history in machine learning and across domains that analyze structured data. Recent work has evaluated these methods through the "seed set expansion problem": given a subset S of nodes from a community of interest in an underlying graph, can we reliably identify the rest of the community? We start from the observation that the most widely used techniques for this problem, personalized PageRank and heat kernel methods, operate in the space of "landing probabilities" of a random walk rooted at the seed set, ranking nodes according to weighted sums of landing probabilities of walks of different lengths. Both schemes, however, lack an a priori relationship to the seed set objective. In this work, we develop a principled framework for evaluating ranking methods by studying seed set expansion applied to the stochastic block model. We derive the optimal gradient for separating the landing probabilities of two classes in a stochastic block model and find, surprisingly, that under reasonable assumptions the gradient is asymptotically equivalent to personalized PageRank for a specific choice of the PageRank parameter α that depends on the block model parameters. This connection provides a formal motivation for the success of personalized PageRank in seed set expansion and node ranking generally. We use this connection to propose more advanced techniques incorporating higher moments of landing probabilities; our advanced methods exhibit greatly improved performance, despite being simple linear classification rules, and are even competitive with belief propagation.
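A minimal sketch of the landing-probability view described above: the k-step walk distributions from the seed set, combined with geometric weights α(1-α)^k, give a truncated personalized PageRank vector. The default alpha = 0.15, the ten-step truncation, and the assumption that the graph has no isolated nodes are illustrative choices, not the block-model-optimal α derived in the paper.

```python
import numpy as np
import networkx as nx

def landing_probabilities(G, seed_nodes, num_steps=10):
    """Random-walk distributions rooted at the seed set after 0..num_steps steps."""
    A = nx.to_numpy_array(G)
    W = A / A.sum(axis=1, keepdims=True)   # row-stochastic walk matrix (no isolated nodes)
    idx = {u: k for k, u in enumerate(G.nodes())}
    p = np.zeros(len(idx))
    p[[idx[u] for u in seed_nodes]] = 1.0 / len(seed_nodes)
    probs = [p]
    for _ in range(num_steps):
        p = p @ W
        probs.append(p)
    return np.array(probs)                 # shape (num_steps + 1, n)

def ppr_scores(probs, alpha=0.15):
    """Geometric weighting of landing probabilities: a truncated personalized PageRank."""
    weights = alpha * (1 - alpha) ** np.arange(len(probs))
    return weights @ probs                 # one score per node; rank in descending order
```

Ranking nodes by these scores and taking the top entries expands the seed set; the more advanced methods in the abstract instead learn a linear rule over the same landing probabilities rather than using this fixed geometric weighting.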


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Abeer A. Zaki ◽  
Nesma A. Saleh ◽  
Mahmoud A. Mahmoud

Purpose: This study aims to assess how updating the Phase I data, in order to refine the parameter estimates, affects the detection power of control charts designed to monitor social networks.
Design/methodology/approach: A dynamic version of the degree-corrected stochastic block model (DCSBM) is used to model the network. Both the Shewhart and exponentially weighted moving average (EWMA) control charts are used to monitor the model parameters. The performance of each chart is compared when it is designed using fixed and moving windows of networks.
Findings: Our results show that continuously updating the parameter estimates during the monitoring phase delays the Shewhart chart's detection of network anomalies compared with the fixed-window approach, while the EWMA chart's performance is either unaffected or worse, depending on the updating technique. Overall, the EWMA chart performs uniformly better than the Shewhart chart for all shift sizes. We recommend the EWMA chart for monitoring networks modeled with the DCSBM, with a sufficiently small to moderate fixed window used to estimate the unknown model parameters.
Originality/value: This study shows that the common recommendation in the literature to continuously update Phase I data during the monitoring phase to enhance control chart performance does not generally extend to social network monitoring, especially under the DCSBM. That is, the effect of continuously updating the parameter estimates depends strongly on the nature of the process being monitored.
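A minimal sketch of the EWMA chart mechanics on a single monitored statistic (for instance, an estimated block connection rate from the fitted DCSBM); the smoothing constant lambda = 0.2, the 3-sigma limit, and the Phase I estimates passed in as `mu0`/`sigma0` are conventional, assumed choices rather than the study's exact design.

```python
import numpy as np

def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
    """EWMA statistics, control limits, and first alarm for a monitored sequence.

    mu0 and sigma0 are Phase I estimates of the in-control mean and standard
    deviation of the statistic; x is the Phase II sequence of observed values.
    """
    z, upper, lower = [], [], []
    z_prev = mu0
    for t, xt in enumerate(x, start=1):
        z_prev = lam * xt + (1 - lam) * z_prev              # EWMA recursion
        var_t = sigma0**2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t))
        z.append(z_prev)
        upper.append(mu0 + L * np.sqrt(var_t))
        lower.append(mu0 - L * np.sqrt(var_t))
    alarms = [t for t, (zt, u, l) in enumerate(zip(z, upper, lower), start=1)
              if not l <= zt <= u]
    return np.array(z), np.array(upper), np.array(lower), (alarms[0] if alarms else None)
```

In a fixed-window design, `mu0` and `sigma0` stay frozen throughout monitoring; in a moving-window design they would be re-estimated from the most recent networks, which is the comparison examined in the abstract.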


2014 ◽  
Vol 24 (11) ◽  
pp. 2699-2709 ◽  
Author(s):  
Bian-Fang CHAI ◽  
Jian YU ◽  
Cai-Yan JIA ◽  
Jing-Hong WANG
