Published online by Cambridge University Press: 18 October 2019
Statistical inference on graphs often proceeds via spectral methods involving low-dimensional embeddings of matrix-valued graph representations such as the graph Laplacian or adjacency matrix. In this paper, we analyze the asymptotic information-theoretic relative performance of Laplacian spectral embedding and adjacency spectral embedding for block assignment recovery in stochastic blockmodel graphs by way of Chernoff information. We investigate the relationship between spectral embedding performance and underlying network structure (e.g., homogeneity, affinity, core-periphery, and (un)balancedness) via a comprehensive treatment of the two-block stochastic blockmodel and the class of K-blockmodels exhibiting homogeneous balanced affinity structure. Our findings support the claim that, for a particular notion of sparsity, loosely speaking, “Laplacian spectral embedding favors relatively sparse graphs, whereas adjacency spectral embedding favors not-too-sparse graphs.” We also provide evidence in support of the claim that “adjacency spectral embedding favors core-periphery network structure.”
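The pipeline the abstract describes can be sketched in a few lines: simulate a balanced homogeneous two-block stochastic blockmodel, form the adjacency and normalized-Laplacian representations, and recover block assignments from their spectra. The parameter values, the sign-based clustering rule, and all variable names below are illustrative assumptions for a minimal demonstration, not the paper's actual experiments or analysis.

```python
import numpy as np

# Hypothetical illustration: balanced two-block SBM with within-block
# probability p and between-block probability q (values chosen for
# clarity, not taken from the paper).
rng = np.random.default_rng(0)
n, p, q = 200, 0.5, 0.2
z = np.repeat([0, 1], n // 2)                 # true block labels
P = np.where(z[:, None] == z[None, :], p, q)  # edge-probability matrix

# Sample a symmetric adjacency matrix with an empty diagonal.
U = rng.random((n, n)) < P
A = (np.triu(U, 1) + np.triu(U, 1).T).astype(float)

def second_eigvec(M):
    """Eigenvector of the second-largest eigenvalue of a symmetric matrix."""
    vals, vecs = np.linalg.eigh(M)            # eigenvalues in ascending order
    return vecs[:, -2]

def sign_accuracy(v, z):
    """Block-recovery accuracy of sign-based clustering, up to label swap."""
    labels = (v > 0).astype(int)
    acc = (labels == z).mean()
    return max(acc, 1 - acc)

# Adjacency spectral signal: second eigenvector of A separates the blocks.
ase_acc = sign_accuracy(second_eigvec(A), z)

# Laplacian spectral signal: second eigenvector of the normalized
# Laplacian D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
L = A / np.sqrt(np.outer(d, d))
lse_acc = sign_accuracy(second_eigvec(L), z)

print(ase_acc, lse_acc)
```

In this easy balanced regime both embeddings recover the blocks almost perfectly; the paper's Chernoff-information analysis concerns the finer question of which embedding dominates as sparsity and block structure vary.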
This work is partially supported by the XDATA and D3M programs of the Defense Advanced Research Projects Agency (DARPA) and by the Acheson J. Duncan Fund for the Advancement of Research in Statistics at Johns Hopkins University. Part of this work was done during visits by JC and CEP to the Isaac Newton Institute for Mathematical Sciences at the University of Cambridge under EPSRC grant EP/K032208/1. JC thanks Zachary Lubberts for productive discussions.