Revealing the microstructure of the giant component in random graph ensembles

Ido Tishby, Ofer Biham, Eytan Katzav, Reimer Kühn

Research output: Contribution to journal › Article › peer-review

20 Scopus citations

Abstract

The microstructure of the giant component of the Erdős-Rényi network and other configuration model networks is analyzed using generating function methods. While configuration model networks are uncorrelated, the giant component exhibits a degree distribution that differs from the overall degree distribution of the network and includes degree-degree correlations of all orders. We present exact analytical results for the degree distributions as well as higher-order degree-degree correlations on the giant components of configuration model networks. We show that the degree-degree correlations are essential for the integrity of the giant component, in the sense that the degree distribution alone cannot guarantee that it will consist of a single connected component. To demonstrate the importance and broad applicability of these results, we apply them to the study of the distribution of shortest path lengths on the giant component, percolation on the giant component, and spectra of sparse matrices defined on the giant component. We show that by using the degree distribution on the giant component one obtains high-quality results for these properties, which can be further improved by taking the degree-degree correlations into account. This suggests that many existing methods, currently used for the analysis of the whole network, can be adapted in a straightforward fashion to yield results conditioned on the giant component.
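The abstract's central observation, that the giant component carries its own degree distribution, can be illustrated with the standard generating-function relations for the Erdős-Rényi case: with mean degree c, both generating functions reduce to G0(x) = G1(x) = exp[c(x - 1)], the probability u that an edge does not lead to the giant component solves u = G1(u), and the conditional distribution is P(k | GC) = P(k)(1 - u^k)/S with S = 1 - u. The sketch below is a minimal numerical illustration of these textbook relations, not code from the paper; the function name and the degree cutoff kmax are illustrative choices.

```python
import math

def er_giant_component_degree_dist(c, kmax=50, tol=1e-12):
    """Degree distribution on the giant component of an Erdos-Renyi network
    with mean degree c, via the standard generating-function relations."""
    # Solve the self-consistency u = G1(u) = exp(c*(u - 1)) by fixed-point
    # iteration; u is the probability that an edge does NOT lead to the
    # giant component.
    u = 0.5
    for _ in range(10000):
        u_new = math.exp(c * (u - 1.0))
        if abs(u_new - u) < tol:
            break
        u = u_new
    S = 1.0 - u  # giant-component fraction (G0 = G1 for Erdos-Renyi)
    # Overall (Poisson) degree distribution, and the distribution
    # conditioned on membership in the giant component:
    #   P(k | GC) = P(k) * (1 - u**k) / S
    pk = [math.exp(-c) * c**k / math.factorial(k) for k in range(kmax + 1)]
    pk_gc = [p * (1.0 - u**k) / S for k, p in enumerate(pk)]
    return u, S, pk, pk_gc
```

For c = 3 this gives P(0 | GC) = 0 exactly (isolated nodes can never belong to the giant component) and a mean degree on the giant component of c(1 + u) ≈ 3.18, visibly larger than the overall mean degree of 3, consistent with the shift in the degree distribution that the paper analyzes.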

Original language: American English
Article number: 042318
Journal: Physical Review E
Volume: 97
Issue number: 4
DOIs
State: Published - 25 Apr 2018

Bibliographical note

Publisher Copyright:
© 2018 American Physical Society.

