Hi Jean,

the answer also depends on how complicated the desired SBM is. A layered model takes longer than an unlayered one.

Fitting a model to a graph with 100k nodes will likely take a very long time. But I'd also be interested in a more informed answer...
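A rough back-of-envelope estimate (my own, not from the graph-tool docs): each MCMC sweep during SBM inference costs roughly on the order of the number of edges, so the edge count is the first thing to estimate for a dense 100k-node graph. The density value below is an assumption for illustration.

```python
# Back-of-envelope sketch: estimate the edge count that each SBM
# MCMC sweep would have to touch. Density is an assumed figure.
n = 100_000            # nodes, as stated in the question
density = 0.1          # assumed edge density ("very dense")
edges = density * n * (n - 1) / 2
print(f"{edges:.2e} edges per sweep (order of magnitude)")
# At 10% density this is already ~5e8 edges, i.e. hundreds of
# millions of operations per sweep, before any model layering.
```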

Haiko




From: graph-tool [graph-tool-bounces@skewed.de] on behalf of Jean Christophe Cazes [cazes.jean.christophe@gmail.com]
Sent: Friday, 14 June 2019 09:59
To: graph-tool@skewed.de
Subject: [graph-tool] [SBM on Dense Graphs]

Hello, I intend to use graph_tool on a big network: over 100k nodes and very dense.

The dataset I'm working with at the moment is a ~40-50 GB CSV containing vertices and edges as transactions.

Is it computationally realistic to try an SBM on such a graph, and would it be theoretically useful?

If it isn't computationally feasible, how big can a subgraph be and still be tractable?

Note: I will rent a Google Cloud Platform VM to do so.

Thank you