I have two graphs with ~65k vertices and ~26 million edges. The first one is unweighted and
the second is weighted (the vertices and edges are the same). I have no problem generating the
first one. It takes about 2-3 minutes, but most of that is spent generating the edge data; the
actual process of adding the edges takes only a few seconds, I think. Anyway, since my weighted
graph has the same edges and vertices, just with a weight for each edge, I don't want to generate
the edges from scratch. Here's the snippet:

for e in graph.edges():  # graph is my unweighted graph
    weight = get_weight(int(e.source()))
    weighted_graph.add_edge(int(e.source()), int(e.target()))
    weights[e] = weight

The function above runs reasonably fast; however, the edge indices don't match those of graph
(the unweighted one), so the weights are wrong. For example, the first edge of graph differs from
the first edge of weighted_graph, yet the weight stored for weighted_graph's first edge is
actually the weight of graph's first edge (probably as expected?).

Is there a way to fix this? Also, I need to generate weighted_graph at least 5-10 times, each
time with different weights (the edges and vertices remain the same; only the weights change). Is
there a faster/better way to do this? I later use this weighted_graph in a shortest-path problem
with negative_weights=True.

The function above runs reasonably fast; however, the edge indices don't
match those of graph (the unweighted one), so the weights are wrong.

This is because you are using the edge descriptors of the wrong graph. What
you probably want to do is:

for e in graph.edges():
    weight = get_weight(int(e.source()))
    # add_edge() returns the descriptor of the newly created edge, which
    # belongs to weighted_graph and therefore indexes its property map
    ne = weighted_graph.add_edge(int(e.source()), int(e.target()))
    weights[ne] = weight

Also, I need to generate weighted_graph at least 5-10 times, each time
with different weights (the edges and vertices remain the same; only the
weights change). Is there a faster/better way to do this?

You can access the weights as a numpy array via:

weights.fa

It can be faster to just update this array than to re-generate the whole
thing from scratch.
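A minimal sketch of that update pattern: since graph-tool may not be installed here, a plain
numpy array stands in for the property map's `.fa` view; with graph-tool you would assign to
`weights.fa` directly, one value per edge.

```python
import numpy as np

# Stand-in for weights.fa: in graph-tool, weights.fa exposes the edge
# property map's values as a flat numpy array, one entry per edge.
fa = np.zeros(6)  # pretend the graph has 6 edges

# Each of the 5-10 runs only needs new weight values, not a new graph:
for seed in range(3):
    new_weights = np.random.default_rng(seed).uniform(-1.0, 1.0, size=fa.shape)
    fa[:] = new_weights  # in-place update; with graph-tool: weights.fa = new_weights
    # ... run the shortest-path computation with the updated weights here ...
```

The graph and its edge ordering stay fixed, so only the value array changes between runs.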

I also wonder whether such a large performance difference between the two
graphs is expected, i.e., it takes 9 minutes to obtain shortest paths
between 400,000 pairs in the unweighted graph, but 52 hours for the
weighted one.

Here is how I'm doing it:

path = []
for item in pairs:
    vpath, epath = gt.shortest_path(weighted_graph,
                                    weighted_graph.vertex(item[0]),
                                    weighted_graph.vertex(item[1]),
                                    weights=weights,
                                    negative_weights=True)

Quoting Tiago de Paula Peixoto <tiago(a)skewed.de>:

path = []
for item in pairs:
    vpath, epath = gt.shortest_path(weighted_graph,
                                    weighted_graph.vertex(item[0]),
                                    weighted_graph.vertex(item[1]),
                                    weights=weights,
                                    negative_weights=True)
    temp_path = [int(v) for v in vpath]
    path.append(temp_path)

For unweighted graphs the algorithm is BFS, which is O(V + E), and for
weighted graphs it's Dijkstra, which is O((V + E) log V) with a binary heap.
However, if negative weights are used, the algorithm is Bellman-Ford, which
is much slower, at O(V * E).
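A back-of-the-envelope comparison for the sizes above (V ≈ 65k, E ≈ 26M) shows why the gap is
so large; this ignores all constant factors and only compares worst-case operation counts per
query:

```python
V = 65_000       # vertices
E = 26_000_000   # edges

bfs_ops = V + E  # per query, unweighted case (BFS)
bf_ops = V * E   # per query, negative weights (Bellman-Ford worst case)

ratio = bf_ops / bfs_ops
print(f"Bellman-Ford does roughly {ratio:,.0f}x more work per query")
```

Even a small fraction of that ratio is enough to turn minutes into days, so a jump from 9
minutes to 52 hours for the negative-weight case is in line with the asymptotics.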