Computation cost of creating a graph

Sorry for asking so many questions about efficiency today, but it seems
today is not my lucky day. I'm trying to build a graph by reading from a
text file. The graph has 1034 vertices and 53498 edges, and creating it
takes 6 seconds, so I was wondering whether it truly takes that long or
whether something in my way of creating the graph is inefficient. I have a
text file that contains the edges in the format: vertex_label_x vertex_label_y.

What I'm doing, basically, is maintaining a dictionary that maps vertex
labels in the text file to graph-tool vertex indices. As I read the file,
I check whether a label is already in the dictionary; if so, I retrieve
its index from the dictionary. Otherwise I create a new vertex and store
its index under the corresponding label in the dictionary. Then I create
the edge using the indices of the source and target. I was wondering
whether this is the wrong way to do it.
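The dictionary approach described above can be sketched in plain Python as follows (the actual code is behind the pastie link, so this is an illustrative stand-in; the function name and the final graph-tool calls are assumptions, not the original code):

```python
def build_edge_list(lines):
    """Map string vertex labels to integer indices while reading edges.

    Mirrors the approach described above: a dict from label to index,
    assigning a new index the first time a label is seen. In the real
    script, each (src, dst) pair would then be passed to graph-tool's
    add_edge; here we just collect the index pairs.
    """
    label_to_index = {}
    edges = []
    for line in lines:
        src_label, dst_label = line.split()
        # setdefault assigns a fresh index only on first sight of a label,
        # replacing the explicit "if label in dict" check
        src = label_to_index.setdefault(src_label, len(label_to_index))
        dst = label_to_index.setdefault(dst_label, len(label_to_index))
        edges.append((src, dst))
    return label_to_index, edges

# Example: three edges over three labelled vertices
labels, edges = build_edge_list(["a b", "b c", "a c"])
```

The dictionary itself is cheap; the usual cost in this pattern is calling the graph library's add-vertex/add-edge methods one at a time from a Python loop. If your graph-tool version provides Graph.add_edge_list (with hashed=True to handle string labels directly), passing the whole edge list in one call is typically much faster than per-edge calls.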

This is my code: http://pastie.org/private/n9j7geizaosafmlimujca

Many thanks in advance!

Best.


Hello,
if your primary concern is speed/code efficiency, I strongly suggest you
use a profiler, either one of those listed at
http://docs.python.org/2/library/profile.html or something else. I
personally like kernprof; you can find out more here:
http://www.huyng.com/posts/python-performance-analysis/
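For the standard-library option, a minimal cProfile session looks like the sketch below; it wraps the graph-construction call and prints the functions sorted by cumulative time, which would show whether the time goes into the dictionary lookups or into the per-edge graph calls. The build_graph body here is a hypothetical stand-in for the actual loading code:

```python
import cProfile
import io
import pstats

def build_graph():
    # Stand-in for the graph-construction code being profiled
    label_to_index = {}
    for line in ["a b", "b c"] * 1000:
        u, v = line.split()
        label_to_index.setdefault(u, len(label_to_index))
        label_to_index.setdefault(v, len(label_to_index))

# Profile one call and capture a report of the top functions
profiler = cProfile.Profile()
profiler.enable()
build_graph()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)
```

kernprof's line_profiler gives the same information per line rather than per function, which is often more direct for a single hot loop like this one.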

Best,
Giuseppe

2014-03-21 16:49 GMT+01:00 Hang Mang <gucko.gucko(a)googlemail.com>:


Are there ready-made functions in graph-tool to help with reading edge
lists? I tried loading a 4000-vertex graph; it took 60 seconds and is
consuming 600MB. Is that normal? I'm not sure whether my way of loading
the graph is the correct and common one that everyone uses. I hope
someone can just give it a look. I would be really thankful!

Best,
Mang
