Maximum-Matching using graph-tool


I am working on a project where I need to compute maximum cardinality matchings. My graphs are complete and weighted. However, even for moderately large instances, I get a segmentation fault. My complete graphs have 400-5000 vertices (I would like to run the algorithm on much larger graphs too), and the algorithm works fine for smaller instances. It looks like a memory issue, although my machine has more than enough memory (16 GB) for instances of this size.

Further, earlier I wasn't even able to run graphs with 100-200 nodes. I was able to run them after changing the edge weight property type from "double" to "int16_t".
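For a sense of scale, here is a back-of-envelope estimate of the weight storage alone (not graph-tool's actual memory layout, which also stores the graph structure itself and has per-edge overhead beyond the weight):

```python
def complete_graph_edges(n):
    """Number of edges in an undirected complete graph on n vertices."""
    return n * (n - 1) // 2

for n in (400, 5000):
    m = complete_graph_edges(n)
    double_mb = m * 8 / 1e6   # "double" weights: 8 bytes each
    int16_mb = m * 2 / 1e6    # "int16_t" weights: 2 bytes each
    print(f"n={n}: {m} edges, double={double_mb:.1f} MB, int16_t={int16_mb:.1f} MB")
```

For n=5000 this gives about 12.5 million edges, i.e. roughly 100 MB of weights as doubles versus 25 MB as int16_t, so switching the value type cuts the weight storage by a factor of four, though the algorithm's own working memory can still be much larger.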

Please let me know if there is something else I can do to run this code. It's a pretty large, distributed codebase, so it would be difficult for me to post it here for reference.

On the other hand, if you feel that this is an issue with the library itself, please fix it.

Thanks and Regards
Chaitanya Agarwal

Dear Chaitanya,


My original codebase is too convoluted at this point, so it would have been difficult to post it. However, following your suggestion, I tried to write a standalone script that reproduces the problem. On instances of similar sizes, I was not able to reproduce the error, which made me realize the issue must be in my code rather than in the library. After optimizing my code, I no longer get the error.

So thanks a lot for your reply, and forgive me for opening this issue without doing a proper analysis myself.

But on a separate note, can you suggest some general tips for optimizing memory usage with this library? For example, I found the shrink_to_fit() function for edge property maps and graphs highly impactful. Are there any other such functions, tools, or tricks one can use?

Again, it is difficult to say without any particular context. The function shrink_to_fit() should help only after the graph has been heavily modified, not otherwise. So in order to know whether any technique can improve memory usage in your particular circumstances, we would first have to know what those circumstances are...
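The general idea behind shrink_to_fit() can be illustrated with a plain Python list rather than graph-tool itself (this is only an analogy, assuming CPython's usual over-allocation behavior; graph-tool's storage is C++ under the hood, so the actual numbers differ):

```python
import sys

# A list grown by repeated appends keeps spare capacity beyond its
# current length; building an exact-sized copy releases that slack,
# which is the same role shrink_to_fit() plays for a container that
# has been heavily modified.
grown = []
for i in range(100_000):
    grown.append(i)          # append-growth over-allocates extra slots

compact = list(grown)        # exact-sized copy: the "shrink" step

print(sys.getsizeof(grown), sys.getsizeof(compact))
```

The copy typically reports a smaller size than the append-grown original, even though both hold the same elements; as noted above, a shrink step only pays off after heavy modification, since a container built once at its final size has little slack to release.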