Segmentation fault in Max Cardinality Matching

Hello everyone,

Unfortunately, I am running into a segmentation fault with max cardinality matching.
My adjacency matrix is rather sparse, but once the number of nodes exceeds roughly 1000-2000, my script crashes. I have also tried the brute_force flag, but it leads to a segmentation fault as well. A graph with fewer vertices and a significantly higher number of edges works fine, with both the regular and the brute-force approach.

I am using Version 2.88 (commit 4c48e049) installed via conda on Linux.

Minimal working example:

import numpy as np
from graph_tool.topology import max_cardinality_matching
import graph_tool.all as gt
from scipy.sparse import csr_matrix

t = 3
distances = np.random.normal(0, 1, [5000, 5000])
distances *= (distances > t)         # keep only entries above the threshold t
distances = np.triu(distances, k=1)  # upper triangle only: no self-loops, no duplicate edges
sparse_distances = csr_matrix(distances)
G = gt.Graph(sparse_distances, directed=False)  # weights land in G.edge_properties["weight"]

print(len(G.get_edges()), "edges", len(G.get_vertices()), "nodes") #16837 edges 5000 nodes
assert sum(1 for e in G.edges() if e.source() == e.target()) == 0 # No self-loops
assert len(set(G.edges())) == G.num_edges() # No parallel edges

matching = max_cardinality_matching(G, weight=G.edge_properties["weight"], minimize=False) # SEGFAULT
print("matching done")
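For what it's worth, the edge count above matches what one would expect analytically, so the graph really is as sparse as claimed: each upper-triangle entry survives the threshold with probability P(N(0,1) > 3) ≈ 0.00135, giving roughly 5000·4999/2 · 0.00135 ≈ 16.9k edges. A quick sanity check (pure NumPy/SciPy, no graph-tool needed):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.stats import norm

rng = np.random.default_rng(0)
t, n = 3, 5000

# Same construction as the MWE above
distances = rng.normal(0, 1, (n, n))
distances *= (distances > t)          # keep only entries above the threshold
distances = np.triu(distances, k=1)   # upper triangle: no self-loops / duplicates
sparse = csr_matrix(distances)

# Expected number of surviving entries: (pairs) * P(N(0,1) > t)
p = norm.sf(t)                        # survival function, ~0.00135
expected = n * (n - 1) / 2 * p        # ~16.9k

print(sparse.nnz, "edges, expected ~", round(expected))
```

So the crash is hitting a genuinely sparse graph with ~17k edges on 5000 nodes, not an accidentally dense one.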

I would greatly appreciate any comments or help on this!
Thank you,
Anna