"Segmentation fault (core dumped)" and "Aborted (core dumped)"

Hello,

I have been running a piece of code from the cookbook on a dataset of mine.
It worked fine when I last ran it a year ago, but now I am getting error
messages. The first one is "Segmentation fault (core dumped)", which I think
happens during the call to "state = gt.minimize_nested_blockmodel_dl(g,
deg_corr=False, overlap=True)". The other one is

terminate called after throwing an instance of 'std::length_error'
  what(): vector::_M_default_append
Aborted (core dumped)

which seems to happen during "gt.mcmc_equilibrate(state, force_niter=20000,
mcmc_args=dict(niter=10), callback=collect_marginals)". I am running the
latest apt-get version of graph-tool with Python 2 on Ubuntu 16.04. I realise
this may not be worth much without the dataset (which I unfortunately can't
share), but I thought I would check whether anybody has observed a similar
issue; since the code has worked in the past, my theory is that the problem
might be linked to a version update. I will see if I can get a dataset for
an MWE in the meantime.
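
In case it helps, this is roughly how I am trying to localize the crash: the
faulthandler module prints the Python-level stack when the interpreter
receives a fatal signal such as SIGSEGV or SIGABRT. (This is only a sketch:
faulthandler is part of the standard library on Python 3, but on Python 2 it
has to be installed separately, e.g. via pip.)

import faulthandler
faulthandler.enable()  # dump the Python traceback on SIGSEGV, SIGABRT, etc.

import graph_tool.all as gt

g = gt.collection.data["lesmis"]

# The first crash appears to happen inside this call; with faulthandler
# enabled, the dump should show the last Python frame that was reached.
state = gt.minimize_nested_blockmodel_dl(g, deg_corr=False, overlap=True)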

The code I was running was:

import graph_tool.all as gt
import numpy as np
import cPickle as pickle
import timeit

g = gt.load_graph('graph_no_multi_1930.gt')

with open('model_selection_results_1930.dat','a') as output:
    deg_corr = False
    overlap = True
    nL = 10
    
    # Initialize the Markov chain from the "ground state"
    state = gt.minimize_nested_blockmodel_dl(g, deg_corr=deg_corr,
                                             overlap=overlap)
    dl = state.entropy()
    temp = "Description length: "
    temp+= str(dl)
    temp+='\n'
    output.write(temp)
    output.flush()
    print 'minimised state'
    bs = state.get_bs()  # Get hierarchical partition.
    bs += [np.zeros(1)] * (nL - len(bs))  # Augment it to L = 10 with
                                          # single-group levels.

    state = state.copy(bs=bs, sampling=True)

    dls = [] # description length history
    vm = [None] * len(state.get_levels()) # vertex marginals
    em = None # edge marginals

    def collect_marginals(s):
        global vm, em
        levels = s.get_levels()
        vm = [sl.collect_vertex_marginals(vm[l])
              for l, sl in enumerate(levels)]
        em = levels[0].collect_edge_marginals(em)
        dls.append(s.entropy())

    # Now we collect the marginal distributions for exactly 200,000 sweeps
    print 'equilibrating'
    start=timeit.default_timer()
    gt.mcmc_equilibrate(state, force_niter=20000, mcmc_args=dict(niter=10),
                        callback=collect_marginals)
    duration=timeit.default_timer()-start
    print 'duration for equilibrating: ', duration

    S_mf = [gt.mf_entropy(sl.g, vm[l])
            for l, sl in enumerate(state.get_levels())]
    S_bethe = gt.bethe_entropy(g, em)[0]
    L = -np.mean(dls)  # negative average description length

    # Evidence estimates as in the cookbook: the mean-field version sums the
    # marginal entropies of all levels; the Bethe version replaces the
    # first-level term with the edge-marginal (Bethe) entropy.
    val1 = L + sum(S_mf)
    val2 = L + S_bethe + sum(S_mf[1:])
    temp = "Model evidence for nested blockmodel, deg_corr = "
    temp+=str(deg_corr)+', overlap = '+str(overlap)
    temp+=':'+str(val1)+"(mean field),"
    temp+=str(val2)+"(Bethe)"+'\n'
    output.write(temp)
    output.flush()
    with open('raw_results.dat','w') as f:
        f.write('DL: ')
        f.write(str(L))
        f.write('\n')
        f.write('S_mf: ')
        f.write(str(sum(S_mf)))
        f.write('\n')
        f.write('S_bethe: ')
        f.write(str(S_bethe))
        f.write('\n')
        f.write('S_bethe + sum(S_mf[1:]): ')
        f.write(str(S_bethe + sum(S_mf[1:])))

Best wishes,

Philipp

Update: as far as I can tell, the segmentation fault also arises when I use
the lesmis graph from the built-in data collection, which hopefully makes
the issue easier for others to reproduce:

import graph_tool.all as gt
import numpy as np
import cPickle as pickle
import timeit

#g = gt.load_graph('graph_no_multi_1930.gt')
g = gt.collection.data["lesmis"]

#with open('model_selection_results_1930.dat','a') as output:
with open('model_selection_results_les_mis.dat','a') as output:
    deg_corr = False
    overlap = True
    nL = 10
    
    # Initialize the Markov chain from the "ground state"
    state = gt.minimize_nested_blockmodel_dl(g, deg_corr=deg_corr,
                                             overlap=overlap)
    dl = state.entropy()
    temp = "Description length: "
    temp+= str(dl)
    temp+='\n'
    output.write(temp)
    output.flush()
    print 'minimised state'
    bs = state.get_bs()  # Get hierarchical partition.
    bs += [np.zeros(1)] * (nL - len(bs))  # Augment it to L = 10 with
                                          # single-group levels.

    state = state.copy(bs=bs, sampling=True)

    dls = [] # description length history
    vm = [None] * len(state.get_levels()) # vertex marginals
    em = None # edge marginals

    def collect_marginals(s):
        global vm, em
        levels = s.get_levels()
        vm = [sl.collect_vertex_marginals(vm[l])
              for l, sl in enumerate(levels)]
        em = levels[0].collect_edge_marginals(em)
        dls.append(s.entropy())

    # Now we collect the marginal distributions for exactly 200,000 sweeps
    print 'equilibrating'
    start=timeit.default_timer()
    gt.mcmc_equilibrate(state, force_niter=20000, mcmc_args=dict(niter=10),
                        callback=collect_marginals)
    duration=timeit.default_timer()-start
    print 'duration for equilibrating: ', duration

    S_mf = [gt.mf_entropy(sl.g, vm[l])
            for l, sl in enumerate(state.get_levels())]
    S_bethe = gt.bethe_entropy(g, em)[0]
    L = -np.mean(dls)
    
    val1 = L+sum(S_mf)
    val2 = L + S_bethe + sum(S_mf[1:])
    temp = "Model evidence for nested blockmodel, deg_corr = "
    temp+=str(deg_corr)+', overlap = '+str(overlap)
    temp+=':'+str(val1)+"(mean field),"
    temp+=str(val2)+"(Bethe)"+'\n'
    output.write(temp)
    output.flush()
    with open('raw_results.dat','w') as f:
        f.write('DL: ')
        f.write(str(L))
        f.write('\n')
        f.write('S_mf: ')
        f.write(str(sum(S_mf)))
        f.write('\n')
        f.write('S_bethe: ')
        f.write(str(S_bethe))
        f.write('\n')
        f.write('S_bethe + sum(S_mf[1:]): ')
        f.write(str(S_bethe + sum(S_mf[1:])))

This seems to run fine with the current git version. Can you verify?

Best,
Tiago

I will try, certainly. My system has changed a bit since my last compile,
though, so this may well be a lengthy process. Are you able to push the
current git version to the apt repository?
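
In the meantime, to make sure that any git build I compile is actually the
copy Python imports (rather than the packaged one), I will check along these
lines:

import graph_tool

print(graph_tool.__version__)  # version string of whichever copy was imported
graph_tool.show_config()       # build configuration details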

No, you would have to wait until the next release.