# Problems in reconstruction with LVBlockState()

Hi Professor Peixoto, I am very sorry to bother you, but I have been trying to use the network reconstruction method with the LV model and I cannot make it work. I would really appreciate any help.

• Graph-tool version: 2.77_1
• Python version: 3.12.5
• Operating system: macOS Sonoma 14.5

Code

```python
import graph_tool.all as gt
import numpy as np

# Generate samples
g = gt.collection.data["karate"].copy()
E = g.num_edges()
N = g.num_vertices()
s = g.new_vp("double", vals=10 + 10 * np.random.random(N))
r = g.new_vp("double", val=20.)
w = g.new_ep("double", vals=np.random.normal(0., .4, E))
for v in g.vertices():
    e = g.add_edge(v, v)  # add self-loop
    w[e] = -1.            # set diagonal to -1
A = gt.adjacency(g, weight=w)  # sparse
A_dense = A.toarray()

# Check linear stability
eigenvalues = np.linalg.eigvals(A_dense)
re_eigenvalues = np.real(eigenvalues)

sigma_ = 10.
ts = np.linspace(0, 1.5, 1000)
state = gt.LVState(g, s=s, r=r, w=w, sigma=sigma_)
ss = []
for t in ts:
    # RK is not suited for stochastic ODEs, use Euler instead
    ret = state.solve_euler(t, dt=0.001)
    ss.append(state.get_state().fa.copy())
ss = np.array(ss).T

# Reconstruction
state = gt.LVBlockState(ss)
for _ in range(20):
    ret = state.mcmc_sweep(niter=10, verbose=True)
    delta = abs(ret[0])  # entropy difference
    print(delta)
```
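For context, the linear-stability check above amounts to verifying that every eigenvalue of the interaction matrix has negative real part. A minimal self-contained sketch of that test (using a hypothetical 2×2 matrix rather than the karate one):

```python
import numpy as np

# Toy interaction matrix with -1 on the diagonal, as in the code above;
# the fixed point is linearly stable iff all eigenvalues have Re < 0.
A_dense = np.array([[-1.0, 0.2],
                    [0.3, -1.0]])
re_eigenvalues = np.real(np.linalg.eigvals(A_dense))
print(re_eigenvalues.max() < 0)  # True: this matrix is stable
```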

The problem is that `mcmc_sweep()` does not return anything. In fact, the output is:

```
theta_mcmc_sweep:
theta_multiflip_mcmc_sweep:
```

and then it gets stuck. However, when I set the stochastic noise to zero, this code:

```python
for _ in range(20):
    ret = state.mcmc_sweep(niter=10, verbose=False)
    delta = abs(ret[0])  # entropy difference
    print(delta)

u = state.get_graph()
print(f"True graph. Number of nodes {g.num_vertices()}, number of edges = {g.num_edges()}")
print(f"Reconstructed graph. Number of nodes {u.num_vertices()}, number of edges = {u.num_edges()}")
w_r = state.get_x()
print(gt.similarity(g, u, w, w_r))
```

is the following:

```
0.03653399749265418
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
0.0
True graph. Number of nodes 34, number of edges= 112
Reconstructed graph. Number of nodes 34, number of edges= 0
1.2037611980179927e-16
```

If you check the values in the `ss` matrix you will find a bunch of `NaN`s… This is due to numerical instability of the Euler integration with the step size you have chosen. Try using something like `dt=1e-5`.
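To illustrate why the step size matters so much (a plain-Python toy, not graph-tool's integrator): forward Euler applied to a stiff linear ODE diverges once `dt` exceeds the stability threshold, while a small `dt` behaves correctly. A quick `np.isnan(ss).any()` on the sampled states catches exactly this failure mode.

```python
def euler(f, x0, dt, n):
    """Forward Euler: x_{k+1} = x_k + dt * f(x_k)."""
    x = x0
    for _ in range(n):
        x = x + dt * f(x)
    return x

f = lambda x: -50.0 * x  # stiff decay; the exact solution goes to 0

# Euler is stable here only for dt < 2/50 = 0.04
print(abs(euler(f, 1.0, 0.1, 100)))   # diverges (~1.6e60)
print(abs(euler(f, 1.0, 1e-3, 100)))  # decays toward 0 (~0.006)
```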

Thanks for replying. With a smaller dt there are no NaN values anymore, but the problem with the reconstruction remains. Assuming everything above stays unchanged (except for the dt modification), the following code:

```python
state = gt.LVBlockState(ss)
for _ in range(10):
    ret = state.mcmc_sweep(niter=10, verbose=False)
    delta = abs(ret[0])  # entropy difference
    print(f"delta = {delta}")

u = state.get_graph()      # reconstructed network
print(f"True graph. Number of nodes {g.num_vertices()}, number of edges = {g.num_edges()}")
print(f"Reconstructed graph. Number of nodes {u.num_vertices()}, number of edges = {u.num_edges()}")
w_r = state.get_x()        # reconstructed weights
print(gt.similarity(g, u, w, w_r))
```

gives the output:

```
delta = 0.009212576853073529
delta = 0.0
delta = 0.0
delta = 0.0
delta = 0.0
delta = 0.0
delta = 0.0
delta = 0.0
delta = 0.0
delta = 0.0
True graph. Number of nodes 34, number of edges= 112
Reconstructed graph. Number of nodes 34, number of edges= 0
0.0
```

And what exactly is the problem here?

Reconstruction from dynamics is not always possible. The accuracy will depend on the amount of data and on the dynamical regime.
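As a rough intuition for why the amount of data matters (a toy least-squares illustration on a linear stochastic system, not the MDL method from the paper, and all names below are made up for the example): the error of a coupling-matrix estimate shrinks as the number of samples grows, so with too few or too noisy samples an empty graph can look like the most parsimonious explanation.

```python
import numpy as np

rng = np.random.default_rng(42)
N, dt, sigma = 5, 0.01, 0.5
# Stable random coupling matrix (strongly negative diagonal)
A = -2.0 * np.eye(N) + 0.3 * rng.normal(size=(N, N))

def simulate(T):
    """Euler-Maruyama samples of dx/dt = A x + noise."""
    X = np.empty((T, N))
    x = rng.normal(size=N)
    for t in range(T):
        X[t] = x
        x = x + dt * (A @ x) + sigma * np.sqrt(dt) * rng.normal(size=N)
    return X

def estimate(X):
    """Least-squares fit of A from finite differences."""
    dX = (X[1:] - X[:-1]) / dt
    M, *_ = np.linalg.lstsq(X[:-1], dX, rcond=None)
    return M.T

for T in (50, 5000):
    err = np.linalg.norm(estimate(simulate(T)) - A) / np.linalg.norm(A)
    print(T, err)  # relative error drops as T grows
```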

Hi Professor, thanks for replying. I read the paper “Network reconstruction via the minimum description length principle” but, unfortunately, I only have a bachelor's degree in physics and my knowledge is not enough to follow all the details of your algorithm. Could you give me some tips on when reconstruction for LV is expected to work? A working example would also be very useful.
Many thanks.