Hi team.

I'm wondering whether you could help me understand what is happening with your reduced_mutual_information() function, because I found several mismatching outputs from this implementation.

1. RMI is supposed to be a value in [0, 1], so why, in your example, is the output negative when I compare two partitions?

import numpy as np
import graph_tool.all as gt

x = np.random.randint(0, 10, 1000)
y = np.random.randint(0, 10, 1000)

gt.reduced_mutual_information(x, y)
# -0.065562...

2. In your example you create, in effect, two partitions drawn from a random distribution. Isn't that precisely the case where RMI should be zero, or very close to zero?
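For reference, here is a quick pure-numpy check I did (a sketch; mutual_information is my own helper, not part of graph-tool): the plain mutual information of two independent random partitions is small but strictly positive because of finite-size bias, so I would expect a "reduced" correction to land near or just below zero.

```python
import numpy as np

rng = np.random.default_rng(42)

def mutual_information(x, y):
    """Plain mutual information (in nats) between two label arrays."""
    n = len(x)
    # Build the contingency table of joint label counts.
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    c = np.zeros((len(xs), len(ys)))
    np.add.at(c, (x_idx, y_idx), 1)
    pxy = c / n
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(pxy > 0, pxy * np.log(pxy / (px * py)), 0.0)
    return terms.sum()

# Two independent random partitions, same sizes as in your example.
x = rng.integers(0, 10, 1000)
y = rng.integers(0, 10, 1000)
mi = mutual_information(x, y)
# The plain MI here is small but positive (finite-size bias), so a
# correction term subtracted from it could plausibly make it negative.
```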

3. When I use the exact partitions Newman offers in his own code (wine.txt), your function gives

0.7890319931250596

But Newman's function gives

Reduced mutual information M = 1.21946279985 bits per object

Why are these results so different, and how can we relate them?
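To try to reconcile the two numbers myself, I checked whether a simple units mismatch (nats vs. bits) or a constant normalization factor could explain the gap, but neither gave an obvious match. This is the sketch I used (values copied from above; the nats assumption is my guess, not from your docs):

```python
import math

gt_value = 0.7890319931250596  # value returned by gt.reduced_mutual_information
newman_bits = 1.21946279985    # "bits per object" reported by Newman's code

# Check 1: assume the graph-tool value is in nats and convert to bits.
gt_in_bits = gt_value / math.log(2)

# Check 2: assume the difference is only a constant normalization.
ratio = newman_bits / gt_value

print(f"graph-tool value converted nats -> bits: {gt_in_bits:.6f}")
print(f"ratio Newman / graph-tool: {ratio:.6f}")
# Neither number lines up with Newman's 1.21946..., which is why I ask.
```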

4. Finally, what is (or where is) the description of the format in which partitions must be passed to the function?

I mean, I'm confused about how the x (and y) arrays should be arranged. Is each row index the node label? If so, how do I write nodes that belong to several groups at once (overlapping partitions)?
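For context, this is how I am currently arranging the inputs. The index-is-node, value-is-group convention is my own assumption, not something I found in your docs, so please correct me if it is wrong:

```python
import numpy as np

# My current assumption: each partition is a flat array indexed by node,
# whose value is that node's group label, i.e. x[i] == g means
# "node i belongs to group g".
n_nodes = 6
x = np.array([0, 0, 1, 1, 2, 2])  # nodes 0-1 in group 0, 2-3 in group 1, ...
y = np.array([0, 1, 0, 1, 0, 1])  # a different partition of the same nodes

# Both arrays must describe the same set of nodes.
assert len(x) == len(y) == n_nodes

# Under this format each node carries exactly ONE label per array, which is
# why I do not see how to express overlapping group memberships.
```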

Thanks in advance for your answers, and congratulations on creating this tool!

JM