A Python demonstration of using "infotheory" to estimate mutual information under various conditions.
###############################################################################
# Mutual Information demo for Python package - infotheory
# https://github.com/madvn/infotheory
#
# Madhavun Candadai
# Dec, 2018
#
# Mutual information should be high for identical variables, slightly lower for
# noisy identical variables, and low for independent random variables
###############################################################################
import numpy as np

import infotheory

## Setup
dims = 4  # total dimensionality of all variables = 2+2 = 4
nreps = 1  # number of shifted binnings over which data is binned and averaged
nbins = [10] * dims  # number of bins along each dimension of the data
mins = [0] * dims  # min value, i.e. left edge of binning, for each dimension
maxs = [1] * dims  # max value, i.e. right edge of binning, for each dimension

## Creating object
it = infotheory.InfoTools(dims, nreps)

## Specify binning
it.set_equal_interval_binning(nbins, mins, maxs)

## Adding data - concatenate data from all variables into one point per sample
for _ in range(10000):
    it.add_data_point(np.random.rand(dims))

## Invoke infotheory tools
varIDs = [0, 0, 1, 1]  # identifies which dimensions belong to which variable
mi = it.mutual_info(varIDs)  # mutual information between two random 2D vars
mi /= np.log2(np.min(nbins))  # normalizing to [0, 1]
print("Mutual information between the two random 2D variables = {}".format(mi))

varIDs = [0, -1, 1, -1]  # dimensions marked anything other than 0 or 1 are ignored
mi = it.mutual_info(varIDs)  # MI between the first dimensions of the two 2D vars
mi /= np.log2(np.min(nbins))  # normalizing to [0, 1]
print("Mutual information between the first dimensions of the two random 2D variables = {}".format(mi))
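The header states that mutual information should be high for identical variables, slightly lower for noisy identical variables, and low for random ones, but the demo above only exercises the random case. As a cross-check that does not require the infotheory package, here is a minimal pure-NumPy sketch of the same three conditions using a plug-in histogram estimator with the same normalization by log2 of the bin count (`mi_hist` is a hypothetical helper written for this sketch, not part of the infotheory API):

```python
import numpy as np

def mi_hist(x, y, nbins=10):
    """Plug-in mutual information estimate (in bits) from a 2D histogram.

    Hypothetical helper for this sketch; bins both variables on [0, 1].
    """
    pxy, _, _ = np.histogram2d(x, y, bins=nbins, range=[[0, 1], [0, 1]])
    pxy /= pxy.sum()                      # joint probability table
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (nbins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, nbins)
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(0)
x = rng.random(10000)

# Same normalization as the infotheory demo: divide by log2(nbins).
norm = np.log2(10)
identical = mi_hist(x, x) / norm                                          # high, close to 1
noisy = mi_hist(x, np.clip(x + 0.05 * rng.standard_normal(10000), 0, 1)) / norm
random_ = mi_hist(x, rng.random(10000)) / norm                            # near 0

print(identical, noisy, random_)
```

Under this estimator the three values should come out in the order the header predicts: identical > noisy > random.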