'''
Non-parametric computation of entropy and mutual-information

Adapted by G Varoquaux for code created by R Brette, itself
from several papers (see in the code).

This code is maintained at https://github.com/mutualinfo/mutual_info
Please download the latest code there, to have improvements and
bug fixes.
'''
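'''
A minimal sketch, for illustration only, of the kind of k-nearest-neighbour
(Kozachenko-Leonenko style) entropy estimator that non-parametric entropy /
mutual-information code of this sort is usually built on; it is not claimed
to be the exact implementation maintained in the repository above.
'''
import numpy as np
from scipy.special import gamma, psi
from sklearn.neighbors import NearestNeighbors

def knn_entropy(X, k=3):
    # X: (n_samples, n_dims) float array. Returns a differential-entropy
    # estimate (in nats) from each point's distance to its k-th nearest
    # neighbour.
    n, d = X.shape
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)  # +1: each query point
    distances, _ = nn.kneighbors(X)                  # is its own 0-th neighbour
    r = distances[:, -1]                             # distance to k-th neighbour
    volume_unit_ball = np.pi ** (0.5 * d) / gamma(0.5 * d + 1)
    return (d * np.mean(np.log(r + np.finfo(float).eps))
            + np.log(volume_unit_ball) + psi(n) - psi(k))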
# import statements
import numpy
import re

'''
Tokenize each of the sentences, for example
Input : "John likes to watch movies. Mary likes movies too"
Output : "John","likes","to","watch","movies","Mary","likes","movies","too"
'''
def tokenize(sentences):
    # Minimal completion of the truncated body: assumes `sentences` is an
    # iterable of sentence strings and returns the word tokens in order.
    words = []
    for sentence in sentences:
        words.extend(re.findall(r"[A-Za-z]+", sentence))
    return words
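'''
Illustrative usage of tokenize() on the docstring example above; splitting
the input string on ". " to obtain the sentence list is an assumption about
how the caller is expected to prepare the input.
'''
example = "John likes to watch movies. Mary likes movies too"
print(tokenize(example.split(". ")))
# -> ['John', 'likes', 'to', 'watch', 'movies', 'Mary', 'likes', 'movies', 'too']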
#!/usr/bin/env python
# coding: utf-8
from nltk.corpus import stopwords
from nltk.cluster.util import cosine_distance
import numpy as np
import networkx as nx

def read_article(file_name):
    # Minimal completion of the truncated body: read the file, split it into
    # sentences on ". ", and return each sentence as a list of words.
    file = open(file_name, "r")
    filedata = file.read()
    file.close()
    return [sentence.split() for sentence in filedata.split(". ") if sentence]
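'''
The imports above (stopwords, cosine_distance, networkx) suggest the usual
TextRank-style extractive summarizer. The sketch below shows how those pieces
typically fit together; it is an assumption about the rest of the script, not
the original author's code, and the helper names (sentence_similarity,
build_similarity_matrix, generate_summary) are hypothetical.
'''
def sentence_similarity(sent1, sent2, stop_words=None):
    # Bag-of-words vectors over the two sentences' shared vocabulary,
    # scored as 1 - cosine distance (i.e. cosine similarity).
    if stop_words is None:
        stop_words = []
    sent1 = [w.lower() for w in sent1 if w.lower() not in stop_words]
    sent2 = [w.lower() for w in sent2 if w.lower() not in stop_words]
    all_words = list(set(sent1 + sent2))
    vector1 = [sent1.count(w) for w in all_words]
    vector2 = [sent2.count(w) for w in all_words]
    return 1 - cosine_distance(vector1, vector2)

def build_similarity_matrix(sentences, stop_words):
    # Pairwise sentence-to-sentence similarity matrix.
    matrix = np.zeros((len(sentences), len(sentences)))
    for i in range(len(sentences)):
        for j in range(len(sentences)):
            if i != j:
                matrix[i][j] = sentence_similarity(sentences[i], sentences[j], stop_words)
    return matrix

def generate_summary(file_name, top_n=5):
    # Rank sentences with PageRank over the similarity graph and keep the
    # top_n highest-scoring ones.
    stop_words = stopwords.words("english")
    sentences = read_article(file_name)
    similarity_matrix = build_similarity_matrix(sentences, stop_words)
    graph = nx.from_numpy_array(similarity_matrix)
    scores = nx.pagerank(graph)
    ranked = sorted(((scores[i], s) for i, s in enumerate(sentences)), reverse=True)
    return ". ".join(" ".join(ranked[i][1]) for i in range(min(top_n, len(ranked))))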