@radekk
Created June 7, 2017 18:16
Calculating Shannon's entropy with JavaScript
/**
 * Calculate Shannon's entropy (in bits per character) for a string
 */
module.exports = (str) => {
  // Count occurrences of each character
  const set = {};
  str.split('').forEach(
    c => (set[c] ? set[c]++ : (set[c] = 1))
  );

  // Sum -p * log2(p) over all distinct characters
  return Object.keys(set).reduce((acc, c) => {
    const p = set[c] / str.length;
    return acc - (p * (Math.log(p) / Math.log(2)));
  }, 0);
};
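
A quick usage sketch (assuming the snippet above is saved locally as entropy.js; the filename is not part of the gist):

const entropy = require('./entropy');

console.log(entropy('aaaa')); // 0: a single repeated symbol carries no information
console.log(entropy('aabb')); // 1: two equally likely symbols give 1 bit per character
console.log(entropy('abcd')); // 2: four equally likely symbols give 2 bits per character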
@mcnemesis

Hmm, I'm not really an expert in information theory, but I wonder: can this seemingly simple implementation be trusted as legit? In my case, I want to use it to compute the entropy of each string in a list and pick the one with potentially the most information. I'm assuming that the higher the entropy, the more information a string carries; is that correct?


ghost commented Jul 13, 2018

@mcnemesis Yes.

It is a base-2 Shannon entropy calculation: a simple frequency count for each character, with each count converted to a probability. For byte-oriented input it gives a number between 0 and 8 bits per character (more generally, between 0 and log2 of the number of distinct characters).
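
As a quick sanity check of that upper bound (a sketch, again assuming the gist's module is saved as entropy.js), a string containing every one-byte character exactly once hits the 8-bit ceiling:

const entropy = require('./entropy');

// 256 distinct characters, each occurring once: p = 1/256 for every symbol,
// so the entropy is log2(256) = 8 bits per character.
const allBytes = Array.from({ length: 256 }, (_, i) => String.fromCharCode(i)).join('');
console.log(entropy(allBytes)); // 8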

Here is an alternative that does not use the => syntax:
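
A minimal sketch of that ES5-style version (same logic as the gist, just without arrow functions):

module.exports = function (str) {
  var counts = {};
  var i;

  // Count occurrences of each character
  for (i = 0; i < str.length; i++) {
    counts[str[i]] = (counts[str[i]] || 0) + 1;
  }

  // Sum -p * log2(p) over all distinct characters
  return Object.keys(counts).reduce(function (acc, c) {
    var p = counts[c] / str.length;
    return acc - (p * (Math.log(p) / Math.log(2)));
  }, 0);
};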

@jaysonmulwa

Nice. We could modernize it with the ES6 Set object:

const set = new Set();
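
One caveat: a Set only records which characters appear, not how often, so for the frequency counts a Map is the closer fit. A minimal sketch of that variant (not part of the original gist):

module.exports = (str) => {
  const counts = new Map();

  // Count occurrences of each character
  for (const c of str) {
    counts.set(c, (counts.get(c) || 0) + 1);
  }

  // Sum -p * log2(p) over all distinct characters
  let entropy = 0;
  for (const count of counts.values()) {
    const p = count / str.length;
    entropy -= p * Math.log2(p);
  }
  return entropy;
};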
