Fair coin: p(head) = 0.5
Single toss: 2 equally likely outcomes
Entropy = -0.5*log2(0.5) - 0.5*log2(0.5) = 1 bit
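A minimal sketch of this calculation in Python (the `entropy` helper is illustrative, not from any particular library):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)) over the distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
```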
N tosses: 2^N equally likely outcomes
Entropy = -2^N * (1/2^N) * log2(1/2^N) = N bits
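The same helper checks the N-toss case; N = 5 here is an arbitrary illustrative choice:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)) over the distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 5  # illustrative choice
uniform = [1 / 2**N] * 2**N  # 2^N equally likely toss sequences
print(entropy(uniform))  # 5.0 bits, i.e. N
```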
Consider a fully biased coin:
p(head) = 1, p(tails) = 0
Entropy = -1*log2(1) - 0*log2(0) = 0 (by convention 0*log(0) = 0, since p*log(p) -> 0 as p -> 0)
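In code, skipping zero-probability terms implements the 0*log(0) = 0 convention directly (same illustrative helper as above):

```python
import math

def entropy(probs):
    # Terms with p == 0 are skipped, implementing the 0*log(0) = 0 convention.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1.0, 0.0]))  # zero bits: the outcome is certain
```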