@taketakeyyy
Last active April 17, 2023 17:32
What does increasing entropy mean in terms of information content?
# coding: utf-8
import math


def entropy(p_list):
    """Return the Shannon entropy (in bits) of a probability distribution."""
    ans = 0
    for p in p_list:
        ans += p * -math.log2(p)
    return ans


p_list = [1.0]
a = entropy(p_list)
print(a)
# 0.0

p_list = [0.5, 0.5]
a = entropy(p_list)
print(a)
# 1.0

p_list = [1/3, 1/3, 1/3]
a = entropy(p_list)
print(a)
# 1.584962500721156

p_list = [0.1, 0.1, 0.1, 0.1, 0.5]
a = entropy(p_list)
print(a)
# 1.828771237954945

p_list = [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]
a = entropy(p_list)
print(a)
# 3.321928094887362

p_list = [0.1, 0.2, 0.3, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.1]
a = entropy(p_list)
print(a)
# 2.946439344671015

p_list = [0.1, 0.2, 0.3, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05]
a = entropy(p_list)
print(a)
# 3.046439344671015

# Create 1000 events, each with probability 1/1000, and compute the entropy
p_list = [1/1000 for _ in range(1000)]
a = entropy(p_list)
print(a)
# 9.965784284662018
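The runs above suggest a general pattern: for a fixed number of outcomes n, entropy is largest when the distribution is uniform, where it equals log2(n), and any skew toward one outcome lowers it. A minimal sketch verifying this with the same `entropy` function (the `skewed` distribution here is an illustrative example, not from the original gist):

```python
import math


def entropy(p_list):
    """Return the Shannon entropy (in bits) of a probability distribution."""
    return sum(p * -math.log2(p) for p in p_list)


n = 10
uniform = [1/n] * n                        # all outcomes equally likely
skewed = [0.5] + [0.5/(n - 1)] * (n - 1)   # one outcome takes half the mass

# Uniform distribution attains the maximum, log2(n) bits.
assert math.isclose(entropy(uniform), math.log2(n))

# Concentrating probability on one outcome strictly reduces entropy.
assert entropy(skewed) < entropy(uniform)

print(entropy(uniform), entropy(skewed))
```

This matches the script's last example: spreading the mass over 1000 equal outcomes gives log2(1000) ≈ 9.97 bits, the maximum possible for 1000 events.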