@parajain
Created October 26, 2018 09:44
NumPy log-normalization and log-softmax implementation
import numpy as np

def log_softmax(x):
    # Subtract the max before exponentiating for numerical stability,
    # then compute log-softmax directly as (x - max) - log(sum(exp(x - max))).
    # This avoids the underflow in log(exp(shifted) / sum), where a tiny
    # ratio can round to zero before the log is taken.
    x = np.asarray(x)
    shifted = x - np.max(x)
    return shifted - np.log(np.sum(np.exp(shifted)))

def lognormalize(x):
    # logaddexp.reduce computes log(sum(exp(x))) stably; exponentiating
    # the shifted values yields the normalized probabilities (the softmax of x).
    a = np.logaddexp.reduce(x)
    return np.exp(x - a)

scores = [3.0, 1.0, 0.2]
sm = log_softmax(scores)
print(sm)
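As a quick sanity check (a minimal sketch, not part of the original gist), exponentiating the log-softmax should recover exactly the normalized probabilities that `lognormalize` produces, and those probabilities should sum to 1:

```python
import numpy as np

def log_softmax(x):
    # Stable log-softmax: shift by the max, take log-sum-exp once.
    x = np.asarray(x)
    shifted = x - np.max(x)
    return shifted - np.log(np.sum(np.exp(shifted)))

def lognormalize(x):
    # Softmax via a stable log-sum-exp (np.logaddexp.reduce).
    return np.exp(x - np.logaddexp.reduce(x))

scores = [3.0, 1.0, 0.2]
log_probs = log_softmax(scores)
probs = lognormalize(scores)

# The two routes agree: exp(log_softmax) is the softmax distribution.
assert np.allclose(np.exp(log_probs), probs)
# A normalized distribution sums to 1.
assert np.isclose(probs.sum(), 1.0)
print(probs)  # probs ≈ [0.836, 0.113, 0.051]
```

Both functions reduce over the whole array; for batched 2-D inputs you would pass an `axis` argument to `np.max`, `np.sum`, and `np.logaddexp.reduce` and keep the reduced dimension with `keepdims=True`.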