@f0nzie
Created September 7, 2017 00:24
softmax
import numpy as np

def softmax(x):
    """Compute the softmax of each row of the input x.

    Works for a row vector of shape (1, n) and for matrices of shape (n, m).

    Argument:
    x -- a numpy array of shape (n, m)

    Returns:
    s -- a numpy array equal to the softmax of x, of shape (n, m)
    """
    # Apply exp() element-wise to x.
    x_exp = np.exp(x)
    # Sum each row of x_exp; keepdims=True keeps the result as a column of shape (n, 1).
    x_sum = np.sum(x_exp, axis=1, keepdims=True)
    # Divide x_exp by x_sum; numpy broadcasting divides every row by its own row sum.
    s = x_exp / x_sum
    return s
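
A quick sanity check (a minimal sketch; the input values below are arbitrary and chosen only for illustration): apply softmax to a small 2 x 5 matrix and confirm that every entry is positive and each row sums to 1.

import numpy as np

x = np.array([[9., 2., 5., 0., 0.],
              [7., 5., 0., 0., 0.]])
s = softmax(x)
print(s)               # all entries lie in (0, 1)
print(s.sum(axis=1))   # each row sums to 1.0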