Created September 7, 2017 00:24
softmax
import numpy as np

def softmax(x):
    """Calculates the softmax for each row of the input x.

    Works for a single row vector of shape (1, m) and for matrices of shape (n, m).

    Argument:
    x -- A numpy matrix of shape (n, m)

    Returns:
    s -- A numpy matrix equal to the softmax of x, of shape (n, m)
    """
    # Apply exp() element-wise to x.
    x_exp = np.exp(x)
    # Sum each row of x_exp; keepdims=True preserves shape (n, 1) for broadcasting.
    x_sum = np.sum(x_exp, axis=1, keepdims=True)
    # Divide x_exp by x_sum; broadcasting divides each row by its own sum.
    s = x_exp / x_sum
    return s
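A quick sanity check of the function: every row of the output should sum to 1. The input values below are chosen for illustration only.

```python
import numpy as np

def softmax(x):
    """Row-wise softmax for a matrix of shape (n, m)."""
    x_exp = np.exp(x)
    x_sum = np.sum(x_exp, axis=1, keepdims=True)
    return x_exp / x_sum

# Two rows of raw scores (example values, not from the original gist).
x = np.array([[9.0, 2.0, 5.0, 0.0, 0.0],
              [7.0, 5.0, 0.0, 0.0, 0.0]])
s = softmax(x)
print(s)
print(s.sum(axis=1))  # each row sums to 1
```

Note that `keepdims=True` is what makes the division broadcast correctly: `x_sum` has shape (2, 1), so each row of `x_exp` is divided by that row's sum.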