
@stober
Created March 1, 2012 03:05
Softmax in Python
#! /usr/bin/env python
"""
Author: Jeremy M. Stober
Program: SOFTMAX.PY
Date: Wednesday, February 29 2012
Description: Simple softmax function.
"""
import numpy as np

npa = np.array


def softmax(w, t=1.0):
    """Return the softmax distribution over w at temperature t."""
    e = np.exp(npa(w) / t)
    dist = e / np.sum(e)
    return dist


if __name__ == '__main__':
    w = np.array([0.1, 0.2])
    print(softmax(w))
    w = np.array([-0.1, 0.2])
    print(softmax(w))
    w = np.array([0.9, -10])
    print(softmax(w))
@AFAgarap

AFAgarap commented Sep 6, 2017

labels = [0, 0, 0, 0, 0.68, 0.32, 0, 0, 0, 0]

%timeit softmax = np.exp([element for element in labels]) / np.sum(np.exp([element for element in labels]))
The slowest run took 5.03 times longer than the fastest. This could mean that an intermediate result is being cached.
100000 loops, best of 3: 12.2 µs per loop
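One observation on the timed expression: `np.exp` accepts a list directly, so the `[element for element in labels]` comprehension is redundant, and computing the exponentials once avoids doing the work twice. A minimal equivalent sketch:

```python
import numpy as np

labels = [0, 0, 0, 0, 0.68, 0.32, 0, 0, 0, 0]

# np.exp vectorizes over the list; reuse the result instead of
# exponentiating twice as in the original one-liner.
e = np.exp(labels)
softmax = e / np.sum(e)
print(softmax)
```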

@rajans

rajans commented Mar 21, 2018

It can be a simple one-liner:

def softmax(x):
    return np.exp(x) / np.sum(np.exp(x), axis=0)
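One thing to watch with this one-liner: `axis=0` normalizes down each column, so on a 2-D array each *column* sums to 1, which may or may not be what you want for batched inputs. A small sketch illustrating the behavior (the 2-D example array is my own):

```python
import numpy as np


def softmax_cols(x):
    # The one-liner above: sums along axis 0, i.e. down each column.
    return np.exp(x) / np.sum(np.exp(x), axis=0)


x = np.array([[0.1, -0.1, 0.9],
              [0.2, 0.2, -10.0]])
out = softmax_cols(x)
print(out)
print(np.sum(out, axis=0))  # each column sums to 1
```

For row-wise normalization one would instead use `axis=1` with `keepdims=True` in the sum.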
