@quaertym
Created February 15, 2017 22:44
Replication of Knet.jl issue #74
using Knet

# Softmax built on Knet's logp (log softmax): exp(logp(x, axis)) == softmax(x, axis).
function softmax(x, axis=0)
    return exp(logp(x, axis))
end

# Loss that overwrites `out` on each iteration with the softmax of one weight
# vector (indices 1:3 of wbindings), so the returned value is the softmax of
# the last weight vector -- an array, not a scalar.
function loss(wbindings)
    out = zeros()
    for arg in 2:4
        progargs = softmax(wbindings[arg-1], 1)
        out = progargs
    end
    return out
end

w = Any[Float32[0.562734, -0.33637],
        Float32[-1.21529, 1.04808, -0.789922],
        Float32[-0.332284, 0.325839, 0.0651071, -0.618155]]

lossgradient = grad(loss)
lossgradient(w)
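
For contrast, a minimal sketch of the pattern grad is meant for, assuming Knet's convention that the differentiated function returns a scalar; the sum reduction and the scalarloss name are additions for illustration, not part of the original reproduction:

# Minimal sketch (assumption: grad differentiates a scalar-valued loss
# with respect to its first argument):
scalarloss(w) = sum(softmax(w[3], 1))  # reduce the softmax output to a scalar
g = grad(scalarloss)
g(w)  # returns gradients with the same nested structure as w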