@George3d6
Created August 14, 2018 17:30
using Knet # provides mat, dropout and sigm used below

leakyrelu(x; α=Float32(0.2)) = max(0, x) + α*min(0, x) # LeakyReLU activation
# A generic MLP forward prop function, original code: https://github.com/ekinakyurek/GAN-70-Lines-of-Julia/blob/66a60a6ea4532841ee647f08759ae9b1ace0c892/gan.jl#L6
function forward_prop(W, X; dropout_p=0.0)
    for i = 1:2:length(W)
        X = W[i]*dropout(mat(X), dropout_p) .+ W[i+1] # mat(X) flattens X to a 2-D matrix
        i < length(W)-1 && (X = leakyrelu.(X))        # LeakyReLU on all hidden layers
    end
    sigm.(X) # sigmoid on the output layer
end
# Forward prop for the discriminator and generator respectively
D(w,x;dropout_p=0.0) = forward_prop(w,x;dropout_p=dropout_p) # Discriminator
G(w,z;dropout_p=0.0) = forward_prop(w,z;dropout_p=dropout_p) # Generator
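To illustrate the weight layout the loop above expects, here is a minimal sketch in plain Julia with no Knet dependency: `W` alternates weight matrices and bias vectors `[W1, b1, W2, b2, ...]`, hidden layers use LeakyReLU, and the output goes through a sigmoid. Dropout and `mat` are dropped for simplicity, and the names `forward`, `sigmoid`, and the toy 2-3-1 network are this sketch's own, not part of the gist.

```julia
leakyrelu(x; α=0.2f0) = max(0, x) + α*min(0, x) # negative inputs are scaled by α
sigmoid(x) = 1 / (1 + exp(-x))

# Same structure as forward_prop above, minus dropout: pairs of (weights, bias).
function forward(W, X)
    for i = 1:2:length(W)
        X = W[i]*X .+ W[i+1]
        i < length(W)-1 && (X = leakyrelu.(X)) # skip the activation on the last pair
    end
    sigmoid.(X)
end

# A toy 2-3-1 network applied to one input vector; the output lands in (0, 1).
W = [randn(Float32, 3, 2), zeros(Float32, 3), randn(Float32, 1, 3), zeros(Float32, 1)]
y = forward(W, Float32[0.5, -1.0])
```

Because the discriminator `D` and generator `G` only differ in the weights they are given, a single forward function like this serves both roles.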