vankesteren / Adamopt.jl
Last active February 25, 2023 05:35
Julia implementation of Adam optimizer
module Adamopt
# This module implements vanilla Adam (https://arxiv.org/abs/1412.6980).

export Adam, step!

# Struct containing all necessary info
mutable struct Adam
    theta::AbstractArray{Float64} # Parameter array
    loss::Function                # Loss function
    grad::Function                # Gradient function