Jeremy Jordan (jeremyjordan)
jeremyjordan / step_decay_schedule.py
Created March 2, 2018 01:24
Example implementation of LearningRateScheduler with a step decay schedule
import numpy as np
from keras.callbacks import LearningRateScheduler

def step_decay_schedule(initial_lr=1e-3, decay_factor=0.75, step_size=10):
    '''
    Wrapper function to create a LearningRateScheduler with step decay schedule.
    '''
    def schedule(epoch):
        return initial_lr * (decay_factor ** np.floor(epoch / step_size))

    return LearningRateScheduler(schedule)
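A minimal usage sketch (the model, data, and epoch count are placeholders, not part of the gist): the wrapper returns a ready-to-use Keras callback that drops the learning rate by decay_factor every step_size epochs.

lr_scheduler = step_decay_schedule(initial_lr=1e-3, decay_factor=0.75, step_size=10)
model.fit(x_train, y_train, epochs=50, callbacks=[lr_scheduler])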
jeremyjordan / lr_finder.py
Last active March 3, 2022 08:46
Keras Callback for finding the optimal range of learning rates
import matplotlib.pyplot as plt
import keras.backend as K
from keras.callbacks import Callback

class LRFinder(Callback):
    '''
    A simple callback for finding the optimal learning rate range for your model + dataset.
    '''
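The preview above is truncated mid-class. As a rough sketch of the underlying technique, the LR range test (exponentially ramp the learning rate over one short training run while recording the per-batch loss), here is a minimal standalone version; the class name, arguments, and defaults are illustrative assumptions, not the gist's code.

import keras.backend as K
from keras.callbacks import Callback

class SimpleLRFinder(Callback):
    '''Ramp the LR from min_lr to max_lr over `steps` batches, recording the loss.'''
    def __init__(self, min_lr=1e-7, max_lr=1e-1, steps=100):
        super().__init__()
        self.min_lr = min_lr
        self.mult = (max_lr / min_lr) ** (1.0 / steps)  # per-batch multiplier
        self.lrs, self.losses = [], []

    def on_train_begin(self, logs=None):
        K.set_value(self.model.optimizer.lr, self.min_lr)

    def on_batch_end(self, batch, logs=None):
        lr = float(K.get_value(self.model.optimizer.lr))
        self.lrs.append(lr)
        self.losses.append(logs['loss'])
        K.set_value(self.model.optimizer.lr, lr * self.mult)  # exponential ramp

Plotting losses against lrs on a log-scaled x-axis reveals the band where the loss falls fastest, which is the useful learning rate range.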
jeremyjordan / button_example.md
Last active September 19, 2022 03:34
Streamlit button example
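The gist itself is a markdown write-up; the usual point of such examples is that st.button returns True only on the single script rerun triggered by the click, so any state that should outlive that rerun must live in st.session_state. A minimal sketch of that pattern (my own illustrative code, not necessarily the gist's):

import streamlit as st

if 'button_clicked' not in st.session_state:
    st.session_state.button_clicked = False  # persists across reruns

if st.button('Click me'):
    st.session_state.button_clicked = True

if st.session_state.button_clicked:
    st.write('The button has been clicked at least once.')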
jeremyjordan / create_gif.sh
Created May 13, 2023 14:56
Create GIF from images
# Create a directory with images named 1.jpg, 2.jpg, 3.jpg, etc.

# Generate a custom color palette from the frames; a palette tuned to the
# source images gives better color fidelity than ffmpeg's default GIF palette
ffmpeg -framerate 0.5 -i %d.jpg \
    -vf "format=rgba,fps=0.5,palettegen=stats_mode=diff" \
    -y palette.png

# Create the gif using that palette (output filename is an assumed placeholder)
ffmpeg -framerate 0.5 -i %d.jpg -i palette.png \
    -lavfi "format=rgba,fps=0.5,paletteuse=dither=none" \
    -y output.gif
jeremyjordan / pages.py
Last active July 22, 2023 13:13
Streamlit Paginated Example
import numpy as np
import streamlit as st
from .streamlit_utils import SessionState

session_state = SessionState.get(page=1)

def main():
    # Render the readme as markdown using st.markdown.
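The SessionState helper imported above was a community workaround from before Streamlit shipped built-in state. A rough modern equivalent of paginated navigation (illustrative code with made-up page names, not the gist's implementation) keeps the current page index in st.session_state:

import streamlit as st

PAGES = ['Readme', 'Results', 'About']  # hypothetical page names

if 'page' not in st.session_state:
    st.session_state.page = 0

prev_col, next_col = st.columns(2)
if prev_col.button('Previous') and st.session_state.page > 0:
    st.session_state.page -= 1
if next_col.button('Next') and st.session_state.page < len(PAGES) - 1:
    st.session_state.page += 1

st.header(PAGES[st.session_state.page])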
jeremyjordan / soft_dice_loss.py
Last active September 20, 2023 12:50
Generic calculation of the soft Dice loss used as the objective function in image segmentation tasks.
import numpy as np

def soft_dice_loss(y_true, y_pred, epsilon=1e-6):
    '''
    Soft dice loss calculation for arbitrary batch size, number of classes, and number of spatial dimensions.
    Assumes the `channels_last` format.
    # Arguments
        y_true: b x X x Y( x Z...) x c One hot encoding of ground truth
        y_pred: b x X x Y( x Z...) x c Network output, must sum to 1 over c channel (such as after softmax)
        epsilon: Used for numerical stability to avoid divide by zero errors
    '''
    # sum over the spatial axes only, so Dice is computed per class per example
    axes = tuple(range(1, len(y_pred.shape) - 1))
    numerator = 2. * np.sum(y_pred * y_true, axes)
    denominator = np.sum(np.square(y_pred) + np.square(y_true), axes)
    return 1 - np.mean((numerator + epsilon) / (denominator + epsilon))  # average over classes and batch
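A quick sanity check of the function (the shapes here are arbitrary illustrative choices): a prediction identical to the one-hot ground truth should give a loss of approximately zero.

import numpy as np

y_true = np.zeros((2, 4, 4, 3))  # batch of 2, 4x4 images, 3 classes, channels_last
y_true[..., 0] = 1.0             # one-hot: every pixel labeled class 0
print(soft_dice_loss(y_true, y_true))  # -> ~0.0 for a perfect prediction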
jeremyjordan / sgdr.py
Last active December 4, 2023 13:41
Keras Callback for implementing Stochastic Gradient Descent with Restarts
from keras.callbacks import Callback
import keras.backend as K
import numpy as np
class SGDRScheduler(Callback):
    '''Cosine annealing learning rate scheduler with periodic restarts.
    # Usage
        ```python
        schedule = SGDRScheduler(min_lr=1e-5,
                                 max_lr=1e-2,
                                 steps_per_epoch=np.ceil(epoch_size/batch_size),
                                 lr_decay=0.9,
                                 cycle_length=5,
                                 mult_factor=1.5)
        model.fit(X_train, Y_train, epochs=100, callbacks=[schedule])
        ```
    '''
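Within each cycle, SGDR moves the learning rate from max_lr down to min_lr along a half cosine, then resets it at the restart. A standalone sketch of that standard formula (the function name and signature are mine, though the parameter names mirror the ones above):

import numpy as np

def cosine_annealing_lr(batch_since_restart, steps_per_cycle, min_lr, max_lr):
    # fraction of the current cycle completed, in [0, 1]
    fraction_to_restart = batch_since_restart / steps_per_cycle
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + np.cos(fraction_to_restart * np.pi))

At fraction 0 this yields max_lr (cos(0) = 1) and at fraction 1 it yields min_lr (cos(pi) = -1), producing the characteristic sawtooth of warm restarts.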