
Jeremy Jordan (jeremyjordan)

jeremyjordan / button_example.md
Last active September 19, 2022 03:34
Streamlit button example
jeremyjordan / pages.py
Last active February 7, 2021 12:15
Streamlit Paginated Example
import numpy as np
import streamlit as st
from .streamlit_utils import SessionState

# SessionState keeps the current page number across Streamlit reruns.
session_state = SessionState.get(page=1)

def main():
    # Render the readme as markdown using st.markdown.
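The preview above stores the current page in SessionState; the pagination arithmetic behind such an app can be sketched in plain Python (these helper names are illustrative, not taken from the gist):

```python
def page_count(n_items, page_size):
    """Number of pages needed to show n_items, page_size at a time."""
    return max(1, -(-n_items // page_size))  # ceiling division, at least one page

def page_slice(items, page, page_size):
    """Return the items belonging to a 1-indexed page."""
    start = (page - 1) * page_size
    return items[start:start + page_size]
```

In the Streamlit app, `session_state.page` would be passed as the `page` argument and incremented or decremented by next/previous buttons, clamped to `[1, page_count(...)]`.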
jeremyjordan / soft_dice_loss.py
Last active November 25, 2021 15:05
Generic calculation of the soft Dice loss used as the objective function in image segmentation tasks.
import numpy as np

def soft_dice_loss(y_true, y_pred, epsilon=1e-6):
    '''
    Soft dice loss calculation for arbitrary batch size, number of classes, and number of spatial dimensions.
    Assumes the `channels_last` format.
    # Arguments
        y_true: b x X x Y( x Z...) x c One hot encoding of ground truth
        y_pred: b x X x Y( x Z...) x c Network output, must sum to 1 over c channel (such as after softmax)
        epsilon: Used for numerical stability to avoid divide by zero errors
    '''
    axes = tuple(range(1, len(y_pred.shape) - 1))  # sum over spatial axes only
    numerator = 2. * np.sum(y_pred * y_true, axes)
    denominator = np.sum(np.square(y_pred) + np.square(y_true), axes)
    return 1 - np.mean((numerator + epsilon) / (denominator + epsilon))
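A quick sanity check of the soft Dice loss described above, restated as a self-contained NumPy sketch so it runs standalone (the axis handling follows the `channels_last` convention from the docstring; this is an illustration, not necessarily byte-for-byte the gist's body). A perfect prediction should give a loss near 0 and a completely wrong one a loss near 1:

```python
import numpy as np

def soft_dice_loss(y_true, y_pred, epsilon=1e-6):
    # Sum over spatial axes only, keeping batch (0) and channel (-1) axes.
    axes = tuple(range(1, y_pred.ndim - 1))
    numerator = 2.0 * np.sum(y_pred * y_true, axis=axes)
    denominator = np.sum(np.square(y_pred) + np.square(y_true), axis=axes)
    return 1 - np.mean((numerator + epsilon) / (denominator + epsilon))

# One-hot ground truth: batch of 1, 4x4 image, 2 classes, all pixels class 0.
y_true = np.zeros((1, 4, 4, 2))
y_true[..., 0] = 1.0

perfect = soft_dice_loss(y_true, y_true.copy())   # prediction == ground truth

y_pred_wrong = np.zeros_like(y_true)
y_pred_wrong[..., 1] = 1.0                        # every pixel assigned the wrong class
disjoint = soft_dice_loss(y_true, y_pred_wrong)
```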
jeremyjordan / sgdr.py
Last active October 1, 2022 14:01
Keras Callback for implementing Stochastic Gradient Descent with Restarts
from keras.callbacks import Callback
import keras.backend as K
import numpy as np
class SGDRScheduler(Callback):
'''Cosine annealing learning rate scheduler with periodic restarts.
# Usage
```python
schedule = SGDRScheduler(min_lr=1e-5,
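At the heart of the scheduler is the cosine annealing formula: within each cycle the learning rate falls from `max_lr` to `min_lr` along a half cosine, then restarts. A minimal standalone sketch (the function name is mine; the formula is the standard SGDR schedule, not the callback's API):

```python
import numpy as np

def cosine_annealed_lr(min_lr, max_lr, fraction_to_restart):
    """Learning rate at a given fraction (0..1) of the way through the current cycle."""
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + np.cos(fraction_to_restart * np.pi))

start = cosine_annealed_lr(1e-5, 1e-2, 0.0)  # cycle start: lr at max_lr
end = cosine_annealed_lr(1e-5, 1e-2, 1.0)    # cycle end: lr back at min_lr
```

On each restart the gist's callback also scales `max_lr` by `lr_decay` and the cycle length by `mult_factor`, so later cycles are longer and peak lower.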
jeremyjordan / step_decay_schedule.py
Created March 2, 2018 01:24
Example implementation of LearningRateScheduler with a step decay schedule
import numpy as np
from keras.callbacks import LearningRateScheduler

def step_decay_schedule(initial_lr=1e-3, decay_factor=0.75, step_size=10):
    '''
    Wrapper function to create a LearningRateScheduler with step decay schedule.
    '''
    def schedule(epoch):
        # Drop the learning rate by `decay_factor` every `step_size` epochs.
        return initial_lr * (decay_factor ** np.floor(epoch / step_size))

    return LearningRateScheduler(schedule)
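With the defaults above, the learning rate holds at `initial_lr` for `step_size` epochs, then drops by `decay_factor` at each step boundary. The same arithmetic, restated without the Keras wrapper so it can be checked directly:

```python
import numpy as np

def step_decay(initial_lr=1e-3, decay_factor=0.75, step_size=10):
    # Same inner `schedule` as the gist, minus the LearningRateScheduler wrapper.
    def schedule(epoch):
        return initial_lr * (decay_factor ** np.floor(epoch / step_size))
    return schedule

schedule = step_decay()
lrs = [schedule(e) for e in (0, 9, 10, 20)]
# epochs 0-9 stay at 1e-3; epoch 10 drops to 7.5e-4; epoch 20 to 5.625e-4
```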
jeremyjordan / lr_finder.py
Last active March 3, 2022 08:46
Keras Callback for finding the optimal range of learning rates
import matplotlib.pyplot as plt
import keras.backend as K
from keras.callbacks import Callback
class LRFinder(Callback):
'''
A simple callback for finding the optimal learning rate range for your model + dataset.
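An LR range test of this kind ramps the learning rate geometrically from a small to a large value over one pass through the data, recording the loss at each batch; the optimal range is read off where the loss falls fastest. The sweep itself can be sketched standalone (this helper is hypothetical, not the callback's API):

```python
def lr_sweep(min_lr, max_lr, num_batches):
    """One learning rate per batch, growing geometrically from min_lr to max_lr."""
    mult = (max_lr / min_lr) ** (1.0 / (num_batches - 1))
    return [min_lr * mult ** i for i in range(num_batches)]

lrs = lr_sweep(1e-5, 1e-1, 100)
# first lr == min_lr and last lr == max_lr, up to float rounding
```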