simonkamronn / hyperband.py
Created November 11, 2016 16:06
Hyperband for hyperparameter optimization
# https://people.eecs.berkeley.edu/~kjamieson/hyperband.html
# You need to write the following hooks for your custom problem
from math import log
from problem import get_random_hyperparameter_configuration, run_then_return_val_loss

max_iter = 81  # maximum iterations/epochs per configuration
eta = 3  # defines downsampling rate (default=3)
logeta = lambda x: log(x) / log(eta)
s_max = int(logeta(max_iter))  # number of unique executions of Successive Halving (minus one)
B = (s_max + 1) * max_iter  # total number of iterations (without reuse) per execution of Successive Halving (n,r)
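# A minimal sketch of the outer loop that these constants feed into, following
# the Hyperband description at the page linked above. The call signature of
# run_then_return_val_loss (resource budget first, configuration second) is an
# assumption about the `problem` hooks rather than something fixed by this gist.
from math import ceil
from numpy import argsort

for s in reversed(range(s_max + 1)):
    n = int(ceil(B / max_iter / (s + 1) * eta ** s))  # initial number of configurations
    r = max_iter * eta ** (-s)                        # initial iterations per configuration
    # Successive Halving with (n, r) as the starting rung
    T = [get_random_hyperparameter_configuration() for _ in range(n)]
    for i in range(s + 1):
        n_i = n * eta ** (-i)  # configurations still in the running at this rung
        r_i = r * eta ** i     # iterations to give each of them
        val_losses = [run_then_return_val_loss(r_i, t) for t in T]
        # keep the best 1/eta fraction of configurations for the next rung
        T = [T[j] for j in argsort(val_losses)[0:int(n_i / eta)]]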
udibr / gruln.py
Last active November 7, 2020 02:34
Keras GRU with Layer Normalization
import numpy as np
from keras.layers import GRU
from keras import initializations
from keras import backend as K
from collections import OrderedDict

class GRULN(GRU):
    '''Gated Recurrent Unit with Layer Normalization.

    The current implementation only works with consume_less = 'gpu',
    which is already set.

    # Arguments