@97k
Created March 18, 2019 09:34
import numpy as np

def initialize_params(self, architecture):
    # Store the parameters in a dictionary so we can access them later on.
    params = {}
    for id_, layer in enumerate(architecture):
        # We number layers starting from 1. There's a reason for this; think about it.
        layer_id = id_ + 1
        # The architecture provided gives us the dimensions of each layer.
        input_dim = layer['input_dim']
        output_dim = layer['output_dim']
        # There are many ways to initialize weights; this is a naive one:
        # small random values for W (to break symmetry) and zeros for b.
        params['W' + str(layer_id)] = np.random.randn(output_dim, input_dim) * 0.1
        params['b' + str(layer_id)] = np.zeros((output_dim, 1))
    return params
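As a quick sanity check, here is a hedged usage sketch: a standalone copy of the function (without `self`, so it runs outside a class) applied to a hypothetical two-layer architecture in the same dict format, verifying that the returned weight and bias shapes match each layer's `(output_dim, input_dim)` and `(output_dim, 1)`.

```python
import numpy as np

# Standalone version of initialize_params for demonstration purposes.
def initialize_params(architecture):
    params = {}
    for id_, layer in enumerate(architecture):
        layer_id = id_ + 1  # layer numbering starts at 1
        params['W' + str(layer_id)] = np.random.randn(
            layer['output_dim'], layer['input_dim']) * 0.1
        params['b' + str(layer_id)] = np.zeros((layer['output_dim'], 1))
    return params

# Hypothetical architecture: 4 inputs -> 5 hidden units -> 1 output.
architecture = [
    {'input_dim': 4, 'output_dim': 5},
    {'input_dim': 5, 'output_dim': 1},
]

params = initialize_params(architecture)
print(params['W1'].shape)  # (5, 4)
print(params['b1'].shape)  # (5, 1)
print(params['W2'].shape)  # (1, 5)
print(params['b2'].shape)  # (1, 1)
```

Note that each `W` is shaped `(output_dim, input_dim)` so a forward pass can compute `W @ a + b` directly, where `a` is the previous layer's activation column vector.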