
Pierre Jaumier (neodelphis)

@neodelphis
neodelphis / conv_forward_naive.py
Created July 17, 2019 16:33
A naive implementation of the forward pass for a convolutional layer.
def conv_forward_naive(x, w, b, conv_param):
    """
    A naive implementation of the forward pass for a convolutional layer.

    The input consists of N data points, each with C channels, height H and
    width W. We convolve each input with F different filters, where each filter
    spans all C channels and has height HH and width WW.

    Input:
    - x: Input data of shape (N, C, H, W)
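The preview cuts off after the docstring. A minimal sketch of how such a naive forward pass is typically written, assuming the CS231n-style convention that `conv_param` carries `'stride'` and `'pad'` keys (the gist's actual loop structure may differ):

```python
import numpy as np

def conv_forward_naive(x, w, b, conv_param):
    """Naive convolution forward pass (sketch).

    x: (N, C, H, W) input, w: (F, C, HH, WW) filters, b: (F,) biases.
    conv_param: dict with 'stride' and 'pad' (assumed keys).
    Returns (out, cache) with out of shape (N, F, H', W').
    """
    stride, pad = conv_param['stride'], conv_param['pad']
    N, C, H, W = x.shape
    F, _, HH, WW = w.shape
    H_out = 1 + (H + 2 * pad - HH) // stride
    W_out = 1 + (W + 2 * pad - WW) // stride

    # Zero-pad only the spatial dimensions.
    x_pad = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))
    out = np.zeros((N, F, H_out, W_out))

    # Quadruple loop: each output value is one windowed dot product.
    for n in range(N):
        for f in range(F):
            for i in range(H_out):
                for j in range(W_out):
                    hs, ws = i * stride, j * stride
                    window = x_pad[n, :, hs:hs + HH, ws:ws + WW]
                    out[n, f, i, j] = np.sum(window * w[f]) + b[f]

    cache = (x, w, b, conv_param)
    return out, cache
```

With a 1x1 filter, stride 1, and no padding, the layer reduces to an elementwise affine map, which makes a quick sanity check easy.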
@neodelphis
neodelphis / conv_backward_naive_w_stride.py
Created July 15, 2019 10:08
conv backward naive with stride
def conv_backward_naive(dout, cache):
    """
    A naive implementation of the backward pass for a convolutional layer.

    Inputs:
    - dout: Upstream derivatives.
    - cache: A tuple of (x, w, b, conv_param) as in conv_forward_naive

    Returns a tuple of:
    - dx: Gradient with respect to x
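This preview also stops at the docstring. A sketch of a stride-aware naive backward pass under the same assumed `conv_param` convention: each upstream scalar `dout[n, f, i, j]` contributes `window * dout` to `dw[f]` and `w[f] * dout` back into the padded input window (the gist's exact formulation may differ):

```python
import numpy as np

def conv_backward_naive(dout, cache):
    """Naive convolution backward pass with stride (sketch).

    dout: (N, F, H', W') upstream gradient.
    cache: (x, w, b, conv_param) as produced by the forward pass.
    Returns (dx, dw, db).
    """
    x, w, b, conv_param = cache
    stride, pad = conv_param['stride'], conv_param['pad']
    N, C, H, W = x.shape
    F, _, HH, WW = w.shape
    _, _, H_out, W_out = dout.shape

    x_pad = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))
    dx_pad = np.zeros_like(x_pad)
    dw = np.zeros_like(w)
    # b is added to every output position, so db sums dout over N, H', W'.
    db = dout.sum(axis=(0, 2, 3))

    for n in range(N):
        for f in range(F):
            for i in range(H_out):
                for j in range(W_out):
                    hs, ws = i * stride, j * stride
                    window = x_pad[n, :, hs:hs + HH, ws:ws + WW]
                    # Each output scalar distributes gradient to its window.
                    dw[f] += window * dout[n, f, i, j]
                    dx_pad[n, :, hs:hs + HH, ws:ws + WW] += w[f] * dout[n, f, i, j]

    # Strip the padding to recover the gradient w.r.t. the original input.
    dx = dx_pad[:, :, pad:H + pad, pad:W + pad]
    return dx, dw, db
```

Again the 1x1, stride-1, pad-0 case gives a closed-form check: with a constant filter of 2 and all-ones upstream gradient, `dx` is uniformly 2, `dw` is the sum of the input, and `db` counts the output positions.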
@neodelphis
neodelphis / conv_backward_naive.py
Last active May 31, 2021 12:31
Backprop in a conv layer
def conv_backward_naive(dout, cache):
    """
    A naive implementation of the backward pass for a convolutional layer.

    Inputs:
    - dout: Upstream derivatives.
    - cache: A tuple of (x, w, b, conv_param) as in conv_forward_naive

    Returns a tuple of:
    - dx: Gradient with respect to x
@neodelphis
neodelphis / BP-softmax-layer.ipynb
Created June 3, 2019 13:38
Gradient backpropagation through a final Softmax layer
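The notebook itself cannot be previewed, but its title describes backpropagation through a final softmax layer. A sketch of the standard combined softmax + cross-entropy gradient, where the gradient with respect to the scores reduces to `probs - one_hot(y)` averaged over the batch (this is the textbook result, not necessarily the notebook's exact derivation):

```python
import numpy as np

def softmax_loss(scores, y):
    """Softmax cross-entropy loss and its gradient w.r.t. the scores.

    scores: (N, K) class scores, y: (N,) integer labels.
    Returns (loss, dscores).
    """
    # Shift by the row max for numerical stability before exponentiating.
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)

    N = scores.shape[0]
    loss = -np.log(probs[np.arange(N), y]).mean()

    # Backprop collapses to: probabilities minus the one-hot target,
    # averaged over the batch.
    dscores = probs.copy()
    dscores[np.arange(N), y] -= 1
    dscores /= N
    return loss, dscores
```

Two properties make this easy to sanity-check: uniform scores over K classes give a loss of log(K), and each row of the gradient sums to zero.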