RafayAK / inference_example.py
import keras
# import keras_retinanet
from object_detector_retinanet.keras_retinanet import models
from object_detector_retinanet.keras_retinanet.utils.image import read_image_bgr, preprocess_image, resize_image
from object_detector_retinanet.keras_retinanet.utils.visualization import draw_box, draw_caption
from object_detector_retinanet.keras_retinanet.utils.colors import label_color
# import for EM Merger and viz
from object_detector_retinanet.keras_retinanet.utils import EmMerger
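
The preview stops at the imports. Below is a minimal, hedged sketch of how they are typically wired together for inference, following the upstream fizyr/keras-retinanet API that this repository wraps; the model path, image path, backbone name and score threshold are placeholders, and the EmMerger post-processing step is only indicated in a comment because its call signature is not shown here.

import numpy as np

# hypothetical paths/constants -- replace with real ones
model_path = 'snapshots/resnet50_inference.h5'
image_path = 'example.jpg'
score_threshold = 0.5

# load a trained inference model (standard keras-retinanet call)
model = models.load_model(model_path, backbone_name='resnet50')

# read, preprocess and resize the image the same way as at training time
image = read_image_bgr(image_path)
draw = image.copy()
image = preprocess_image(image)
image, scale = resize_image(image)

# run detection; keras-retinanet returns boxes, scores and class labels
boxes, scores, labels = model.predict_on_batch(np.expand_dims(image, axis=0))
boxes /= scale  # map boxes back onto the original image size

# (EmMerger, imported above, would normally merge duplicate detections here)

for box, score, label in zip(boxes[0], scores[0], labels[0]):
    if score < score_threshold:
        continue
    color = label_color(label)
    b = box.astype(int)
    draw_box(draw, b, color=color)
    draw_caption(draw, b, '{:.2f}'.format(score))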
RafayAK / training.py
Last active Nov 10, 2019
training loop for iris data
costs = []  # initially empty list, this will store all the costs after a certain number of epochs

# Start training
for epoch in range(number_of_epochs):

    # ------------------------- forward-prop -------------------------
    Z1.forward(X_train)
    A1.forward(Z1.Z)

    # ---------------------- Compute Cost ----------------------------
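
The preview cuts off at the cost step. A hedged sketch of how the rest of the loop presumably looks, assuming the Z1/A1 layer objects expose backward() and update_params() methods and that the cost comes from the compute_stable_bce_cost helper shown further down; these method names and the logging interval are assumptions, not confirmed by the preview.

    cost, dZ1 = compute_stable_bce_cost(Y_train, Z1.Z)

    # print and record the cost every few hundred epochs (interval is arbitrary)
    if (epoch % 100) == 0:
        print("Cost at epoch#{}: {}".format(epoch, cost))
        costs.append(cost)

    # ------------------------- back-prop ----------------------------
    Z1.backward(dZ1)          # assumed method: propagate the gradient into the linear layer

    # ----------------------- update weights and bias ----------------
    Z1.update_params(learning_rate=learning_rate)   # assumed method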
RafayAK / define_layer.py
Created Nov 10, 2019
Define layers for a 1-layer nn to discriminate Iris-virginica vs. others, using only petal length and petal width as input features
# define training constants
learning_rate = 1
number_of_epochs = 5000

np.random.seed(48)  # set seed value so that the results are reproducible
                    # (weights will now be initialized to the same pseudo-random numbers, each time)

# Our network architecture has the shape:
#           (input)--> [Linear->Sigmoid] -->(output)
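
For context, a hedged sketch of the layer objects these constants feed into, under the single-layer architecture described in the comment above; the LinearLayer/SigmoidLayer class names and constructor signatures are illustrative assumptions inferred from the Z1/A1 objects used in the training loop, not taken from the preview.

# assumed constructors -- class names, shapes and arguments are illustrative
Z1 = LinearLayer(input_shape=X_train.shape, n_out=1, ini_type='plain')  # one linear node
A1 = SigmoidLayer(Z1.Z.shape)                                           # sigmoid activation on its output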
RafayAK / computer_keras_like_bce_cost.py
Last active Nov 10, 2019
This function computes the Binary Cross-Entropy(stable_bce) Cost function the way Keras implements it
def compute_keras_like_bce_cost(Y, P_hat, from_logits=False):
    """
    This function computes the Binary Cross-Entropy(stable_bce) Cost function the way Keras
    implements it. Accepting either probabilities(P_hat) from the sigmoid neuron or values
    directly from the linear node(Z)

    Args:
        Y: labels of data
        P_hat: Probabilities from sigmoid function
        from_logits: flag to check if logits are being provided or not(Default: False)
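
The preview ends inside the docstring. One hedged way to complete the body, consistent with the docstring and with how Keras handles the from_logits flag (clipping probabilities and converting them back to logits), reusing the compute_stable_bce_cost helper from the gist below; the return values and the epsilon constant are assumptions.

    Returns:
        cost: the (stable) Binary Cross-Entropy cost
        dZ_last: derivative of the cost w.r.t. Z, the last linear node
    """
    if from_logits:
        # values came straight from the linear node -> use the stable form directly
        return compute_stable_bce_cost(Y, P_hat)
    else:
        # mimic Keras: clip the probabilities, convert them back to logits,
        # then evaluate the numerically stable cost on those logits
        epsilon = 1e-07
        P_hat = np.clip(P_hat, epsilon, 1 - epsilon)
        Z = np.log(P_hat / (1 - P_hat))
        return compute_stable_bce_cost(Y, Z)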
RafayAK / compute_stable_bce_cost.py
Created Nov 10, 2019
This function computes the stable version of BCE cost
def compute_stable_bce_cost(Y, Z):
    """
    This function computes the "Stable" Binary Cross-Entropy(stable_bce) Cost and returns the Cost and its
    derivative w.r.t Z_last(the last linear node).
    The Stable Binary Cross-Entropy Cost is defined as:
        => (1/m) * np.sum(max(Z,0) - ZY + log(1+exp(-|Z|)))

    Args:
        Y: labels of data
        Z: Values from the last linear node
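
A hedged completion of this body, directly transcribing the formula quoted in the docstring; the convention that examples are stacked along the second axis of Y (so m = Y.shape[1]) and the returned names are assumptions.

    Returns:
        cost: the stable Binary Cross-Entropy cost
        dZ_last: derivative of the cost w.r.t. Z_last
    """
    m = Y.shape[1]  # assumes examples are stacked along the second axis

    # the stable formulation quoted in the docstring, written with NumPy ops
    cost = (1/m) * np.sum(np.maximum(Z, 0) - Z*Y + np.log(1 + np.exp(-np.abs(Z))))

    # derivative w.r.t. the last linear node: (sigmoid(Z) - Y), averaged over m
    dZ_last = (1/m) * ((1 / (1 + np.exp(-Z))) - Y)

    return cost, dZ_last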
RafayAK / compute_bce_cost.py
Created Nov 8, 2019
The gist contains the code for computing the "unstable" BCE cost and its derivative
def compute_bce_cost(Y, P_hat):
    """
    This function computes Binary Cross-Entropy(bce) Cost and returns the Cost and its
    derivative.
    This function uses the following Binary Cross-Entropy Cost defined as:
        => (1/m) * np.sum(-Y*np.log(P_hat) - (1-Y)*np.log(1-P_hat))

    Args:
        Y: labels of data
        P_hat: Estimated output probabilities from the last layer, the output layer
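
A hedged completion of this body as well, again transcribing the formula from the docstring; the m = Y.shape[1] convention and the returned derivative name are assumptions.

    Returns:
        cost: the Binary Cross-Entropy cost
        dP_hat: derivative of the cost w.r.t. the probabilities P_hat
    """
    m = Y.shape[1]  # assumes examples are stacked along the second axis

    cost = (1/m) * np.sum(-Y*np.log(P_hat) - (1-Y)*np.log(1-P_hat))
    cost = np.squeeze(cost)  # drop any singleton dimensions

    # derivative of the cost w.r.t. P_hat (this is what blows up near P_hat = 0 or 1,
    # motivating the stable version above)
    dP_hat = (1/m) * (-(Y/P_hat) + (1-Y)/(1-P_hat))

    return cost, dP_hat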
RafayAK / training_loop.py
Created Jun 20, 2019
The main training loop for a 2-layer nn
costs = []  # initially empty list, this will store all the costs after a certain number of epochs

# Start training
for epoch in range(number_of_epochs):

    # ------------------------- forward-prop -------------------------
    Z1.forward(X_train)
    A1.forward(Z1.Z)
    Z2.forward(A1.A)
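
As with the single-layer loop, the preview stops early. A hedged sketch of the remainder, assuming the same layer objects with backward()/update_params() methods, dZ/dA_prev gradient attributes, and the squared-error compute_cost helper from the gist below; all of those names are assumptions.

    A2.forward(Z2.Z)

    # ---------------------- Compute Cost ----------------------------
    cost, dA2 = compute_cost(Y=Y_train, Y_hat=A2.A)

    # ------------------------- back-prop ----------------------------
    A2.backward(dA2)
    Z2.backward(A2.dZ)        # assumed attribute: gradient w.r.t. Z2
    A1.backward(Z2.dA_prev)   # assumed attribute: gradient w.r.t. the previous activation
    Z1.backward(A1.dZ)

    # ----------------------- update weights and biases --------------
    Z2.update_params(learning_rate=learning_rate)
    Z1.update_params(learning_rate=learning_rate)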
RafayAK / defining_layers.py
Last active Jun 20, 2019
Defining a 2-layer neural net
# define training constants
learning_rate = 1
number_of_epochs = 5000

np.random.seed(48)  # set seed value so that the results are reproducible
                    # (weights will now be initialized to the same pseudo-random numbers, each time)

# Our network architecture has the shape:
#           (input)--> [Linear->Sigmoid] -> [Linear->Sigmoid] -->(output)
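
And a hedged sketch of the corresponding layer objects for this two-layer architecture, mirroring the single-layer sketch earlier; class names, constructor signatures and the hidden-layer size are illustrative assumptions.

# assumed constructors -- hidden size and arguments are illustrative
Z1 = LinearLayer(input_shape=X_train.shape, n_out=3, ini_type='xavier')  # hidden linear layer
A1 = SigmoidLayer(Z1.Z.shape)

Z2 = LinearLayer(input_shape=A1.A.shape, n_out=1, ini_type='xavier')     # output linear node
A2 = SigmoidLayer(Z2.Z.shape)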
RafayAK / compte_cost.py
Last active Jun 20, 2019
This helper function computes the squared error cost and its derivative
def compute_cost(Y, Y_hat):
    """
    This function computes and returns the Cost and its derivative.
    This function uses the Squared Error Cost function -> (1/2m)*sum(Y - Y_hat)^2

    Args:
        Y: labels of data
        Y_hat: Predictions(activations) from the last layer, the output layer

    Returns:
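
The preview stops at the Returns header. A hedged completion, implementing exactly the (1/2m)*sum(Y - Y_hat)^2 cost stated in the docstring along with its derivative; the m = Y.shape[1] convention and return names are assumptions.

        cost: the Squared Error Cost
        dY_hat: derivative of the cost w.r.t. the predictions Y_hat
    """
    m = Y.shape[1]  # assumes examples are stacked along the second axis

    cost = (1/(2*m)) * np.sum(np.square(Y - Y_hat))
    cost = np.squeeze(cost)  # drop any singleton dimensions

    # d/dY_hat of (1/2m)*sum((Y - Y_hat)^2)
    dY_hat = -(1/m) * (Y - Y_hat)

    return cost, dY_hat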
RafayAK / paramInitializer.py
Created Jun 19, 2019
Helper function to initialize weights and biases
import numpy as np


def initialize_parameters(n_in, n_out, ini_type='plain'):
    """
    Helper function to initialize some form of random weights and Zero biases

    Args:
        n_in: size of input layer
        n_out: size of output/number of neurons
        ini_type: set initialization type for weights
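
A hedged completion of the helper, covering the usual 'plain', 'xavier' and 'he' options the ini_type argument suggests; the weight/bias shapes (W as (n_out, n_in), b as (n_out, 1)) and the 0.01 scaling for 'plain' are assumptions.

    Returns:
        params: dictionary holding the weight matrix 'W' and bias vector 'b'
    """
    params = dict()

    if ini_type == 'plain':
        params['W'] = np.random.randn(n_out, n_in) * 0.01                 # small random weights
    elif ini_type == 'xavier':
        params['W'] = np.random.randn(n_out, n_in) / np.sqrt(n_in)        # Xavier/Glorot-style scaling
    elif ini_type == 'he':
        params['W'] = np.random.randn(n_out, n_in) * np.sqrt(2.0 / n_in)  # He-style scaling

    params['b'] = np.zeros((n_out, 1))  # biases start at zero

    return params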