NotesArchiver-AP-Lab

package Crawler;

/**
 * Created by Sanitarium on 3/4/2016.
 */
import java.util.*;

public class WorkQueue {
    //

package CoinChange;

import java.util.ArrayList;
import java.util.Iterator;
import java.util.TreeSet;

/**
 * Created by Sanitarium on 4/15/2016.
 */

package Interpreter;

/**
 * Created by Sanitarium on 4/29/2016.
 */
import javafx.util.Pair;

import java.io.*;
import java.*;
RafayAK / LinearLayer.py
Last active April 12, 2023 15:42
Class for Linear Layer
import numpy as np # import numpy library
from util.paramInitializer import initialize_parameters # import function to initialize weights and biases
class LinearLayer:
"""
This class implements all functions to be executed by a linear layer
in a computational graph
Args:
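The preview stops at the docstring. Below is a minimal sketch of how such a linear layer could be completed, consistent with the training loop further down; the method names (forward, backward, update_params), the (features, samples) shape convention, and the params dict layout are assumptions rather than the gist's confirmed code.

import numpy as np
from util.paramInitializer import initialize_parameters  # helper sketched under paramInitializer.py below

class LinearLayer:
    def __init__(self, input_shape, n_out, ini_type='plain'):
        # assumed: input_shape is (n_features, n_samples); W: (n_out, n_in), b: (n_out, 1)
        self.m = input_shape[1]  # number of training examples
        self.params = initialize_parameters(input_shape[0], n_out, ini_type)
        self.Z = np.zeros((n_out, input_shape[1]))  # pre-activation output

    def forward(self, A_prev):
        self.A_prev = A_prev  # cache the input for the backward pass
        self.Z = np.dot(self.params['W'], self.A_prev) + self.params['b']  # Z = W.A_prev + b

    def backward(self, upstream_grad):
        # gradients of the cost w.r.t. this layer's weights, biases, and input
        self.dW = np.dot(upstream_grad, self.A_prev.T)
        self.db = np.sum(upstream_grad, axis=1, keepdims=True)
        self.dA_prev = np.dot(self.params['W'].T, upstream_grad)

    def update_params(self, learning_rate=0.1):
        # plain gradient-descent step
        self.params['W'] = self.params['W'] - learning_rate * self.dW
        self.params['b'] = self.params['b'] - learning_rate * self.db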
RafayAK / ActivationLayer.py
Created June 19, 2019 09:17
Class for Sigmoid Activation Layer
import numpy as np # import numpy library
class SigmoidLayer:
"""
This file implements activation layers
in line with a computational graph model
Args:
shape: shape of input to the layer
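The sigmoid layer's preview is also truncated. A plausible completion under the same conventions; forward and backward are assumed method names, and dZ is the gradient handed back to the preceding linear layer.

import numpy as np

class SigmoidLayer:
    def __init__(self, shape):
        self.A = np.zeros(shape)  # post-activation output

    def forward(self, Z):
        self.A = 1 / (1 + np.exp(-Z))  # element-wise sigmoid

    def backward(self, upstream_grad):
        # the sigmoid derivative is A * (1 - A); chain it with the upstream gradient
        self.dZ = upstream_grad * self.A * (1 - self.A)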
RafayAK / paramInitializer.py
Created June 19, 2019 09:19
Helper function to initialize weights and biases
import numpy as np
def initialize_parameters(n_in, n_out, ini_type='plain'):
"""
Helper function to initialize some form of random weights and zero biases
Args:
n_in: size of input layer
n_out: size of output/number of neurons
ini_type: set initialization type for weights
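The body is cut off after the argument list. A sketch that matches the docstring: random weights of shape (n_out, n_in) and zero biases of shape (n_out, 1). The 'xavier' branch is an assumption; only ini_type='plain' appears in the previews.

import numpy as np

def initialize_parameters(n_in, n_out, ini_type='plain'):
    params = dict()

    if ini_type == 'plain':
        params['W'] = np.random.randn(n_out, n_in) * 0.01  # small random weights
    elif ini_type == 'xavier':
        params['W'] = np.random.randn(n_out, n_in) / np.sqrt(n_in)  # assumed Xavier-style scaling
    else:
        raise ValueError("unknown ini_type: " + ini_type)

    params['b'] = np.zeros((n_out, 1))  # zero biases

    return params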
RafayAK / compte_cost.py
Last active June 20, 2019 09:02
This helper function computes the squared error cost and its derivative
def compute_cost(Y, Y_hat):
"""
This function computes and returns the Cost and its derivative.
It uses the Squared Error Cost function -> (1/2m)*sum((Y - Y_hat)^2)
Args:
Y: labels of data
Y_hat: Predictions(activations) from a last layer, the output layer
Returns:
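The Returns section is truncated. Completing the function from its own docstring: since the cost is (1/2m)*sum((Y - Y_hat)^2), its derivative with respect to Y_hat is -(1/m)*(Y - Y_hat). Returning the pair (cost, derivative) is an assumption based on how the training loop consumes it.

import numpy as np

def compute_cost(Y, Y_hat):
    m = Y.shape[1]  # number of examples

    cost = (1 / (2 * m)) * np.sum(np.square(Y - Y_hat))
    cost = np.squeeze(cost)  # reduce to a scalar

    dY_hat = -(1 / m) * (Y - Y_hat)  # derivative of the cost w.r.t. the predictions

    return cost, dY_hat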
RafayAK / defining_layers.py
Last active June 20, 2019 11:47
Defining a 2-layer neural net
# define training constants
learning_rate = 1
number_of_epochs = 5000
np.random.seed(48) # set the seed so that the results are reproducible
# (weights will now be initialized to the same pseudo-random numbers each time)
# Our network architecture has the shape:
# (input)--> [Linear->Sigmoid] -> [Linear->Sigmoid] -->(output)
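The snippet stops after the architecture comment. One way the two [Linear->Sigmoid] blocks could be instantiated from the classes above; the hidden-layer width (3 neurons) and the (features, samples) shapes of X_train and Y_train are assumptions, while the names Z1/A1/Z2/A2 match the training loop below.

# X_train: (n_features, m), Y_train: (1, m)
Z1 = LinearLayer(input_shape=X_train.shape, n_out=3, ini_type='plain')  # hidden linear layer
A1 = SigmoidLayer(Z1.Z.shape)  # hidden activation

Z2 = LinearLayer(input_shape=Z1.Z.shape, n_out=1, ini_type='plain')  # output linear layer
A2 = SigmoidLayer(Z2.Z.shape)  # output activation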
RafayAK / training_loop.py
Created June 20, 2019 11:48
The main training loop for the 2-layer neural net
costs = [] # initially empty list, this will store all the costs after a certain number of epochs
# Start training
for epoch in range(number_of_epochs):
# ------------------------- forward-prop -------------------------
Z1.forward(X_train)
A1.forward(Z1.Z)
Z2.forward(A1.A)
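The preview ends partway through forward-prop. A sketch of one full epoch wiring together the pieces above; the exact backward/update call pattern and the every-100-epochs logging cadence are assumptions.

for epoch in range(number_of_epochs):

    # ------------------------- forward-prop -------------------------
    Z1.forward(X_train)
    A1.forward(Z1.Z)

    Z2.forward(A1.A)
    A2.forward(Z2.Z)

    # ---------------------- compute cost ----------------------------
    cost, dA2 = compute_cost(Y=Y_train, Y_hat=A2.A)

    # ------------------------- back-prop -----------------------------
    A2.backward(dA2)
    Z2.backward(A2.dZ)

    A1.backward(Z2.dA_prev)
    Z1.backward(A1.dZ)

    # ----------------------- update weights --------------------------
    Z2.update_params(learning_rate=learning_rate)
    Z1.update_params(learning_rate=learning_rate)

    if epoch % 100 == 0:
        costs.append(cost)  # track the cost every 100 epochs
        print("Cost at epoch {}: {}".format(epoch, cost))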