Daniel J. Rodriguez danieljoserodriguez

🎯
Focusing
View GitHub Profile
@danieljoserodriguez
danieljoserodriguez / activations.py
Last active August 25, 2019 09:29
Machine Learning Activation Functions
# Daniel J. Rodriguez
# https://github.com/danieljoserodriguez
import numpy as np
# A straight-line function where activation is proportional to the input
# (the weighted sum from the neuron).
# In mathematics, an identity function, also called an identity relation or
# identity map, always returns the same value that was passed as its argument.
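The preview above cuts off before any function definitions. A minimal sketch of the identity activation the comment describes, plus two other common activations the file plausibly contains (the `sigmoid` and `relu` names here are illustrative, not taken from the gist):

```python
import numpy as np

def identity(x):
    # f(x) = x: activation is proportional to the input
    return x

def sigmoid(x):
    # squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # zero for negative inputs, linear otherwise
    return np.maximum(0.0, x)
```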
@danieljoserodriguez
danieljoserodriguez / nn_multiple_neurons_and_layers.py
Last active August 21, 2019 05:41
Neural Network - Python - Simplest Multiple Neurons and Layers - Feed Forward
# Daniel J. Rodriguez
# https://github.com/danieljoserodriguez
# This Gist is for information purposes only, to demonstrate how to perform the task at hand.
# I do not advise using this in a production environment; rather, it is meant for learning on your own.
# multiple inputs and layers neural network
import numpy as np
# neuron layers
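The preview ends at the `# neuron layers` comment. A minimal sketch of a multiple-neuron, multiple-layer feed-forward pass under the gist's stated topic; the input values, weight matrices, and shapes below are invented for illustration:

```python
import numpy as np

inputs = np.array([0.5, 0.75, 0.1])

# hidden layer: 2 neurons, each with 3 input weights
hidden_weights = np.array([[0.1, 0.2, 0.0],
                           [0.3, 0.1, 0.5]])
# output layer: 1 neuron, each with 2 hidden weights
output_weights = np.array([[0.4, 0.6]])

# feed forward: each layer is a matrix-vector product
hidden = hidden_weights @ inputs
output = output_weights @ hidden
```

Each `@` computes every neuron's weighted sum in one step, which is why stacking layers reduces to chained matrix products.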
@danieljoserodriguez
danieljoserodriguez / nn_multiple_neurons.py
Last active August 21, 2019 05:41
Neural Network - Python - Simplest Multiple Neurons
# Daniel J. Rodriguez
# https://github.com/danieljoserodriguez
# This Gist is for information purposes only, to demonstrate how to perform the task at hand.
# I do not advise using this in a production environment; rather, it is meant for learning on your own.
# simple multiple inputs neural network
########################################################################
# just python - no extra libraries
def sum_weights(inputs, weights):
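The preview stops at the `sum_weights` signature. A plain-Python completion consistent with the "no extra libraries" comment (the body is a sketch; the real gist may differ):

```python
def sum_weights(inputs, weights):
    # weighted sum of the inputs: the neuron's raw output
    total = 0.0
    for x, w in zip(inputs, weights):
        total += x * w
    return total

prediction = sum_weights([99, 75], [0.1, 0.2])
```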
@danieljoserodriguez
danieljoserodriguez / nn_single_neuron.py
Last active August 21, 2019 05:40
Neural Network - Python - Simplest Single Neuron
# Daniel J. Rodriguez
# https://github.com/danieljoserodriguez
# This Gist is for information purposes only, to demonstrate how to perform the task at hand.
# I do not advise using this in a production environment; rather, it is meant for learning on your own.
# simplest neural network - single input / neuron
neuron_weight = 0.1
test_scores = [99, 75]
neuron_prediction = test_scores[0] * neuron_weight
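The snippet above is complete as shown: one weight times one input. Extending it to score every input makes the single shared weight explicit (the list comprehension is an illustration, not part of the gist):

```python
neuron_weight = 0.1
test_scores = [99, 75]

# apply the single weight to every score
predictions = [score * neuron_weight for score in test_scores]
```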
@danieljoserodriguez
danieljoserodriguez / losses.py
Last active August 21, 2019 05:40
Machine Learning Loss Functions
# Daniel J. Rodriguez
# https://github.com/danieljoserodriguez
import numpy as np
# Cross-entropy loss, or log loss, measures the performance of a classification
# model whose output is a probability value between 0 and 1.
def cross_entropy(y_hat, y):
return -np.log(y_hat) if y == 1 else -np.log(1 - y_hat)
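As written, `cross_entropy` returns `inf` when `y_hat` is exactly 0 or 1. A common guard, shown here as a sketch (the `eps` clipping is an addition, not from the gist), is to clip predictions away from the endpoints before taking the log:

```python
import numpy as np

def cross_entropy_stable(y_hat, y, eps=1e-12):
    # clip predictions into [eps, 1 - eps] so np.log never sees 0
    y_hat = np.clip(y_hat, eps, 1.0 - eps)
    return -np.log(y_hat) if y == 1 else -np.log(1.0 - y_hat)
```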