Bryan Smith (bryanesmith)
bryanesmith / nonvectorized-logistic-regression-cost.m
Last active June 17, 2017 17:49
Non-vectorized calculation of logistic regression cost in Octave.
% - - - - - - - - - - - - -
% hVal, yVal are scalar values
% yVal ∈ { 0, 1 }
% - - - - - - - - - - - - -
function cost = calcCostTerm(hVal, yVal)
  % equivalent to:
  %   if yVal == 1, cost = -log(hVal);
  %   elseif yVal == 0, cost = -log(1 - hVal);
  cost = -yVal * log(hVal) - (1 - yVal) * log(1 - hVal);
endfunction
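A hedged sketch (not part of the gist) of how calcCostTerm might be averaged over a full training set, assuming h and y are 1 x m row vectors:

function J = calcCost(h, y)
  J = 0;
  for i = 1:length(h)
    J = J + calcCostTerm(h(i), y(i));   % accumulate per-example cost
  endfor
  J = J / length(h);                    % average over the m examples
endfunction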
bryanesmith / vectorized-logistic-regression-cost.m
Created June 17, 2017 17:49
Vectorized calculation of logistic regression cost in Octave.
% - - - - - - - - - - - - -
% h is R^1,m
% y is R^1,m
% - - - - - - - - - - - - -
function cost = calcCost(h, y)
  costs = -y .* log(h) - (1 - y) .* log(1 - h);   % per-example cost terms, 1 x m
  cost = sum(costs) / length(h);                  % average over the m examples
endfunction
bryanesmith / logistic-regression.m
Created June 17, 2017 21:04
Performing logistic regression with gradient descent in Octave.
function [theta, J] = runGradientDescent(x, y, theta, alpha, iters)
  J = 0;
  for i = 0:iters-1
    h = calcH(theta, x);                          % hypothesis, 1 x m
    errors = h - y;                               % 1 x m
    grad = errors * x';                           % 1 x n unscaled gradient
    theta = theta - (alpha / length(y)) * grad;
    J = calcCost(h, y);   % assumed: calcCost from the vectorized gist above
  endfor
endfunction
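calcH is referenced but not defined in this preview; a minimal sketch, assuming the standard sigmoid hypothesis with theta (1 x n) and x (n x m):

function h = calcH(theta, x)
  h = 1 ./ (1 + exp(-(theta * x)));   % sigmoid of theta * x, 1 x m
endfunction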
bryanesmith / linear-regression.m
Created June 17, 2017 21:53
Performing linear regression with gradient descent in Octave.
%
% x is R^n,m
% y is R^1,m
% theta is R^1,n
% alpha is float (e.g., 0.01)
% iters is int (e.g., 10000)
%
function [theta, J] = runGradientDescent(x, y, theta, alpha, iters)
  J = 0;
  for i = 0:iters-1
    h = theta * x;                                 % linear hypothesis, 1 x m
    theta = theta - (alpha / length(y)) * ((h - y) * x');
  endfor
endfunction
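Hypothetical usage (the data setup here is an illustrative assumption, not from the gist):

m = 50;                              % number of training examples
x = [ones(1, m); rand(1, m) * 10];   % 2 x m: bias row plus one feature
y = 3 + 2 * x(2, :);                 % noiseless targets for theta = [3, 2]
[theta, J] = runGradientDescent(x, y, zeros(1, 2), 0.01, 10000);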
bryanesmith / logistic-regression-fminunc.m
Created June 18, 2017 21:15
Performing logistic regression with fminunc in Octave.
% H(theta) = theta1 + x1 * theta2 + x2 * theta3
global x = ...; % load R^n,m data
global y = ...; % load R^1,m outcomes
global n = length(x(:,1));
global m = length(x(1,:));
function cost = cost(h, y)
  costs = -y .* log(h) - (1 - y) .* log(1 - h);   % per-example cost terms
  cost = sum(costs) / length(h);                  % average over the m examples
endfunction
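The preview cuts off before the fminunc call itself; a minimal sketch of how it presumably continues, assuming the globals above and a sigmoid hypothesis (the wrapper name costFunction is hypothetical):

function [J, grad] = costFunction(theta)
  global x y m;
  h = 1 ./ (1 + exp(-(theta' * x)));   % sigmoid hypothesis, 1 x m
  J = sum(-y .* log(h) - (1 - y) .* log(1 - h)) / m;
  grad = (1 / m) * x * (h - y)';       % n x 1 gradient
endfunction

options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, J] = fminunc(@costFunction, zeros(n, 1), options);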
bryanesmith / regularized-logistic-regression.m
Created June 18, 2017 22:24
Performing regularized logistic regression with gradient descent in Octave.
function [theta, J] = runGradientDescent(x, y, theta, alpha, lambda, iters)
  J = 0;
  m = length(y);
  for i = 0:iters-1
    h = calcH(theta, x);
    reg = (lambda / m) * theta;
    reg(1) = 0;   % fixed per the original todo: the bias term theta(1) is not regularized
    theta = theta - alpha * ((1 / m) * (h - y) * x' + reg);
  endfor
endfunction
bryanesmith / vectorized-forward-propagation.m
Created June 25, 2017 13:56
Forward propagation of a neural network classifier in Octave
%
% Assume network input layer of size s1,
% one hidden layer of size s2,
% and an output layer of size s3.
%
% Note a bias unit is added to input and hidden layers.
%
% Theta1 and Theta2 are pre-calculated weights.
% (This example just performs forward propagation.)
%
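The preview shows only the header comments; a minimal sketch (an assumption, not the gist's actual body) of the forward pass it describes, for a single input x of size s1 x 1:

a1 = [1; x];                     % input layer plus bias unit
z2 = Theta1 * a1;                % Theta1 is s2 x (s1 + 1)
a2 = [1; 1 ./ (1 + exp(-z2))];   % sigmoid activations plus bias unit
z3 = Theta2 * a2;                % Theta2 is s3 x (s2 + 1)
h  = 1 ./ (1 + exp(-z3));        % output layer activations, s3 x 1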
bryanesmith / logistic-regression-regularized.m
Created July 9, 2017 21:17
Regularized logistic regression cost function and gradients
% Given:
%   X ∈ R^m,n
%   y ∈ R^m,1
% - - - - - - - - - - - - - - -
% X ∈ R^m,n
% theta ∈ R^n,1
% - - - - - - - - - - - - - - -
function h = calcH(X, theta)
  h = 1 ./ (1 + exp(-(X * theta)));   % sigmoid hypothesis, m x 1
endfunction
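The gist title promises the regularized cost function and gradients, but the preview stops at calcH; a hedged sketch of the standard form, in the X ∈ R^m,n / theta ∈ R^n,1 convention above (costFunction is a hypothetical name):

function [J, grad] = costFunction(theta, X, y, lambda)
  m = length(y);
  h = calcH(X, theta);
  reg = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % skip bias theta(1)
  J = (1 / m) * sum(-y .* log(h) - (1 - y) .* log(1 - h)) + reg;
  grad = (1 / m) * (X' * (h - y));                     % n x 1 gradient
  grad(2:end) += (lambda / m) * theta(2:end);          % regularize non-bias terms
endfunction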
bryanesmith / context.json
Last active October 5, 2017 21:41
Sample unsecured Open Badge Extended Transcript extension
{
"@context": "https://purl.imsglobal.org/ctx/extended-transcript/v1p0",
"obi:validation": [
{
"obi:validatesType": "obi:extensions/ExtendedTranscript",
"obi:validationSchema": "https://purl.imsglobal.org/ctx/extended-transcript/v1p0/schema.json"
}
]
}
bryanesmith / context.json
Last active September 15, 2017 03:40
Sample secured Open Badge Extended Transcript extension using JWE
{
"@context": { ... },
"obi:validation": [
{
"obi:validatesType": "obi:extensions/SecuredExtendedTranscript",
"obi:validationSchema": "https://openbadgespec.org/extensions/securedExtendedTranscript/schema.json"
}
]
}