Tom Hale (HaleTom)
💭 Using human learning for machine learning
Verifying that +tomhale is my blockchain ID. https://onename.com/tomhale
@HaleTom
HaleTom / bash.bashrc
Created August 12, 2016 02:40
bash.bashrc from Linux Mint 17.2
# System-wide .bashrc file for interactive bash(1) shells.
# To enable the settings / commands in this file for login shells as well,
# this file has to be sourced in /etc/profile.
# If not running interactively, don't do anything
[ -z "$PS1" ] && return
# don't put duplicate lines or lines starting with space in the history.
# See bash(1) for more options
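# (The setting itself is cut off in this preview; ignoreboth is the
# conventional value matching the comment above.)
HISTCONTROL=ignoreboth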
# Print the root directory of the git repository's working tree
# Search for 'Tom Hale' in http://stackoverflow.com/questions/957928/is-there-a-way-to-get-the-git-root-directory-in-one-command
# Or, shorter:
# (root=$(git rev-parse --git-dir)/ && cd ${root%%/.git/*} && git rev-parse && pwd)
# but this doesn't cover external $GIT_DIRs named something other than .git
function git_root {
    local root first_commit
    # git displays its own error if not in a repository
    root=$(git rev-parse --show-toplevel) || return
    if [[ -n $root ]]; then
        # Inside the working tree: print its root
        echo "$root"
    fi
    # (closing lines assumed; the gist preview truncates here)
}
@HaleTom
HaleTom / gem-patch
Last active August 20, 2016 10:27
Patch gems then build them
#!/bin/bash
# Usage: gem-patch
# Code updates: https://gist.github.com/HaleTom/275f28403828b9b9b93d313990fc94f4
# Features:
# Work around `patch` returning non-zero if some patch hunks are already applied (sketched below)
# Apply all patches in $patch_dir (in order) to their corresponding gem(s)
# Build a gem only after all patches have been applied
# Only build the gem if it was patched
# NO LONGER UPDATED
# Merged into: https://gist.github.com/HaleTom/631efef6fb6ae86618128647dc887aee
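# Sketch of the hunk-skip workaround named in the features above (an
# assumed approach, not the gist's exact code): `patch --forward` skips
# already-applied hunks but still exits 1, so treat exit 1 with no
# FAILED hunks as success.
apply_patch() {
    local out status
    out=$(patch --forward -p1 < "$1" 2>&1); status=$?
    if (( status == 1 )) && ! grep -q 'FAILED' <<<"$out"; then
        return 0   # every hunk was already applied
    fi
    (( status != 0 )) && printf '%s\n' "$out" >&2
    return "$status"
}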
##################
# Set the prompt #
##################
# Select git info displayed, see /usr/lib/git-core/git-sh-prompt for more
export GIT_PS1_SHOWDIRTYSTATE=1 # '*'=unstaged, '+'=staged
export GIT_PS1_SHOWSTASHSTATE=1 # '$'=stashed
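# With those flags set, the prompt can include __git_ps1 output (a
# minimal sketch, assuming git-sh-prompt lives at the path above):
source /usr/lib/git-core/git-sh-prompt
PS1='\u@\h:\w$(__git_ps1 " (%s)")\$ '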
@HaleTom
HaleTom / update_table
Created October 4, 2016 09:06
Update vim's ":help compatible" table from source code extract
#!/bin/bash -eu
table_to_update=./compatible_table
source_extract=./option_extract
function get_decorated_options {
    # option_extract > while read -r option;
    while read -r line; do
        # Extract the option name from lines like: { "name", ...
        name=$(printf %s "$line" | sed -rn 's/^\s*\{\s*"(\w+).*/\1/p')
        # Get decoration
    done < "$source_extract"   # loop closure assumed; the preview truncates here
}
@HaleTom
HaleTom / pacman-backup
Last active March 9, 2017 11:42
Safely backup pacman's sync databases
#!/bin/bash
# Safely back up the pacman databases to enable reversal of a system upgrade.
# Use pacman -b <backupdirectory> to use the saved databases
# Latest version: https://gist.github.com/HaleTom/9dbffaf3369b86ca272ffe6a61a36aba
set -euo pipefail; shopt -s failglob # safe mode
db_lock=/var/lib/pacman/db.lck
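# Sketch of the backup itself (assumed, not the gist's exact code):
# bail out while pacman holds $db_lock, then snapshot the sync DBs so
# that `pacman -b <backupdirectory>` can later read the copies.
backup_dir=${1:-/var/lib/pacman-backup}   # hypothetical destination
[[ -e $db_lock ]] && { echo "pacman database locked: $db_lock" >&2; exit 1; }
mkdir -p "$backup_dir"
cp -a /var/lib/pacman/sync "$backup_dir/"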
@HaleTom
HaleTom / Test.ipynb
Last active November 25, 2017 15:09
Test .ipynb
@HaleTom
HaleTom / describe.m
Last active July 9, 2019 08:27
Matlab/Octave: describe a variable (name, type, size)
% Based upon https://stackoverflow.com/a/45347880/5353461
% Gist at: https://gist.github.com/HaleTom/533b0ed7c51f93bfb5f71007a188bac4
function varargout = describe(varargin)
% varargin used to accommodate variable number of input names
st = dbstack;
outstring = '';
for ii = size(st, 1):-1:2
outstring = [outstring, st(ii).file, ' > ', st(ii).name, ', line ', num2str(st(ii).line), '\n'];
end
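% A minimal usage sketch (assumed calling convention, following the
% linked Stack Overflow answer; the preview above is truncated):
A = rand(3, 4);
describe('A')   % prints the call stack, then A's name, type and size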
@HaleTom
HaleTom / nnCostFunction.m
Created July 9, 2019 08:34
Matlab gradient descent - Coursera's Machine Learning ex4/nnCostFunction.m
function [J grad] = nnCostFunction(nn_params, ...
input_layer_size, ...
hidden_layer_size, ...
num_labels, ...
X, Y, lambda)
%NNCOSTFUNCTION Implements the neural network cost function for a two-layer
%neural network which performs classification
%   [J grad] = NNCOSTFUNCTION(nn_params, input_layer_size, hidden_layer_size, ...
%   num_labels, X, Y, lambda) computes the cost and gradient of the neural
%   network. The parameters for the neural network are "unrolled" into the
%   vector nn_params and need to be converted back into the weight matrices.
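% The unrolled vector is conventionally rebuilt with reshape (a sketch of
% the standard step for this exercise; not shown in the truncated preview):
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));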