masterdezign
Edit wpa_supplicant.conf and save it to the root of the boot partition.
Replace the country code, ssid, and psk with your own values.
Full list of country codes: https://www.thinkpenguin.com/gnu-linux/country-code-list
On first boot, this file will be moved to /etc/wpa_supplicant/wpa_supplicant.conf where it may be edited later.
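A minimal file of this shape works for a headless setup (the country, ssid, and psk values below are placeholders to replace):

```
country=US
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkSSID"
    psk="YourNetworkPassword"
}
```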
masterdezign / hasktorch.sh
Last active July 27, 2020 12:50
Hasktorch with Nix
# First of all make sure Nix is up and running https://nixos.org/nix
# Clone Hasktorch
git clone https://github.com/hasktorch/hasktorch.git
cd hasktorch
# Pin nixpkgs (export so that child nix processes see it)
export NIX_PATH="nixpkgs=https://github.com/NixOS/nixpkgs/archive/release-19.09.tar.gz"
nix-env -iA cachix -f https://cachix.org/api/v1/install
cachix use hasktorch
masterdezign / timelapse.sh
Created March 20, 2020 10:52
A video from images
# Concatenate images
ffmpeg -framerate 5 -pattern_type glob -i "*.jpg" output.mp4
# Compress
ffmpeg -i output.mp4 -vcodec libx265 -acodec aac -crf 23 output_compressed.mp4
masterdezign / cellular.py
Created January 21, 2019 10:37
Cellular automata in Python
#!/usr/bin/env python3
"""
Cellular automata in Python
"""
import sys
Z = '.'
O = '#'
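The preview ends just after the cell symbols; a minimal elementary-automaton step in the same spirit might look like this (the rule number and wrap-around edges are my assumptions, not from the gist):

```python
#!/usr/bin/env python3
"""Elementary cellular automaton sketch."""
Z = '.'  # dead cell
O = '#'  # live cell

def step(row, rule=110):
    """Apply an elementary CA rule to one row, wrapping at the edges."""
    out = []
    for i in range(len(row)):
        left, center, right = row[i - 1], row[i], row[(i + 1) % len(row)]
        # Encode the 3-cell neighborhood as a number 0..7,
        # then look up the corresponding bit of the rule number
        idx = (left == O) * 4 + (center == O) * 2 + (right == O)
        out.append(O if (rule >> idx) & 1 else Z)
    return ''.join(out)

row = Z * 15 + O + Z * 15
for _ in range(8):
    print(row)
    row = step(row)
```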
-- 0. Download x.data and y.dat from https://www.kaggle.com/masterdezign/iris-with-onehotencoded-targets
-- 1. Install stack (command line interface is marked by $):
-- $ wget -qO- https://get.haskellstack.org/ | sh
-- (alternatively, curl -sSL https://get.haskellstack.org/ | sh)
-- 2. Install open-blas from https://www.openblas.net/ (needed for hmatrix package)
-- 3. Run
-- $ stack --resolver lts-10.6 --install-ghc runghc --package hmatrix-0.18.2.0 Iris.hs
import Numeric.LinearAlgebra as LA
masterdezign / Grad.hs
Last active December 10, 2018 13:23
A brief gradient descent illustration in Haskell
descent1D gradF iterN gamma x0 = take iterN (iterate _descend x0)
  where
    -- Each step moves against the gradient, scaled by the step size gamma
    _descend x = x - gamma * gradF x
-- Suppose, we have a function F(x) = (x - 3)^2.
-- Therefore, Grad F(x) = 2 * (x - 3).
gradF_test x = 2 * (x - 3)
main = print (descent1D gradF_test 10 gamma 0.0)
  where gamma = 0.5
-- With gamma = 0.5 the first step lands exactly on the minimum:
-- x1 = 0 - 0.5 * gradF_test 0 = 0 - 0.5 * (-6) = 3.
-- Run a one dimensional gradient descent written in Clash [1],
-- a high level language that compiles to Verilog and VHDL.
-- Gradient descent can be described by a formula:
--
-- a_{n+1} = a_n - gamma * Grad F(a_n),
--
-- where the constant `gamma` is what is referred to in deep learning as
-- the learning rate.
--
-- [1] https://clash-lang.org/
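The update rule above is easy to prototype in software before targeting hardware; a quick Python sketch (function and parameter names are mine, not from the gist):

```python
def descend(grad_f, gamma, x0, n):
    """Iterate a_{n+1} = a_n - gamma * grad_f(a_n); return all n+1 iterates."""
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] - gamma * grad_f(xs[-1]))
    return xs

# F(x) = (x - 3)^2, so Grad F(x) = 2 * (x - 3); the minimum is at x = 3
iterates = descend(lambda x: 2 * (x - 3), 0.1, 0.0, 50)
```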
masterdezign / brainf.hs
Last active April 22, 2022 16:38
Brainf**k interpreter in Haskell.
{-
Brainf**k interpreter
Brainf**k is a Turing-complete programming language.
Instructions:
> Increment the data pointer so that it points to the next location in memory.
< Decrement the data pointer so that it points to the previous location in memory.
+ Increment the byte pointed to by the data pointer. If it is already at its maximum value, 255, the new value will be 0.
- Decrement the byte pointed to by the data pointer. If it is at its minimum value, 0, the new value will be 255.
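The preview cuts off before the remaining instructions (`.` output, `,` input, `[`/`]` looping). The standard semantics, including the 255/0 wrap-around described above, fit in a short interpreter — sketched here in Python rather than the gist's Haskell:

```python
def brainf(code, input_bytes=b''):
    """Minimal Brainf**k interpreter over a 30000-cell byte tape."""
    tape, dp, ip, out, inp = [0] * 30000, 0, 0, [], list(input_bytes)
    # Pre-match brackets so [ and ] can jump in O(1)
    jump, stack = {}, []
    for i, c in enumerate(code):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jump[i], jump[j] = j, i
    while ip < len(code):
        c = code[ip]
        if c == '>': dp += 1
        elif c == '<': dp -= 1
        elif c == '+': tape[dp] = (tape[dp] + 1) % 256   # wraps 255 -> 0
        elif c == '-': tape[dp] = (tape[dp] - 1) % 256   # wraps 0 -> 255
        elif c == '.': out.append(tape[dp])
        elif c == ',': tape[dp] = inp.pop(0) if inp else 0
        elif c == '[' and tape[dp] == 0: ip = jump[ip]   # skip loop body
        elif c == ']' and tape[dp] != 0: ip = jump[ip]   # repeat loop body
        ip += 1
    return bytes(out)
```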
masterdezign / convex-hull.hs
Last active January 21, 2022 00:28
Convex hull algorithm in Haskell
-- Worth reading http://www.geeksforgeeks.org/convex-hull-set-2-graham-scan/
import Text.Printf
import Data.List
type Point = (Double, Double)
-- Euclidean distance
dist :: Point -> Point -> Double
dist (x1, y1) (x2, y2) = sqrt (f x1 x2 + f y1 y2)
  where f a b = (a - b)^2
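`dist` is one ingredient of Graham scan; the other key primitive is the orientation (cross-product) test, sketched here in Python rather than the gist's Haskell:

```python
def orientation(p, q, r):
    """Sign of the cross product (q - p) x (r - p):
    positive for a counter-clockwise turn, negative for clockwise, 0 if collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
```

Graham scan sorts the points by angle around the lowest point, then pops from the hull stack whenever three consecutive points fail to make a counter-clockwise turn.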
masterdezign / network.py
Last active July 8, 2017 16:38
Backpropagation in Python
from random import random
# array, vectorize, and transpose live in numpy; modern scipy no longer re-exports them
from numpy import tanh, array, vectorize, transpose
class Network:
    """ Class builds neural network
    """
    def __init__(self, inputs, outputs, hlayers=None, activation=tanh,
                 learning_rate=1.):
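The preview truncates inside `__init__`; a minimal single-hidden-layer backpropagation loop in the same spirit might look like this (all names and shapes are my assumptions, not the gist's class):

```python
import numpy as np

def train_xor(epochs=2000, lr=0.1, seed=0):
    """Train a 2-4-1 tanh network on XOR; return the per-epoch MSE losses."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
    losses = []
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)             # forward pass, hidden layer
        out = np.tanh(h @ W2 + b2)           # forward pass, output layer
        losses.append(((out - y) ** 2).mean())
        d_out = (out - y) * (1 - out ** 2)   # backprop through output tanh
        d_h = (d_out @ W2.T) * (1 - h ** 2)  # backprop through hidden tanh
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
    return losses

losses = train_xor()
```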