
masterdezign

masterdezign / gist:5eb03e834299b8b67750
Last active Nov 15, 2017
Remark.js+MathJax essentials
<!DOCTYPE html>
<html>
<head>
<title>My Presentation</title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<style type="text/css">
.red { color: red }
</style>
</head>
<body>
masterdezign / install-ghc.md
Created Dec 5, 2015 — forked from yantonov/install-ghc-ubuntu.md
How to install the latest GHC 7.10.2 from source + stack 0.1.6.0 + cabal 1.22.4.0 + cabal-install 1.22.6.0 on Ubuntu

How to install the latest GHC 7.10.2 from source + stack 0.1.6.0 + cabal 1.22.4.0 + cabal-install 1.22.6.0 on Ubuntu

For your convenience, these instructions are also available as:
gist
git repo

settings

GHC_VERSION="7.10.2"  
ARCHITECTURE="x86_64"  
masterdezign / dct.hs
Created Dec 5, 2015 — forked from vi/dct.hs
Simple command-line Discrete Cosine Transform in Haskell
-- "cabal install vector-fftw split"
import qualified Numeric.FFT.Vector.Unnormalized as FFT
import Data.Vector (fromList, toList)
import Data.List.Split (splitOneOf)
import Data.List (intersperse)
import Control.Monad (forever)
import Control.Arrow ((>>>))
main = forever $
masterdezign / CrazyIO.hs
Created Dec 19, 2015 — forked from nkpart/CrazyIO.hs
CrazyIO - binary deserialization using mmapped I/O and Data.Vector.Storable
{-# LANGUAGE ScopedTypeVariables #-}
module CrazyIO (module CrazyIO, mmapFileByteString) where
import qualified Data.Vector.Storable as V
import qualified Data.ByteString as BS
import qualified Data.ByteString.Internal as BS
import Foreign
import System.IO.MMap
crazyLoad :: forall a. Storable a => FilePath -> Maybe (Int64, Int) -> IO (V.Vector a)
masterdezign / min-char-rnn.py
Created Feb 17, 2016 — forked from karpathy/min-char-rnn.py
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
masterdezign / xor_keras.py
Created Aug 31, 2016 — forked from cburgdorf/xor_keras.py
Comparing XOR between tensorflow and keras
import numpy as np
from keras.models import Sequential
from keras.layers.core import Activation, Dense
from keras.optimizers import SGD
X = np.array([[0,0],[0,1],[1,0],[1,1]], "float32")
y = np.array([[0],[1],[1],[0]], "float32")
model = Sequential()
model.add(Dense(2, input_dim=2, activation='sigmoid'))
masterdezign / network.py
Last active Jul 8, 2017
Backpropagation in Python
from random import random
from numpy import tanh
from scipy import array, vectorize, transpose
class Network:
""" Class builds neural network
"""
def __init__(self, inputs, outputs, hlayers=None, activation=tanh,
learning_rate=1.):
masterdezign / convex-hull.hs
Last active Aug 29, 2020
Convex hull algorithm in Haskell
-- Worth reading http://www.geeksforgeeks.org/convex-hull-set-2-graham-scan/
import Text.Printf
import Data.List
type Point = (Double, Double)
-- Euclidean distance
dist :: (Double, Double) -> (Double, Double) -> Double
dist (x1, y1) (x2, y2) = sqrt (f x1 x2 + f y1 y2)
masterdezign / brainf.hs
Last active Aug 30, 2017
Brainf**k interpreter in Haskell.
{-
Brainf**k interpreter
Instructions:
> Increment the data pointer so that it points to the next cell in memory.
< Decrement the data pointer so that it points to the previous cell in memory.
+ Increment the byte pointed to by the data pointer. If it is already at its maximum value, 255, the new value wraps around to 0.
- Decrement the byte pointed to by the data pointer. If it is at its minimum value, 0, the new value wraps around to 255.
. Output the character represented by the byte at the data pointer.
, Read one byte of input and store it at the memory cell pointed to by the data pointer.
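
The preview above cuts off inside the comment. As a minimal sketch of how the four data commands could be modelled (an illustration under assumptions, not the gist's own code; the names Tape and step are made up for this example), a zipper-style tape of bytes covers >, <, + and -:

import Data.Word (Word8)

-- Tape with a focused cell: cells to the left, the current byte, cells to the right.
data Tape = Tape [Word8] Word8 [Word8]

-- Start from an infinite, zero-initialised tape.
emptyTape :: Tape
emptyTape = Tape (repeat 0) 0 (repeat 0)

-- One data command; Word8 arithmetic gives the 255 -> 0 and 0 -> 255 wrap-around.
step :: Char -> Tape -> Tape
step '>' (Tape ls x (r:rs)) = Tape (x:ls) r rs
step '<' (Tape (l:ls) x rs) = Tape ls l (x:rs)
step '+' (Tape ls x rs)     = Tape ls (x + 1) rs
step '-' (Tape ls x rs)     = Tape ls (x - 1) rs
step _   t                  = t   -- '.', ',' and the loop commands need IO and control flow

main :: IO ()
main = do
  let Tape _ x _ = foldl (flip step) emptyTape ">+++<++"
  print x   -- the pointer is back on the first cell, which now holds 2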
masterdezign / Grad.hs
-- Run a one dimensional gradient descent written in Clash [1],
-- a high level language that compiles to Verilog and VHDL.
-- Gradient descent can be described by a formula:
--
-- a_n+1 = a_n - gamma * Grad F(a_n),
--
-- where the constant `gamma` is what is referred to in deep learning as
-- the learning rate.
--
-- [1] https://clash-lang.org/
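
As a plain-Haskell illustration of that update rule (not the Clash code from the gist; the objective F, gradF and the chosen gamma are assumptions for the example), one descent step and its iteration look like this:

-- Illustrative objective: F(a) = (a - 3)^2, so grad F(a) = 2 * (a - 3).
gradF :: Double -> Double
gradF a = 2 * (a - 3)

-- One step of a_{n+1} = a_n - gamma * grad F(a_n).
stepGD :: Double -> Double -> Double
stepGD gamma a = a - gamma * gradF a

main :: IO ()
main = do
  let gamma = 0.1                         -- the learning rate mentioned above
      as    = iterate (stepGD gamma) 0.0  -- start from a_0 = 0
  print (as !! 50)                        -- approaches the minimiser a = 3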