masterdezign /
Created Dec 5, 2015 — forked from yantonov/
How to install latest GHC 7.10.2 from source + stack + cabal + cabal-install on ubuntu


For your convenience, these instructions are also available as a git repo.


masterdezign / dct.hs
Created Dec 5, 2015 — forked from vi/dct.hs
Simple command-line Discrete Cosine Transform in Haskell
-- "cabal install vector-fftw split"
import qualified Numeric.FFT.Vector.Unnormalized as FFT
import Data.Vector (fromList, toList)
import Data.List.Split (splitOneOf)
import Data.List (intersperse)
import Control.Monad (forever)
import Control.Arrow ((>>>))
main = forever $
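The preview cuts off at `main`, but the transform itself is easy to illustrate. Below is a naive, unnormalized DCT-II sketch in Python; the function name `dct2` and the plain-list interface are mine, not the gist's (the gist uses the FFT-based `vector-fftw` bindings instead).

```python
import math

def dct2(xs):
    # Naive, unnormalized DCT-II: y[k] = 2 * sum_n x[n] * cos(pi*k*(2n+1)/(2N)).
    # O(N^2); FFT-based libraries such as vector-fftw compute it in O(N log N).
    n_len = len(xs)
    return [2.0 * sum(x * math.cos(math.pi * k * (2 * n + 1) / (2 * n_len))
                      for n, x in enumerate(xs))
            for k in range(n_len)]
```

For a constant signal every coefficient except the first vanishes: `dct2([1, 1, 1, 1])` is approximately `[8, 0, 0, 0]`.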
masterdezign / CrazyIO.hs
Created Dec 19, 2015 — forked from nkpart/CrazyIO.hs
CrazyIO - binary deserialization using mmaped I/O and Data.Vector.Storable
{-# LANGUAGE ScopedTypeVariables #-}
module CrazyIO (module CrazyIO, mmapFileByteString) where
import qualified Data.Vector.Storable as V
import qualified Data.ByteString as BS
import qualified Data.ByteString.Internal as BS
import Data.Int (Int64)
import Foreign
import System.IO.MMap
crazyLoad :: forall a. Storable a => FilePath -> Maybe (Int64, Int) -> IO (V.Vector a)
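For readers who don't speak Haskell, the idea behind `crazyLoad` can be sketched in Python with the standard `mmap` and `array` modules. This is a rough analogue, not the gist's API; the name `crazy_load` and the `typecode` parameter are mine.

```python
import array
import mmap

def crazy_load(path, typecode='d'):
    # Memory-map a binary file and reinterpret its raw bytes as a vector of
    # fixed-size elements, similar in spirit to viewing an mmapped ByteString
    # as a Data.Vector.Storable.
    with open(path, 'rb') as f, \
         mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        vec = array.array(typecode)
        vec.frombytes(mm)  # copies; a true zero-copy view would use memoryview
        return vec
```

Unlike the Haskell version, `array.frombytes` copies the data; the mmap only avoids reading the file through ordinary I/O buffers.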
masterdezign /
Created Feb 17, 2016 — forked from karpathy/
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
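The preview stops right where the gist turns the character set into lookup tables. A sketch of that next step, using plain Python lists in place of numpy arrays (variable names follow the gist's style, but this is my reconstruction, not the original code):

```python
# Index every distinct character so the text can be one-hot encoded for the RNN.
data = "hello world"                      # stand-in for the contents of input.txt
chars = list(set(data))
vocab_size = len(chars)
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

def one_hot(ch):
    # 1-of-k encoding of a single character (a plain list here; the gist uses
    # a numpy column vector of shape (vocab_size, 1))
    v = [0.0] * vocab_size
    v[char_to_ix[ch]] = 1.0
    return v
```

Each input character enters the network as such a 1-of-k vector, and the softmax output is a distribution over the same vocabulary.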
masterdezign /
Created Aug 31, 2016 — forked from cburgdorf/
Comparing XOR between tensorflow and keras
import numpy as np
from keras.models import Sequential
from keras.layers.core import Activation, Dense
from keras.optimizers import SGD
X = np.array([[0,0],[0,1],[1,0],[1,1]], "float32")
y = np.array([[0],[1],[1],[0]], "float32")
model = Sequential()
model.add(Dense(2, input_dim=2, activation='sigmoid'))
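The preview truncates before training, but the point of the XOR benchmark is that one hidden layer suffices. As a dependency-free illustration, here is a 2-2-1 sigmoid network with hand-picked (untrained) weights that computes XOR exactly; the weights and the name `xor_net` are mine, not from the gist.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def xor_net(x1, x2):
    # Hand-picked weights: h1 behaves like OR, h2 like AND,
    # and the output unit computes roughly "OR and not AND" = XOR.
    h1 = sigmoid(10 * x1 + 10 * x2 - 5)    # ~ x1 OR x2
    h2 = sigmoid(10 * x1 + 10 * x2 - 15)   # ~ x1 AND x2
    return round(sigmoid(10 * h1 - 10 * h2 - 5))
```

Gradient descent (as in the Keras snippet above) has to discover an equivalent set of weights on its own, which is why XOR is a classic sanity check.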
masterdezign /
Last active Jul 8, 2017
Backpropagation in Python
from random import random
from numpy import tanh, array, vectorize, transpose
class Network:
""" Class builds neural network
def __init__(self, inputs, outputs, hlayers=None, activation=tanh,
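Since the preview cuts the class short, here is a self-contained sketch of the chain rule such a class has to implement, for a minimal 1-1-1 tanh network (the names, shapes, and squared-error loss are mine, chosen for brevity, not taken from the gist):

```python
import math

def loss(w1, w2, x, t):
    # L = 0.5 * (y - t)^2 with y = w2 * tanh(w1 * x)
    return 0.5 * (w2 * math.tanh(w1 * x) - t) ** 2

def backprop(w1, w2, x, t):
    # Forward pass
    h = math.tanh(w1 * x)
    y = w2 * h
    # Backward pass: chain rule from the loss back to each weight
    dy = y - t                     # dL/dy
    dw2 = dy * h                   # dL/dw2
    dh = dy * w2                   # dL/dh
    dw1 = dh * (1.0 - h * h) * x   # dL/dw1, using tanh'(z) = 1 - tanh(z)^2
    return dw1, dw2
```

A quick way to trust such code is to compare the analytic gradients against central finite differences of the loss; they should agree to several decimal places.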
masterdezign / brainf.hs
Last active Aug 30, 2017
Brainf**k interpreter in Haskell.
Brainf**k interpreter
> Increment the data pointer so that it points to the next location in memory.
< Decrement the data pointer so that it points to the previous location in memory.
+ Increment the byte pointed to by the data pointer by 1. If it is already at its maximum value, 255, the new value is 0.
- Decrement the byte pointed to by the data pointer by 1. If it is at its minimum value, 0, the new value is 255.
. Output the character represented by the byte at the data pointer.
, Read one byte and store it at the memory location pointed to by the data pointer.
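The command list above can be turned into an interpreter in a few dozen lines. Here is a sketch in Python rather than Haskell, covering the listed commands plus the standard loop commands `[` and `]` that the preview cuts off; the name `brainf` and the 30000-cell tape size are my choices, not the gist's.

```python
def brainf(src, inp=b""):
    # Minimal Brainf**k interpreter: byte cells wrap around at 0 and 255,
    # as described in the command list above.
    tape, ptr, pos, ii, out = [0] * 30000, 0, 0, 0, []
    jumps, stack = {}, []
    for i, c in enumerate(src):          # pre-match the [ ] bracket pairs
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pos < len(src):
        c = src[pos]
        if c == '>':
            ptr += 1
        elif c == '<':
            ptr -= 1
        elif c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.':
            out.append(tape[ptr])
        elif c == ',':
            tape[ptr] = inp[ii] if ii < len(inp) else 0
            ii += 1
        elif c == '[' and tape[ptr] == 0:
            pos = jumps[pos]             # skip the loop body
        elif c == ']' and tape[ptr] != 0:
            pos = jumps[pos]             # jump back to the matching [
        pos += 1
    return bytes(out)
```

For example, `brainf("++[>+++<-]>.")` multiplies 2 by 3 via a loop and outputs `bytes([6])`.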
masterdezign / gist:5eb03e834299b8b67750
Last active Nov 15, 2017
Remark.js+MathJax essentials
<!DOCTYPE html>
<title>My Presentation</title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<style type="text/css">
.red { color: red }
masterdezign / Grad.hs
Last active Dec 10, 2018
A brief gradient descent illustration in Haskell
-- Run a one-dimensional gradient descent written in Clash [1],
-- a high-level language that compiles to Verilog and VHDL.
-- Gradient descent can be described by the formula:
--   a_{n+1} = a_n - gamma * Grad F(a_n),
-- where the constant `gamma` is what is referred to in deep learning as
-- the learning rate.
-- [1]
descent1D gradF iterN gamma x0 = take iterN (iterate (_descend gamma) x0)
  where _descend gamma' x = x - gamma' * gradF x
-- Suppose, we have a function F(x) = (x - 3)^2.
-- Therefore, Grad F(x) = 2 * (x - 3).
gradF_test x = 2 * (x - 3)
main = print (descent1D gradF_test 10 gamma 0.0)
where gamma = 0.5
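The same iteration is easy to mirror in Python for readers who want to experiment outside Haskell; this sketch follows `descent1D` above, with my own snake_case names:

```python
def descent_1d(grad_f, iter_n, gamma, x0):
    # Iterate a_{n+1} = a_n - gamma * Grad F(a_n), keeping the whole
    # trajectory (iter_n points, starting with x0), like `take iterN (iterate ...)`.
    xs = [x0]
    for _ in range(iter_n - 1):
        xs.append(xs[-1] - gamma * grad_f(xs[-1]))
    return xs
```

For F(x) = (x - 3)^2 with gamma = 0.5 the minimum is reached in a single step, since x - 0.5 * 2 * (x - 3) = 3 for any x; smaller learning rates converge geometrically instead.

```python
print(descent_1d(lambda x: 2 * (x - 3), 10, 0.5, 0.0))
```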