@BrianDo2005
BrianDo2005 / pydata_dc_2016_vi_in_python.ipynb
Created June 4, 2018 00:23 — forked from AustinRochford/pydata_dc_2016_vi_in_python.ipynb
PyData DC 2016 Variational Inference in Python
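The forked notebook is not rendered on this page. As a rough stand-in for the topic, here is a minimal mean-field ADVI fit in PyMC3, the kind of variational inference such a talk covers; the model, data, and settings below are illustrative assumptions, not content recovered from the notebook.

import numpy as np
import pymc3 as pm

# toy data: noisy observations with unknown mean and scale
np.random.seed(42)
y = np.random.normal(1.0, 2.0, size=500)

with pm.Model():
    mu = pm.Normal('mu', mu=0.0, sd=10.0)
    sigma = pm.HalfNormal('sigma', sd=5.0)
    pm.Normal('obs', mu=mu, sd=sigma, observed=y)

    # fit a factorized Gaussian approximation to the posterior with ADVI
    approx = pm.fit(n=20000, method='advi')

# sample from the fitted approximation instead of running MCMC
trace = approx.sample(2000)
print(trace['mu'].mean(), trace['sigma'].mean())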
@BrianDo2005
BrianDo2005 / walkthrough.md
Created June 2, 2018 20:14 — forked from Alexhuszagh/walkthrough.md
NEM NIS API Walkthrough
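The walkthrough file itself is not shown in this preview. As a hedged illustration of the kind of call it walks through, the NIS REST API is queried over plain HTTP, by default on port 7890 of a running NIS node; the endpoints below come from the public NIS documentation rather than the walkthrough, and the address is a placeholder.

import requests

NIS_URL = "http://localhost:7890"          # assumes a local NIS node
ADDRESS = "REPLACE-WITH-A-NEM-ADDRESS"     # placeholder, not a real account

# current block height of the chain the node is synced to
print(requests.get(NIS_URL + "/chain/height").json())

# balance and metadata for a single account
print(requests.get(NIS_URL + "/account/get", params={"address": ADDRESS}).json())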
import time
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation, Reshape, Flatten
from keras.layers import Conv2D, Conv2DTranspose
from keras.optimizers import RMSprop
from keras import regularizers

def init_model():
    # time how long building and compiling the network takes
    start_time = time.time()
    print('Compiling Model ... ')
    model = Sequential()
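    # Hypothetical continuation: the preview stops at Sequential(), so the
    # layer stack below is an illustrative sketch built only from the layers
    # imported above, not the gist's actual architecture.
    model.add(Dense(128 * 7 * 7, activation='relu', input_dim=100,
                    kernel_regularizer=regularizers.l2(1e-4)))
    model.add(Reshape((7, 7, 128)))
    model.add(Conv2DTranspose(64, (5, 5), strides=2, padding='same',
                              activation='relu'))
    model.add(Dropout(0.3))
    model.add(Conv2DTranspose(1, (5, 5), strides=2, padding='same',
                              activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer=RMSprop(lr=1e-4))
    print('Model compiled in {0:.2f} seconds'.format(time.time() - start_time))
    return model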
@BrianDo2005
BrianDo2005 / dropout_bayesian_approximation_tensorflow.py
Created March 29, 2018 05:37 — forked from VikingPenguinYT/dropout_bayesian_approximation_tensorflow.py
Implementing Dropout as a Bayesian Approximation in TensorFlow
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
from tensorflow.contrib.distributions import Bernoulli
class VariationalDense:
    """Variational Dense Layer Class"""

    def __init__(self, n_in, n_out, model_prob, model_lam):
        # model_prob: probability of keeping a unit in the Bernoulli dropout mask
        # model_lam: regularization strength for the layer's weights
        self.model_prob = model_prob
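The preview stops inside the constructor. Separately from the gist's tf.contrib implementation, the core idea, keeping the Bernoulli dropout masks active at prediction time and averaging many stochastic forward passes, can be sketched with the modern tf.keras API as follows; the toy model and data are assumptions for illustration.

import numpy as np
import tensorflow as tf

# small regression net with dropout after each hidden layer
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(1,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

x = np.linspace(-3.0, 3.0, 200, dtype='float32').reshape(-1, 1)
y = np.sin(x) + 0.1 * np.random.randn(200, 1).astype('float32')
model.fit(x, y, epochs=50, verbose=0)

# training=True keeps dropout on at inference, so each pass is a stochastic
# sample; the spread across passes is the MC-dropout uncertainty estimate
samples = np.stack([model(x, training=True).numpy() for _ in range(100)])
pred_mean = samples.mean(axis=0)
pred_std = samples.std(axis=0)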
import numpy as np
from keras.layers import Input, Dense
from keras.models import Model
from keras.datasets import mnist
import matplotlib.pyplot as plt
# this is the size of our encoded representations
encoding_dim = 32  # 32 floats -> compression factor of 24.5, assuming the input is 784 floats

# this is our input placeholder
input_img = Input(shape=(784,))
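The preview ends at the input layer. A sketch of how this snippet typically continues, following the well-known Keras autoencoder tutorial it appears to be forked from (the exact file contents are not shown here):

encoded = Dense(encoding_dim, activation='relu')(input_img)   # 784 -> 32
decoded = Dense(784, activation='sigmoid')(encoded)           # 32 -> 784

autoencoder = Model(input_img, decoded)
autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')

# train on flattened, rescaled MNIST digits, reconstructing the inputs themselves
(x_train, _), (x_test, _) = mnist.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_train = x_train.reshape((len(x_train), 784))
x_test = x_test.reshape((len(x_test), 784))

autoencoder.fit(x_train, x_train, epochs=50, batch_size=256,
                shuffle=True, validation_data=(x_test, x_test))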
from keras.layers import Input, Dense, Conv2D, MaxPooling2D, UpSampling2D
from keras.models import Model
from keras import backend as K
from keras.datasets import mnist
import numpy as np
from keras.callbacks import TensorBoard
import matplotlib.pyplot as plt
input_img = Input(shape=(28, 28, 1)) # adapt this if using `channels_first` image data format
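This preview also stops at the input layer. A sketch of how a convolutional autoencoder on 28x28x1 inputs is typically completed, again following the Keras autoencoder tutorial rather than the unrendered file itself:

x = Conv2D(16, (3, 3), activation='relu', padding='same')(input_img)
x = MaxPooling2D((2, 2), padding='same')(x)
x = Conv2D(8, (3, 3), activation='relu', padding='same')(x)
x = MaxPooling2D((2, 2), padding='same')(x)
x = Conv2D(8, (3, 3), activation='relu', padding='same')(x)
encoded = MaxPooling2D((2, 2), padding='same')(x)   # 4 x 4 x 8 bottleneck

x = Conv2D(8, (3, 3), activation='relu', padding='same')(encoded)
x = UpSampling2D((2, 2))(x)
x = Conv2D(8, (3, 3), activation='relu', padding='same')(x)
x = UpSampling2D((2, 2))(x)
x = Conv2D(16, (3, 3), activation='relu')(x)        # no padding: 16x16 -> 14x14
x = UpSampling2D((2, 2))(x)
decoded = Conv2D(1, (3, 3), activation='sigmoid', padding='same')(x)

autoencoder = Model(input_img, decoded)
autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')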
@BrianDo2005
BrianDo2005 / min-char-rnn.py
Created March 10, 2018 06:21 — forked from karpathy/min-char-rnn.py
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
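The preview stops after the data is loaded. The script continues by mapping characters to indices, initializing the RNN parameters, and unrolling a tanh recurrence; the lines below follow Karpathy's original closely, with the forward step condensed into a helper function that is not in the original file.

char_to_ix = { ch: i for i, ch in enumerate(chars) }
ix_to_char = { i: ch for i, ch in enumerate(chars) }

# hyperparameters
hidden_size = 100   # size of hidden layer of neurons
seq_length = 25     # number of steps to unroll the RNN for
learning_rate = 1e-1

# model parameters
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01   # input to hidden
Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden to hidden
Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden to output
bh = np.zeros((hidden_size, 1))   # hidden bias
by = np.zeros((vocab_size, 1))    # output bias

def rnn_step(x, h_prev):
    """One RNN step: new hidden state and a softmax over the next character."""
    h = np.tanh(np.dot(Wxh, x) + np.dot(Whh, h_prev) + bh)
    y = np.dot(Why, h) + by
    p = np.exp(y) / np.sum(np.exp(y))
    return h, p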
@BrianDo2005
BrianDo2005 / ml-recs.md
Created March 4, 2018 03:11 — forked from bsletten/ml-recs.md
Machine Learning Path Recommendations

This is an incomplete, ever-changing curated list of content to assist people entering the worlds of Data Science and Machine Learning. If you have a recommendation for something to add, please let me know. If something isn't here, that doesn't mean I don't recommend it; I may simply not have had a chance to review it yet.

I will generally list things in order from easier to more formal/challenging content.

It may feel like there is an overwhelming amount of stuff for you to learn (because there is), but there is a guided path that will get you there in time. You need to focus on Linear Algebra, Calculus, Statistics, and probably Python (or R). Your best bet is to get a Safari Books Online account (https://www.safaribooksonline.com), which you may already have access to through school or work. If not, it is a reasonable way to get access to a tremendous number of books and videos.

I'm not saying you will get what you need out of everything here, but I have read/watched at least some of all of the following an