
King Chung (Johnny) Ho johnny5550822

  • UCLA
  • Los Angeles
@johnny5550822
johnny5550822 / pg-pong.py
Created June 2, 2016 00:41 — forked from karpathy/pg-pong.py
Training a Neural Network ATARI Pong agent with Policy Gradients from raw pixels
""" Trains an agent with (stochastic) Policy Gradients on Pong. Uses OpenAI Gym. """
import numpy as np
import pickle  # the original gist used Python 2's cPickle
import gym
# hyperparameters
H = 200 # number of hidden layer neurons
batch_size = 10 # every how many episodes to do a param update?
learning_rate = 1e-4
gamma = 0.99 # discount factor for reward
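The excerpt above stops at the hyperparameters; the part of the gist that actually uses `gamma` is the discounted-return computation. A minimal sketch of that step (following the Pong-specific convention of resetting the running sum whenever a point is scored):

```python
import numpy as np

gamma = 0.99  # discount factor for reward, as in the hyperparameters above

def discount_rewards(r):
    """Compute discounted returns over an episode's reward sequence.
    The running sum is reset at any non-zero reward, since in Pong a
    non-zero reward marks a game boundary."""
    discounted = np.zeros_like(r, dtype=float)
    running = 0.0
    for t in reversed(range(len(r))):
        if r[t] != 0:
            running = 0.0  # reset the sum at a game boundary (Pong-specific)
        running = running * gamma + r[t]
        discounted[t] = running
    return discounted

print(discount_rewards(np.array([0.0, 0.0, 1.0])))  # [0.9801 0.99   1.    ]
```

These discounted returns are then standardized and used to scale the policy gradient, which is what makes actions preceding a win more likely and actions preceding a loss less likely.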
@johnny5550822
johnny5550822 / gist:6785f75611e00ba88498
Last active October 22, 2015 19:16
A simple demo to train a CNN for digit classification with gpu
-- ###############
-- a version with GPU support. Assuming your gpu is working properly.
-- This is a minimal implementation (e.g., no batch processing) of a CNN on digit (0-9) classification. This code is largely obtained from luacnn, https://github.com/hpenedones/luacnn
-- The training data can also be obtained from https://github.com/hpenedones/luacnn
-- ###############
require "torch"
@johnny5550822
johnny5550822 / char-rnn nba-archive.md
Last active November 9, 2015 23:45
An automatically generated NBA archive by char-rnn

Can you imagine a computer generating an NBA article?

The following are some NBA articles generated fully automatically by char-rnn, a recurrent-neural-network library by Andrej Karpathy [link]. The library is easy to use and very user-friendly. You should try it! :)

Basically, I wrote a Python script [link] to extract past archives, and used those as the training set for the recurrent neural network.

The articles below were generated by a network trained on roughly 2 million characters (an okay size, though not really big enough). You can see that the generated articles contain artificial author names, quotes, etc., much like a real NBA archive (the logic still has to be improved, but it is FUN).

You can tune the parameters and train on an even bigger dataset using my script, and you will probably get better results! Have fun :)

@johnny5550822
johnny5550822 / parse_espn_nba_archive.py
Last active August 29, 2015 14:27
A sample script to automatically parse the NBA news archive on ESPN.com (text only).
# A simple script to parse ESPN nba archive in a specified year with specified months, Code updated: 08/20/2015
import urllib2
from bs4 import BeautifulSoup # a very good python library for parsing html
import time
import re
# parameters
year = 2015
months = ['january','August'] # the service is only available in these two months
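The BeautifulSoup pass itself is not shown in this excerpt. As a self-contained stand-in, here is a sketch of the same idea using only the standard library's `html.parser` (the HTML fragment and the `HeadlineParser` class are hypothetical, not from the gist):

```python
from html.parser import HTMLParser

class HeadlineParser(HTMLParser):
    """Collects the text inside <a> tags, roughly what iterating over
    soup.find_all('a') in BeautifulSoup would give on an archive page."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            self.in_link = True

    def handle_endtag(self, tag):
        if tag == 'a':
            self.in_link = False

    def handle_data(self, data):
        if self.in_link and data.strip():
            self.headlines.append(data.strip())

# hypothetical fragment of an archive listing page
page = '<ul><li><a href="/story1">Lakers win opener</a></li></ul>'
parser = HeadlineParser()
parser.feed(page)
print(parser.headlines)  # ['Lakers win opener']
```

BeautifulSoup makes the same extraction a one-liner and is far more forgiving of malformed HTML, which is why the gist depends on it.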
@johnny5550822
johnny5550822 / gist:aaeab402279315057a61
Last active August 29, 2015 14:27 — forked from karpathy/gist:587454dc0146a6ae21fc
An efficient, batched LSTM.
"""
This is a batched LSTM forward and backward pass
"""
import numpy as np
import code
class LSTM:
@staticmethod
def init(input_size, hidden_size, fancy_forget_bias_init = 3):
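The body of `init` is cut off in this preview. A sketch consistent with the signature: a single weight matrix holds all four gates, and the "fancy" part is starting the forget-gate biases at a positive value so the cell remembers by default early in training (the exact layout here is an assumption about the truncated code):

```python
import numpy as np

def lstm_init(input_size, hidden_size, fancy_forget_bias_init=3):
    """Initialize one combined LSTM weight matrix.
    Rows: [bias row, input rows, hidden rows]; columns hold the four
    gates side by side, each hidden_size wide."""
    WLSTM = np.random.randn(input_size + hidden_size + 1, 4 * hidden_size)
    WLSTM /= np.sqrt(input_size + hidden_size)  # scale by fan-in
    WLSTM[0, :] = 0  # biases start at zero...
    if fancy_forget_bias_init != 0:
        # ...except the forget-gate biases, which start positive
        WLSTM[0, hidden_size:2 * hidden_size] = fancy_forget_bias_init
    return WLSTM

W = lstm_init(10, 4)
print(W.shape)  # (15, 16)
```

Positive forget biases keep the forget gate's sigmoid near 1 at the start, which helps gradients flow through the cell state over long sequences.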
@johnny5550822
johnny5550822 / min-char-rnn.py
Last active August 29, 2015 14:27 — forked from karpathy/min-char-rnn.py
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
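The preview ends at the data I/O. The next step the full gist takes is building character/index lookup tables so text can be fed to the network as integer indices; a self-contained sketch (using an inline string in place of `input.txt`):

```python
# tiny stand-in for input.txt so the snippet runs on its own
data = "hello world"
chars = sorted(set(data))  # sorted for a deterministic ordering
data_size, vocab_size = len(data), len(chars)

# index <-> character lookup tables
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

print(data_size, vocab_size)  # 11 8
print(ix_to_char[char_to_ix['h']])  # h
```

Each character is then one-hot encoded by its index, which is the input representation the vanilla RNN trains on.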