""" Trains an agent with (stochastic) Policy Gradients on Pong. Uses OpenAI Gym. """ | |
import numpy as np | |
import cPickle as pickle | |
import gym | |
# hyperparameters | |
H = 200 # number of hidden layer neurons | |
batch_size = 10 # every how many episodes to do a param update? | |
learning_rate = 1e-4 | |
gamma = 0.99 # discount factor for reward |
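The snippet above is cut off after the hyperparameters. As a hedged illustration of how a discount factor such as gamma = 0.99 is typically used in this kind of policy-gradient setup, here is a minimal NumPy sketch of discounting an episode's rewards; the function name and the Pong-specific rule of resetting the running sum at a nonzero reward (a game boundary) are assumptions, not the gist's own code.

import numpy as np

def discount_rewards(rewards, gamma=0.99):
    """Work backwards through the episode, accumulating gamma-discounted returns."""
    rewards = np.asarray(rewards, dtype=np.float64)
    discounted = np.zeros_like(rewards)
    running_add = 0.0
    for t in reversed(range(len(rewards))):
        if rewards[t] != 0:
            running_add = 0.0  # assumed Pong-specific reset: a nonzero reward ends a game
        running_add = running_add * gamma + rewards[t]
        discounted[t] = running_add
    return discounted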
package bbejeck.grouping
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import scala.collection.mutable
/**
 * Created by bbejeck on 8/6/15.
 * Example usage of combineByKey
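The Scala example is truncated before the combineByKey call itself. For a sense of the API, here is a hedged sketch in PySpark (not the gist's Scala) that averages values per key; all names and data are illustrative.

from pyspark import SparkContext

sc = SparkContext("local[*]", "combineByKeyExample")
pairs = sc.parallelize([("a", 1.0), ("a", 3.0), ("b", 2.0)])

sums_and_counts = pairs.combineByKey(
    lambda v: (v, 1),                         # createCombiner: start a (sum, count) for a new key
    lambda acc, v: (acc[0] + v, acc[1] + 1),  # mergeValue: fold a value into a partition-local combiner
    lambda a, b: (a[0] + b[0], a[1] + b[1]))  # mergeCombiners: merge combiners across partitions

averages = sums_and_counts.mapValues(lambda p: p[0] / p[1]).collectAsMap()
print(averages)  # {'a': 2.0, 'b': 2.0}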
package bbejeck.grouping
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import scala.collection.mutable
/**
 * Created by bbejeck on 7/31/15.
 *
# XLA compilation controlled by "compile_ops" option
# compile_ops=False: 4.39 sec
# compile_ops=True: 0.90 sec
import os
os.environ['CUDA_VISIBLE_DEVICES'] = ''
import tensorflow as tf
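The snippet stops before showing how compile_ops is wired up, and it appears to predate TensorFlow 2, so treating the flag as a switch around XLA JIT compilation is an assumption. A minimal sketch with the current tf.function API:

import os
os.environ['CUDA_VISIBLE_DEVICES'] = ''  # CPU only, as in the snippet above
import tensorflow as tf

compile_ops = True  # flip to compare compiled vs. uncompiled timings

@tf.function(jit_compile=compile_ops)  # requests XLA compilation of this function
def step(x, w):
    return tf.linalg.matmul(x, w)

x = tf.random.normal([1024, 1024])
w = tf.random.normal([1024, 1024])
print(step(x, w).shape)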
import org.apache.spark.{AccumulableParam, SparkConf}
import org.apache.spark.serializer.JavaSerializer
import scala.collection.mutable.{ HashMap => MutableHashMap }
/*
 * Allows a mutable HashMap[String, Int] to be used as an accumulator in Spark.
 * Whenever we try to put (k, v2) into an accumulator that already contains (k, v1), the result
 * will be a HashMap containing (k, v1 + v2).
 *
 * Would have been nice to extend GrowableAccumulableParam instead of redefining everything, but it's
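The Scala implementation is cut off after its header comment. The merge rule it describes, adding (k, v2) to an existing (k, v1) to get (k, v1 + v2), can be sketched with PySpark's AccumulatorParam; this is an analogous pattern rather than the gist's AccumulableParam code, and the class and app names are made up.

from pyspark import SparkContext
from pyspark.accumulators import AccumulatorParam

class DictSumParam(AccumulatorParam):
    """Dict accumulator: values for the same key are summed when merged."""
    def zero(self, value):
        return {}
    def addInPlace(self, d1, d2):
        for k, v in d2.items():
            d1[k] = d1.get(k, 0) + v
        return d1

sc = SparkContext("local[*]", "dictAccumulatorExample")
counts = sc.accumulator({}, DictSumParam())

sc.parallelize(["a", "b", "a"]).foreach(lambda w: counts.add({w: 1}))
print(counts.value)  # {'a': 2, 'b': 1}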
// This program converts a set of vector<float>'s to an lmdb/leveldb by storing
// them as Datum proto buffers.
// Usage:
//   convert_vector [FLAGS] LISTFILE DB_NAME
//
// where LISTFILE should be a list of files together with the accompanying vector
// of floats, in the following format:
//   subfolder1/file1.JPEG 0.2 0.3 0.1 0.25 0.15
//   ....
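The usage comment describes the LISTFILE format: a file path followed by whitespace-separated floats. Purely as an illustration (the tool itself is C++), parsing one such line looks like this; the helper name is made up.

def parse_listfile_line(line):
    """Split a LISTFILE line into (path, list of floats)."""
    parts = line.split()
    return parts[0], [float(v) for v in parts[1:]]

path, values = parse_listfile_line("subfolder1/file1.JPEG 0.2 0.3 0.1 0.25 0.15")
print(path, values)  # subfolder1/file1.JPEG [0.2, 0.3, 0.1, 0.25, 0.15]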