@ceteri
Last active Jul 7, 2020
import ray
from ray.rllib.agents import ppo

ray.init(ignore_reinit_error=True)

SELECT_ENV = "MountainCar-v0"

# start from RLlib's default PPO config, then override selected keys
config = ppo.DEFAULT_CONFIG.copy()
config["log_level"] = "WARN"
config["num_workers"] = 4               # default = 2
config["train_batch_size"] = 10000      # default = 4000
config["sgd_minibatch_size"] = 256      # default = 128
config["evaluation_num_episodes"] = 50  # default = 10

agent = ppo.PPOTrainer(config, env=SELECT_ENV)
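The snippet relies on the copy-and-override pattern: copy the default config dict, then reassign individual keys so the shared defaults are never mutated. A minimal stand-alone sketch of that pattern (using a stand-in defaults dict, not RLlib's actual `DEFAULT_CONFIG`):

```python
# Stand-in for a library's default config; values here are illustrative,
# not RLlib's real defaults.
DEFAULT_CONFIG = {
    "num_workers": 2,
    "train_batch_size": 4000,
    "sgd_minibatch_size": 128,
}

# .copy() makes a shallow copy, so overrides do not touch the defaults.
config = DEFAULT_CONFIG.copy()
config["num_workers"] = 4
config["train_batch_size"] = 10000

print(DEFAULT_CONFIG["num_workers"])  # → 2 (defaults unchanged)
print(config["num_workers"])          # → 4 (override applied)
```

Note that `dict.copy()` is shallow: if a config value were itself a nested dict, mutating it through the copy would also change the default, so nested values should be copied separately before modification.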