This is a tiny update to https://gist.github.com/joschu/a21ed1259d3f8c7bdff178fb47bc6fc1#file-1-cem-v0-writeup-md
- I ran experiments on the v1 MuJoCo environments
- I reduced the added-noise parameter `extra_std` from `0.01` to `0.001`
I used the cross-entropy method (an evolutionary algorithm / derivative-free optimization method) to optimize small two-layer neural networks.
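For concreteness, here is a minimal sketch of the noisy cross-entropy method in the spirit of [1]. This is not the code from modular_rl: the function name `cem`, the score function `f`, and the arguments `n_iters`, `elite_frac`, `initial_std`, and `extra_decay_time` are placeholders with illustrative defaults; only `batch_size` and `extra_std` correspond to the settings described below.

```python
import numpy as np

def cem(f, theta_mean, batch_size=200, n_iters=100, elite_frac=0.2,
        initial_std=1.0, extra_std=0.001, extra_decay_time=100):
    """Noisy cross-entropy method over a Gaussian search distribution (sketch only).

    f          : maps a flat parameter vector to a scalar score (e.g. episode return)
    theta_mean : initial mean of the search distribution (1-D numpy array)
    extra_std  : extra exploration noise whose square is added to the variance,
                 decaying over extra_decay_time iterations, as in [1]
    """
    theta_std = np.full_like(theta_mean, initial_std)
    n_elite = int(round(batch_size * elite_frac))
    for it in range(n_iters):
        # Extra noise keeps the search distribution from collapsing prematurely.
        extra = extra_std * max(1.0 - it / float(extra_decay_time), 0.0)
        sample_std = np.sqrt(theta_std ** 2 + extra ** 2)
        # Sample a batch of candidate parameter vectors and score each one.
        thetas = theta_mean + sample_std * np.random.randn(batch_size, theta_mean.size)
        scores = np.array([f(th) for th in thetas])
        # Refit the Gaussian to the elite (top-scoring) candidates.
        elite = thetas[np.argsort(scores)[-n_elite:]]
        theta_mean, theta_std = elite.mean(axis=0), elite.std(axis=0)
        print("iter %3d  mean score %8.2f  best %8.2f" % (it, scores.mean(), scores.max()))
    return theta_mean
```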
Code used to obtain these results can be found at https://github.com/joschu/modular_rl, commit ba42955b41d7f419470a95d875af1ab7e7ee66fc. The command-line expressions used for all the environments can be found in the text file below. Note that the exact same parameters were used for all tasks. The important parameters are listed below, followed by a rough sketch of how they fit together:
- `hid_sizes=10,5`: hidden layer sizes of the MLP
- `extra_std=0.001`: noise added to the variance, see [1]
- `batch_size=200`: number of episodes per batch
- `seed=0`: random seed
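As a rough illustration of how `hid_sizes` and `batch_size` enter the picture (again a hedged sketch, not modular_rl's implementation: `MLPPolicy` and `episode_return` are made-up names, and the environment is assumed to follow the Gym `reset()`/`step()` interface of that era), the flat parameter vector sampled by CEM is unpacked into the weights of a small tanh MLP, and each candidate is scored by rolling out an episode:

```python
import numpy as np

class MLPPolicy:
    """Two-hidden-layer tanh policy parameterized by one flat vector (sketch only)."""
    def __init__(self, ob_dim, ac_dim, hid_sizes=(10, 5)):
        sizes = [ob_dim] + list(hid_sizes) + [ac_dim]
        # One (weight matrix, bias vector) pair per layer.
        self.shapes = []
        for n_in, n_out in zip(sizes[:-1], sizes[1:]):
            self.shapes += [(n_in, n_out), (n_out,)]
        self.n_params = sum(int(np.prod(s)) for s in self.shapes)

    def set_from_flat(self, theta):
        """Unpack the flat vector sampled by CEM into weight matrices and bias vectors."""
        self.params, i = [], 0
        for s in self.shapes:
            n = int(np.prod(s))
            self.params.append(theta[i:i + n].reshape(s))
            i += n

    def act(self, ob):
        h = np.asarray(ob)
        for j in range(0, len(self.params) - 2, 2):
            h = np.tanh(h.dot(self.params[j]) + self.params[j + 1])
        return h.dot(self.params[-2]) + self.params[-1]  # linear output layer

def episode_return(env, policy, theta, max_steps=1000):
    """Score one candidate parameter vector by the return of a single episode."""
    policy.set_from_flat(theta)
    ob, total = env.reset(), 0.0
    for _ in range(max_steps):
        ob, rew, done, _ = env.step(policy.act(ob))
        total += rew
        if done:
            break
    return total
```

With `batch_size=200`, each iteration of the CEM loop sketched above evaluates 200 such episodes, e.g. `cem(lambda th: episode_return(env, policy, th), np.zeros(policy.n_params))`.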
The program is single-threaded and deterministic. I used `float32` precision, with `THEANO_FLAGS=floatX=float32`.
The following commands will let you conveniently run all of the experiments at once.
- Find a computer with many CPUs.
- If it's a headless computer, `sudo apt-get install xvfb`. Then type `xvfb-run /bin/bash -s "-screen 0 1400x900x24"` to enter a shell where all your commands will benefit from a fake monitor provided by xvfb.
- Navigate into the `modular-rl` directory. Then set the environment variables: `export THEANO_FLAGS=floatX=float32; export outdir=/YOUR/PATH/HERE; export NUM_CPUS=YOUR_NUMBER_OF_CPUS`
- Move `2-cem-scripts.txt` into the `modular-rl` directory.
- Run all experiments with the following command: `cat 2-cem-scripts.txt | xargs -n 1 -P $NUM_CPUS bash -c`
You can also set `--video=0` in these scripts to disable video recording. If video is disabled, you won't need the xvfb commands.
[1] Szita, István, and András Lörincz. "Learning Tetris using the noisy cross-entropy method." Neural computation 18.12 (2006): 2936-2941.