# Doing torch-rnn via Docker

## Setting up torch-rnn using Docker

Start Docker (via the Docker Quickstart Terminal).

Get the image and open a shell inside it:

docker run --rm -ti crisbal/torch-rnn:base bash

Ok, great. Now mount my data into the container (run this from the host shell, not from inside the container):

docker run -i -t -v /Users/shawngraham/generativenovel/data:/root/torch-rnn/data crisbal/torch-rnn:base

Then preprocess:

python scripts/preprocess.py \
--input_txt data/input.txt \
--output_h5 data/input.h5 \
--output_json data/input.json
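Under the hood, preprocessing builds a character vocabulary and encodes the whole corpus as integers before writing the HDF5/JSON pair. A minimal sketch of the idea (this is illustrative, not torch-rnn's actual `preprocess.py`; the function names are mine):

```python
import json

def build_vocab(text):
    """Map each distinct character to an integer id,
    in the style of torch-rnn's character-level preprocessing."""
    chars = sorted(set(text))
    # 1-based ids, as torch-rnn uses for Lua-side indexing
    return {ch: i + 1 for i, ch in enumerate(chars)}

def encode(text, token_to_idx):
    """Encode the corpus as a flat list of integer ids."""
    return [token_to_idx[ch] for ch in text]

text = "a novel of adventure"
vocab = build_vocab(text)
ids = encode(text, vocab)
print(len(vocab))   # number of distinct characters
print(json.dumps({"token_to_idx": vocab})[:40])
```

The JSON file carries the character-to-id mapping so `sample.lua` can decode ids back into text later.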

And train (`-gpu -1` forces CPU-only mode):

th train.lua \
-input_h5 data/input.h5 \
-input_json data/input.json \
-gpu -1
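Training consumes the encoded corpus in fixed-length chunks (torch-rnn's `-seq_length` option), predicting the next character at each step. A rough sketch of how (input, target) pairs are cut from the id stream — illustrative only, not `train.lua`'s actual batching code:

```python
def make_sequences(ids, seq_length):
    """Split an encoded corpus into (input, target) pairs where the
    target is the input shifted one step ahead, i.e. next-character
    prediction."""
    pairs = []
    for start in range(0, len(ids) - seq_length, seq_length):
        x = ids[start:start + seq_length]
        y = ids[start + 1:start + seq_length + 1]
        pairs.append((x, y))
    return pairs

pairs = make_sequences(list(range(10)), 3)
print(pairs[0])  # ([0, 1, 2], [1, 2, 3])
```

The loss at each position compares the model's predicted distribution against the shifted target character.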

I let it run for several thousand iterations but, being impatient, then stopped it. Now to sample!

th sample.lua -checkpoint cv/checkpoint_10000.t7 -length 2000 -gpu -1

Or, in my case:

th sample.lua -checkpoint cv/checkpoint_23000.t7 -length 200000 -gpu -1 -start_text "a novel of adventure of the tiny archaeologists" -temperature 0.675 -sample 1 > data/novel.txt

This generates roughly 50,000 words of text.
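The `-temperature 0.675` flag rescales the model's output distribution before each character is drawn: values below 1 sharpen it toward the most likely characters, giving safer but less surprising text. A sketch of the standard trick (illustrative; `sample.lua` does this on the GPU/CPU tensor side):

```python
import math
import random

def sample_with_temperature(logprobs, temperature, rng=random.random):
    """Divide log-probabilities by the temperature, renormalize with a
    softmax, and draw one index from the resulting distribution.
    Temperature 1.0 leaves the distribution unchanged; lower values
    concentrate probability on the likeliest characters."""
    scaled = [lp / temperature for lp in logprobs]
    m = max(scaled)                       # subtract max for numerical stability
    exp = [math.exp(s - m) for s in scaled]
    total = sum(exp)
    probs = [e / total for e in exp]
    r = rng()                             # inverse-CDF sampling
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1
```

At temperature 0.675 the model still varies its output, but rare characters (and hence rare spellings) are picked noticeably less often than at 1.0.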

Also handy (copies the checkpoints into the mounted data directory, so they persist on the host):

cp -R cv data/cv

I came back later and retrained it with this:

th train.lua -input_h5 input.h5 -input_json input.json -model_type rnn -num_layers 3 -rnn_size 256 -gpu -1
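The `-num_layers 3 -rnn_size 256` flags mainly set model capacity. A back-of-envelope parameter count for a stack of vanilla RNN layers of this shape (a rough sketch assuming the standard `h_t = tanh(W_x * x + W_h * h + b)` recurrence; it ignores the input embedding and output projection layers):

```python
def rnn_param_count(rnn_size, num_layers, input_size):
    """Count weights in a stack of vanilla RNN layers: each layer has
    an input-to-hidden matrix, a hidden-to-hidden matrix, and a bias."""
    total = 0
    in_dim = input_size
    for _ in range(num_layers):
        total += in_dim * rnn_size      # input-to-hidden weights
        total += rnn_size * rnn_size    # hidden-to-hidden weights
        total += rnn_size               # bias
        in_dim = rnn_size               # next layer reads this layer's output
    return total

# e.g. three layers of size 256 fed by a 256-dimensional input
print(rnn_param_count(256, 3, 256))
```

Bigger `rnn_size` grows the count quadratically, which is why training on CPU (`-gpu -1`) slows down quickly as you scale these flags up.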


Hmmm. Should probably run that container with the --rm flag?
