CTPU engines with PyTorch
gcloud config set compute/zone us-central1-f   # set the compute zone
ctpu up --tpu-size=v3-8 --machine-type n1-standard-8   # create a v3-8 TPU (8 cores) and an n1-standard-8 VM

TPU successfully created? Connect to the gcloud VM instance (if you don't have one, create one).

docker pull gcr.io/tpu-pytorch/xla:r0.5   # pull the PyTorch/XLA image onto the VM
docker run -it --shm-size 16G gcr.io/tpu-pytorch/xla:r0.5   # start the container with enough shared memory

To launch the training, point XRT at the TPU's internal IP and run the MNIST test script:

(pytorch) root@CONTAINERID:/$ export XRT_TPU_CONFIG="tpu_worker;0;10.240.1.2:8470"
(pytorch) root@CONTAINERID:/$ python pytorch/xla/test/test_train_mnist.py
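The XRT_TPU_CONFIG string exported above follows the pattern "worker_name;task_index;host:port". A minimal sketch of building it in Python (the helper name is hypothetical, not part of PyTorch/XLA; the format and default port 8470 come from the command above):

```python
# Hypothetical helper illustrating the XRT_TPU_CONFIG format:
# "<worker_name>;<task_index>;<host>:<port>", e.g. "tpu_worker;0;10.240.1.2:8470".
def make_xrt_tpu_config(tpu_ip, port=8470, worker="tpu_worker", task=0):
    """Build the XRT_TPU_CONFIG value for a single TPU worker."""
    return f"{worker};{task};{tpu_ip}:{port}"

if __name__ == "__main__":
    # Matches the export line used in the container above.
    print(make_xrt_tpu_config("10.240.1.2"))  # tpu_worker;0;10.240.1.2:8470
```

Replace 10.240.1.2 with your own TPU's internal IP (shown by ctpu status or the Cloud Console).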