Using vast.ai with OpenAI Jukebox

vast.ai is an easy-to-use and comparatively cheap service that lets you rent GPU compute from individual hosts at a much lower hourly price than almost every other cloud provider. In this guide, I will walk you through using vast.ai to quickly generate content with OpenAI Jukebox - the GPUs available can be much faster than those provided by Google Colab, while still being reasonably priced.

The .ipynb notebook is based on this Colab notebook: https://colab.research.google.com/github/SMarioMan/jukebox/blob/master/jukebox/Interacting_with_Jukebox.ipynb

  1. Go to vast.ai and create an account. You will also need to provide your payment info.
    • You may or may not receive the "few minutes of trial credit" advertised in the top right - if you don't get it, you can ask about it through their live chat.
  2. Next, set up the instance configuration. Make sure you are on the "Create" page and look to the left of the list of GPUs. Set the disk space slider to around 32GB, then press "Edit Image & Config..."
  3. You will see a variety of options. Scroll down and press the "Select" button next to pytorch/pytorch. Make sure the jupyter-python-notebook setting is enabled, then expand the dropdown menu above it. You will see quite a few Docker version tags. Select (THIS IS VERY IMPORTANT) 1.10.0-cuda11.3-cudnn8-runtime, then press "Select" at the bottom.
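If you want to double-check later (once the notebook from step 7 is open) that your instance really got this image, a quick version check along these lines works in any notebook cell. This is only a sketch - the expected values are simply what the tag implies:

```python
# Confirm the PyTorch/CUDA/cuDNN versions baked into the selected Docker image.
import torch

print("PyTorch:", torch.__version__)             # expect 1.10.0
print("CUDA:", torch.version.cuda)               # expect 11.3
print("cuDNN:", torch.backends.cudnn.version())  # expect an 8.x build, e.g. 8200
```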
  4. Before adding credit, consider your needs: how much and for how long will you use the service? Are you planning to upsample your songs or create long songs? Add roughly the amount of credit you deem necessary for your use case (a rough cost sketch follows step 5 below).
    • When signed in (or even when signed out!), you will see a list of GPU-accelerated machines on the "Create" tab. When a machine is greyed out, another person is currently using it, so it may become available in the future. To see the full list of available and unavailable machines, check "Include Unverified Machines" and "Include Incompatible Machines" in the "Filter offers" section. You can also sort by price.
    • Here is my list of recommended GPU configurations that should work with OpenAI Jukebox:
      • 1x V100 (32GB ver)
      • 1x Titan RTX
      • 1x RTX 3090
      • 1x RTX 3090 Ti
      • 1x RTX 4090
      • 1x Quadro RTX 6000
      • 1x Quadro RTX 8000
      • 1x RTX A4500
      • 1x RTX A5000 (desktop)
      • 1x RTX A5500
      • 1x RTX A6000
      • 1x A10
      • 1x A30
      • 1x A40
      • 1x A100
      • 1x H100
    • I do not recommend using GPUs with only 16GB of VRAM for OpenAI Jukebox; the cards above all have 24GB or more (a quick VRAM check is sketched after step 7). As for why I haven't included multi-GPU configurations: to my knowledge, Jukebox does not generate across multiple GPUs and would only utilize one of them.
    • Here is my list of minimum-spec (16GB) GPU configurations that should work with OpenAI Jukebox:
      • 1x RTX 4090 (mobile)
      • 1x RTX 4080
      • 1x RTX 3080 Ti (mobile)
      • 1x Quadro RTX 5000
      • 1x Tesla T4
      • 1x Titan V
      • 1x Tesla V100 (16GB ver)
      • 1x RTX A4000 (desktop)
      • 1x A2
  5. Now that you have everything set up, find an instance with a price and network speed that seem reasonable to you (you will want a good download speed, as the 10GB model download will likely take a while) and press "Rent".
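If you're unsure how much credit to add back in step 4, a back-of-the-envelope estimate helps. This is only a sketch - every figure below is a placeholder, so substitute the hourly price, storage rate, and download speed shown on the listing you actually pick:

```python
# Rough budgeting sketch - every number here is a placeholder, not a real vast.ai quote.
hourly_rate_usd = 0.50           # listed $/hr for the instance
hours_needed = 12                # how long you expect to spend generating/upsampling
storage_gb = 32                  # disk size chosen in step 2
storage_usd_per_gb_month = 0.10  # example storage rate; check the listing

compute_cost = hourly_rate_usd * hours_needed
storage_cost = storage_usd_per_gb_month * storage_gb * hours_needed / (24 * 30)
print(f"Compute: ~${compute_cost:.2f}, storage: ~${storage_cost:.2f}")

# The ~10GB model download also eats rented time (10 GB is roughly 80,000 megabits):
download_mbps = 500              # advertised download speed of the listing
print(f"Model download: ~{80000 / download_mbps / 60:.1f} minutes")
```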
  6. Go to the "Instances" tab and you'll see your machine being set up! It may take a few minutes to finish provisioning on its end, but eventually you'll be able to connect to it.
  7. A new tab should open in your browser (if it displays some sort of network error, try reloading - the instance may still be doing some last-minute setup) and you'll see a file browser. Ignore what's already in there and upload the .ipynb file linked below. (Or click this link to download it: right-click the page and choose "Save As" - it may try to append a .txt to the end of the filename, so delete that before or after saving.)
    • Note: you'll need to download this file to your local machine first, then press the "Upload" button on the file browser page. Next, press "Upload" next to the file itself and wait for it to finish. Once done, click the name of the newly-uploaded file and the notebook will open in a new tab. It should feel quite similar to Google Colab. (Further instructions are in the notebook itself.)
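Before running the notebook itself, it's worth confirming that the GPU you rented matches the VRAM recommendations from step 4. Here's a minimal sketch; the environment variable only matters if you somehow ended up on a multi-GPU host, since Jukebox will only use one GPU anyway:

```python
# Check the rented GPU's name and VRAM; pin to a single GPU on multi-GPU hosts.
import os
os.environ.setdefault("CUDA_VISIBLE_DEVICES", "0")  # must be set before the first CUDA call

import torch

assert torch.cuda.is_available(), "No CUDA device visible - check the instance and image"
props = torch.cuda.get_device_properties(0)
print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM")  # 24GB+ is the comfortable range
```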

Additional Notes

  • Don't close the notebook page or connection - the kernel will keep running if you do, but I haven't figured out a way to reliably reconnect to an existing notebook kernel, so you will probably need to keep it open for as long as you're using it, just like Colab.
  • If you want to run the notebook locally, you will need an Anaconda environment with Python 3.7 and the last version of PyTorch 1.
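If you do go the local route, a quick way to confirm your environment matches that note (a hypothetical check, assuming the conda environment and PyTorch are already installed):

```python
# Hypothetical check of a local environment before opening the notebook outside vast.ai.
import sys
import torch

assert sys.version_info[:2] == (3, 7), "Jukebox's dependencies expect Python 3.7"
assert torch.__version__.startswith("1."), "Use a 1.x release of PyTorch, not 2.x"
print("Environment looks OK: Python", sys.version.split()[0], "/ torch", torch.__version__)
```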