@luiscape
Last active November 29, 2022 00:05
Download Stable Diffusion + `EulerAncestralDiscreteScheduler` and store pipeline
"""This script downloads all neural networks used in the HuggingFace `diffuser`'s
`StableDiffusionPipeline` pipeline. This also downloads the parameters from the scheduler
`EulerAncestralDiscreteScheduler` because that is a fast and effective scheduler.
This requires the env var `HUGGINGFACE_TOKEN` to be populated with a HuggingFace
access token.
The default cache location is: /vol/cache. This can be changed by populating
the environment variable `CACHE_PATH`.
All of that data is downloaded into a target cache directory.
Usage:
$ python download_weights.py
"""
import os
import torch
import diffusers
model_id = "runwayml/stable-diffusion-v1-5"
hugging_face_token = os.environ["HUGGINGFACE_TOKEN"]
cache_path = os.environ.get("CACHE_PATH", "/vol/cache")

def download_models():
    # Download the Euler Ancestral scheduler configuration. It is faster than the
    # default scheduler and produces high-quality output with fewer steps.
    euler = diffusers.EulerAncestralDiscreteScheduler.from_config(
        model_id,
        subfolder="scheduler",
        use_auth_token=hugging_face_token,
        cache_dir=cache_path,
    )
    euler.save_config(cache_path)

    # Download all remaining models in the pipeline (text encoder, UNet, VAE, etc.).
    pipe = diffusers.StableDiffusionPipeline.from_pretrained(
        model_id,
        use_auth_token=hugging_face_token,
        revision="fp16",
        torch_dtype=torch.float16,
        cache_dir=cache_path,
    )
    pipe.save_pretrained(cache_path)
if __name__ == "__main__":
download_models()
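
Once the weights are stored, an inference script can rebuild the pipeline straight from the cache without contacting the HuggingFace Hub again. Below is a minimal sketch, assuming the same `CACHE_PATH` default, a CUDA GPU for the fp16 weights, and the diffusers version this gist targets (where `from_config` accepts a directory saved via `save_config`); the `load_pipeline` helper and the example prompt are illustrative only.

import os

import torch
import diffusers

cache_path = os.environ.get("CACHE_PATH", "/vol/cache")


def load_pipeline() -> diffusers.StableDiffusionPipeline:
    # Rebuild the Euler Ancestral scheduler from the config saved by download_weights.py.
    euler = diffusers.EulerAncestralDiscreteScheduler.from_config(cache_path)
    # Load the cached pipeline weights; no HuggingFace token is needed for local files.
    pipe = diffusers.StableDiffusionPipeline.from_pretrained(
        cache_path,
        scheduler=euler,
        torch_dtype=torch.float16,
    )
    return pipe.to("cuda")


if __name__ == "__main__":
    pipe = load_pipeline()
    image = pipe("an astronaut riding a horse", num_inference_steps=20).images[0]
    image.save("output.png")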