Get Flash Attention + Axolotl + Torch to work on CUDA 12.1


This sequence fixed it for me:

# Remove the mismatched pip-installed builds
pip uninstall -y torch flash-attn
# Reinstall PyTorch 2.1.2 built against CUDA 12.1 from the pytorch and nvidia channels
conda install pytorch==2.1.2 torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
# Rebuild Flash Attention from source against the newly installed torch
pip install -U git+https://github.com/Dao-AILab/flash-attention
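
A quick sanity check after the reinstall (a minimal sketch; it assumes the standard `torch` and `flash_attn` import names and that the packages landed in the active environment):

# verify_install.py -- confirm torch, CUDA, and flash-attn are consistent
import torch

print(torch.__version__)          # should report 2.1.2
print(torch.version.cuda)         # should report 12.1
print(torch.cuda.is_available())  # True if the GPU and driver are visible

import flash_attn                 # import only succeeds if flash-attn built against this torch
print(flash_attn.__version__)

If the import of flash_attn fails with an undefined-symbol error, the wheel was likely built against a different torch/CUDA combination, so repeat the steps above.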