@ashmalvayani
Created April 18, 2024 17:27
"triu_tril_cuda_template" not implemented for 'BFloat16'
## One of the following two fixes worked for me; I'm not sure which, so I'm listing both.

Reinstalling flash-attn and forcing a build from source:
```shell
pip uninstall flash-attn
FLASH_ATTENTION_FORCE_BUILD=TRUE pip install flash-attn
```
## Upgrade the torch version to 2.1.0 with this command:
```shell
conda install pytorch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 pytorch-cuda=11.8 -c pytorch -c nvidia
```
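The error reportedly comes from older torch builds lacking a bfloat16 CUDA kernel for `triu`/`tril`, which is why upgrading helps. As a minimal sketch of how you might guard against it before loading a model, here is a hypothetical `version_at_least` helper (not part of torch) that compares dotted version strings numerically, ignoring local build suffixes like `+cu118`:

```python
def version_at_least(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically (e.g. '2.1.0' vs '2.0.1')."""
    def parse(v: str):
        # Keep only the leading numeric release segment (strip e.g. '+cu118')
        core = v.split("+")[0]
        return tuple(int(p) for p in core.split("."))
    return parse(installed) >= parse(required)

# Guard model loading on the torch version (uncomment in a real environment):
# import torch
# assert version_at_least(torch.__version__, "2.1.0"), "upgrade torch to >= 2.1.0"

print(version_at_least("2.0.1+cu117", "2.1.0"))  # → False
```

For production code, `packaging.version.parse` handles pre-release and local version segments more robustly than this sketch.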