@p1nox
Last active December 17, 2023 02:27
PyTorch on RTX 3060

After researching a lot on how to use PyTorch with an RTX 3060, especially with older versions of torch (0.4.0) and torchvision (0.2.1), I found it was either impossible or very hard to do. The RTX 3060 and these packages apparently aren't compatible with the same versions of CUDA and cuDNN. I tried different combinations of the compiled versions available in conda, but none of them worked; it might work if you recompile these versions from source.

After all this, I was eventually able to use the RTX 3060 effectively with the latest versions of all these dependencies, using either of two methods:

  1. Using a conda env, and latest versions published in pytorch site (https://pytorch.org/get-started/locally):
conda create -n rtx_3060 python=3.6.5
conda activate rtx_3060
conda install pytorch torchvision torchaudio cudatoolkit=11.1 -c pytorch -c nvidia
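Once the env is activated, a quick sanity check confirms whether torch can actually see the GPU. This is a minimal sketch; `cuda_summary` is just an illustrative helper name, and the exact device string depends on your driver:

```python
# Sanity check that PyTorch can see the RTX 3060.
# Guarded import so the script degrades gracefully if torch is missing.
try:
    import torch
except ImportError:
    torch = None

def cuda_summary():
    """Return a short status string describing CUDA availability."""
    if torch is None:
        return "no torch"
    if not torch.cuda.is_available():
        return "cuda unavailable"
    # e.g. "cuda ok: NVIDIA GeForce RTX 3060"
    return f"cuda ok: {torch.cuda.get_device_name(0)}"

print(cuda_summary())
```

If this prints "cuda unavailable", the usual suspects are a driver/CUDA mismatch or secure boot blocking the NVIDIA kernel module (see the comments below).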
  2. Using lambda-stack (https://lambdalabs.com/blog/install-tensorflow-and-pytorch-on-rtx-30-series/):
LAMBDA_REPO=$(mktemp) && \
wget -O${LAMBDA_REPO} https://lambdalabs.com/static/misc/lambda-stack-repo.deb && \
sudo dpkg -i ${LAMBDA_REPO} && rm -f ${LAMBDA_REPO} && \
sudo apt-get update && sudo apt-get install -y lambda-stack-cuda

Reboot your machine.

LAMBDA_REPO=$(mktemp) && \
wget -O${LAMBDA_REPO} https://lambdalabs.com/static/misc/lambda-stack-repo.deb && \
sudo dpkg -i ${LAMBDA_REPO} && rm -f ${LAMBDA_REPO} && \
sudo apt-get update && \
sudo apt-get --yes upgrade && \
sudo apt-get install --yes --no-install-recommends lambda-server && \
sudo apt-get install --yes --no-install-recommends nvidia-headless-455 && \
sudo apt-get install --yes --no-install-recommends lambda-stack-cuda

Reboot your machine.

*** The good thing about this method is that, if your current global environment is messed up, lambda-stack will actually fix it.
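Whichever method you use, it can help to check which CUDA and cuDNN builds your installed PyTorch was compiled against. A hedged sketch (`build_info` is an illustrative helper name, not part of any API):

```python
# Report the CUDA / cuDNN builds PyTorch ships with.
# Guarded import so the script also runs where torch is absent.
try:
    import torch
except ImportError:
    torch = None

def build_info():
    """Return (cuda_build, cudnn_build) as strings, or ("missing", "missing")."""
    if torch is None:
        return ("missing", "missing")
    cuda = torch.version.cuda or "cpu-only build"
    cudnn = str(torch.backends.cudnn.version() or "cpu-only build")
    return (cuda, cudnn)

cuda, cudnn = build_info()
print(f"CUDA build: {cuda}, cuDNN build: {cudnn}")
```

These are the versions PyTorch was built against, which need not match the system-wide CUDA toolkit reported by `nvidia-smi`.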


@bensonmunyan

Hello! I'm currently trying to get this to work with Pytorch 1.7.0 and CUDA 10.2, but am not able to verify CUDA availability with torch. Any suggestions?

@vdkhvb

vdkhvb commented Mar 19, 2022

Oof, after lots of re-installations of the driver and disabling secure boot I finally got this thing to work. Eventually ended up using Ubuntu's 'additional drivers' tool to install driver 510.54 / CUDA version 11.6. With version 11.4 I didn't manage to get it to work for some reason. Thanks for the script, I also found this one helpful for tracking version numbers of the installed libraries & drivers: https://raw.githubusercontent.com/pytorch/pytorch/master/torch/utils/collect_env.py
