Forked from mberman84/gist:9e008131d96af27256cc9cb53ad834cf
Last active: November 28, 2023 02:00
CodeLLaMA Installation - Windows & WSL2
#
# I have successfully run this with Anaconda on Windows directly & on WSL2/Ubuntu 22.04 with Miniconda
#
# Make sure you have Anaconda installed
# This tutorial assumes you have an Nvidia GPU; you can find the non-GPU version on the Textgen WebUI GitHub
# More information here: https://github.com/oobabooga/text-generation-webui
# Make sure you have CUDA 11.8 and a cuDNN build for CUDA 11.x installed
# CUDA Toolkit 11.8: https://developer.nvidia.com/cuda-11-8-0-download-archive
# cuDNN for CUDA 11.x: https://developer.nvidia.com/rdp/cudnn-download
# Create a conda environment
conda create -n tg python=3.10.9
conda activate tg
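Before installing anything into the new environment, it is worth a quick sanity check that `conda activate` actually switched interpreters:

```shell
# Confirm the 'tg' environment is active and uses the pinned interpreter
python --version    # should report Python 3.10.9 inside the tg environment
```

If this reports a different version, the environment was not activated (a common cause is a shell that has not run `conda init` yet).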
# Install PyTorch with GPU support
conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia

# Install text-generation-webui
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
python -m pip install -r requirements.txt

# Check to make sure CUDA is enabled
python -m torch.utils.collect_env
# To test in the interpreter
$ python
>>> import torch
>>> torch.cuda.is_available()
>>> torch.zeros(1).cuda()
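The interpreter session above can also be run as a one-shot command, which is handy for scripting the check. This sketch guards the import so it reports a missing install instead of crashing:

```shell
# One-shot version of the interpreter check: prints True if PyTorch can see the GPU
python -c "
try:
    import torch
    print('torch version:', torch.__version__)
    print('CUDA available:', torch.cuda.is_available())
except ImportError:
    print('PyTorch is not installed in this environment')
"
```

If `CUDA available: False` is printed, recheck the CUDA Toolkit install and that the `pytorch-cuda=11.8` package (not the CPU-only build) was installed.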
# Start a local server, then open the 'Running on local URL: http://127.0.0.1:xxxx' link
python server.py

# After the page comes up, click on Model in the top nav
# Download this model: https://huggingface.co/TheBloke/WizardCoder-Python-7B-V1.0-GPTQ
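As an alternative to the web UI's Model tab, text-generation-webui includes a `download-model.py` helper that fetches a model by its Hugging Face repo id (assuming a current checkout; the script's options may vary between versions):

```shell
# From inside the text-generation-webui directory, download the model from the CLI
python download-model.py TheBloke/WizardCoder-Python-7B-V1.0-GPTQ
```

The files land under the repo's models directory, after which the model can be selected from the Model tab.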