
@BoQsc
Last active March 27, 2023 08:30
Open Source and free. Locally run the 7B "ChatGPT"-style model Alpaca-LoRA on your computer. A 7 billion parameter model on your laptop.

Locally run the 7B "ChatGPT" model named Alpaca-LoRA on your computer.

  1. Download the 7B Alpaca model file ggml-alpaca-7b-q4.bin.
  2. Download the client-side program for Windows, Linux, or Mac.
  3. Extract alpaca-win.zip.
  4. Copy the previously downloaded ggml-alpaca-7b-q4.bin file into the newly extracted alpaca-win folder.
  5. Open a command prompt and run chat.exe (a command-line sketch of steps 3-5 follows this list).
  6. Type the questions you would like to ask Alpaca.
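
A minimal command prompt sketch of steps 3-5 on Windows, assuming both downloads already sit in your Downloads folder and that the archive unpacks into an alpaca-win folder as described above (the paths are illustrative; adjust them to wherever your files actually are):

    rem Windows 10 and newer ship a tar.exe that can unpack .zip archives;
    rem otherwise extract alpaca-win.zip from File Explorer instead.
    cd %USERPROFILE%\Downloads
    tar -xf alpaca-win.zip

    rem Put the model next to chat.exe, then start an interactive session.
    copy ggml-alpaca-7b-q4.bin alpaca-win\
    cd alpaca-win
    chat.exe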

Customize: chat --threads 4 --n_predict 512 --ctx_size 2048
Help: run chat --help for more options.
A custom config to try on a system with 15 GB of RAM: chat -n 100000 -c 15000 -b 80000
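
A rough guide to what those flags control, based on the usual alpaca.cpp / llama.cpp option names (the exact set can vary by build, so confirm with chat --help):

    --threads N     (-t)  number of CPU threads used for inference
    --n_predict N   (-n)  maximum number of tokens to generate per reply
    --ctx_size N    (-c)  size of the prompt context window, in tokens
    --batch_size N  (-b)  batch size used when processing the prompt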

Note: The real ChatGPT is a 175B model.
Note: I have seen Alpaca models containing 30B or even 60B parameters.
Note: There was a recent announcement about a cleaner version of the Alpaca dataset, which might significantly reduce hallucinations.

Chatting on a Laptop: Legion Y520 | 15.6"

Demo recording: 2023-03-25.12-03-35.mp4



BoQsc commented Mar 27, 2023

The datasets used by the original Alpaca AI.
Also, a link to the original LLaMA research paper.

