```shell
# install Ollama
brew install --cask ollama

# install the Continue.dev VS Code extension
code --install-extension Continue.continue

# general-purpose LLM
ollama run llama2

# coding LLM
ollama run deepseek-coder:6.7b

# autocomplete LLM
ollama run starcoder2:3b
```
- Select the Continue extension from the primary side bar
- Click the settings cog at the bottom right of the Continue panel
- Paste the `config.json` settings (cf. gist)
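The gist itself isn't reproduced here, but a minimal sketch of what the `config.json` might look like, assuming Continue's `models` / `tabAutocompleteModel` schema and the model tags pulled above (titles are arbitrary labels):

```json
{
  "models": [
    {
      "title": "Llama 2",
      "provider": "ollama",
      "model": "llama2"
    },
    {
      "title": "DeepSeek Coder 6.7B",
      "provider": "ollama",
      "model": "deepseek-coder:6.7b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2 3B",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
```

`models` populates the chat model dropdown, while `tabAutocompleteModel` drives inline completions; both point at the locally running Ollama server.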
"""
: multiline message/bye
: quit llm
Reference: [ollama/ollama](https://github.com/ollama/ollama) — Get up and running with Llama 2, Mistral, Gemma, and other large language models.