# Run Claude with the Power of Local LLMs Using Ollama

## Install Ollama

```sh
curl -fsSL https://ollama.com/install.sh | sh
```

## Pull the Model

```sh
ollama pull glm-4.7-flash   # or gpt-oss:20b (for better performance)
```
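After pulling, you can confirm the model is served correctly before wiring anything else up. A minimal sketch, assuming Ollama is running on its default port (11434) and you pulled `glm-4.7-flash`:

```sh
# Ask the local Ollama server for a one-off completion (non-streaming).
# Requires the ollama daemon to be running and the model already pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "glm-4.7-flash",
  "prompt": "Say hello in one word.",
  "stream": false
}'
```

If the model is available, the JSON response includes a `response` field with the generated text; a 404 here usually means the model name does not match what `ollama list` shows.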