@anhldbk
Last active April 7, 2024 00:11
LiteLLM: Bypass Cert Verifications

Overview

If you use LiteLLM to proxy requests to Ollama.ai in a corporate environment, you may encounter the following error in your Python application:

httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1006)

Root cause

The error is understandable: in such environments, corporate IT often performs man-in-the-middle TLS interception to monitor inbound and outbound network traffic. Intercepted connections are re-signed with an internal (self-signed) root CA that Python's default certificate store does not trust, so verification fails.

Troubleshooting

Disable certificate verification. If other results from Google haven't helped, this snippet may work:

import httpx
import openai

# LiteLLM proxy endpoint
api_base = "http://0.0.0.0:8000"

# Pass an httpx client with certificate verification disabled.
# Note: with openai>=1.0, http_client must be passed to the OpenAI
# constructor; assigning to openai.http_client has no effect.
client = openai.OpenAI(
    api_key="anything",  # LiteLLM does not validate this key
    base_url=api_base,
    http_client=httpx.Client(verify=False),
)

content = "why is the sky blue?"
response = client.chat.completions.create(
    model="ollama/codellama",
    messages=[{"role": "user", "content": content}],
    stream=True,
)

for chunk in response:
    print(chunk.choices[0].delta.content or "", end="")
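If disabling verification entirely is not acceptable, a safer option is to keep verification on and explicitly trust the corporate root CA instead. A minimal sketch using only the standard library, assuming a hypothetical bundle path `/etc/ssl/certs/corp-root-ca.pem` (adjust to wherever your IT department publishes the CA certificate); the resulting context can be passed to `httpx.Client(verify=ctx)` in place of `verify=False`:

```python
import os
import ssl

# Hypothetical path to the corporate root CA in PEM format --
# export it from your OS trust store or ask your IT department.
corp_ca = "/etc/ssl/certs/corp-root-ca.pem"

# Start from Python's default (verifying) context and add the
# corporate CA on top of the system trust store.
ctx = ssl.create_default_context()
if os.path.exists(corp_ca):
    ctx.load_verify_locations(cafile=corp_ca)

# Verification stays enabled: hostname checks and chain validation
# still run, but the corporate CA is now trusted.
print(ctx.verify_mode == ssl.CERT_REQUIRED)
```

This keeps TLS protections intact for every other destination while accepting the interception proxy's certificates.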