This gist assumes that:
- you are already using OpenAI services via the OpenAI API
- you want to migrate to an Azure-based workflow
- somebody else has already set up the Azure endpoint for you
Your current setup probably looks something like this:

```python
from openai import OpenAI

client = OpenAI(
    # Defaults to os.environ.get("OPENAI_API_KEY")
)

chat_completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}],
)
```
You now just need to change the instantiation of the client to the following:

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=<AZURE_ENDPOINT>,
    api_key=<AZURE_KEY>,
    api_version=<API_VERSION>,
)
```
The AzureOpenAI client class can then be used in the same way as the regular OpenAI client class.
Tip
It's still a good idea to keep these credentials in a config file or environment variables rather than hard-coding them, so as to avoid accidentally leaking them (which would let others abuse the resource).
My recommended workflow:

- Create a folder `.cfg` in your home directory (`mkdir ~/.cfg`).
- Create a file `openai.cfg` in `~/.cfg` (`touch ~/.cfg/openai.cfg`).
- Copy the credentials I send to you into `openai.cfg` in the following format:

  ```ini
  [AZURE]
  key=...
  endpoint=...
  ```

  where `key` will be a string of letters and numbers and `endpoint` will be a URL.

You can now use the `configparser` module in Python to read these values into your script (see attached script below).
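For reference, reading the file with `configparser` might look like the sketch below — the attached script is the authoritative version; the path and section name here just match the workflow above:

```python
import configparser
from pathlib import Path

# Path to the credentials file created in the steps above.
cfg_path = Path.home() / ".cfg" / "openai.cfg"

config = configparser.ConfigParser()
found = config.read(cfg_path)  # returns the list of files it could actually read

if found:
    azure_key = config["AZURE"]["key"]
    azure_endpoint = config["AZURE"]["endpoint"]
```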