Example curl requests for the OpenAI-compatible function calling API, using a local Ollama Docker container running the llama3.1 model.
Step 1:
Ask the LLM a question and provide a function it can call to gather the required information.
In this case, we want to know the current weather, so we include the get_current_weather
function definition in the request.
Note: If you use a local Ollama instance, you can omit the "Authorization" header.
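A request for Step 1 might look like the sketch below. The endpoint path follows Ollama's OpenAI-compatible API (`/v1/chat/completions` on the default port 11434); the question text and the parameter schema for get_current_weather are illustrative assumptions, so adjust them to your setup:

```shell
# Request payload: the user question plus the get_current_weather tool
# definition. The "parameters" JSON schema here is an assumed example.
PAYLOAD='{
  "model": "llama3.1",
  "messages": [
    {"role": "user", "content": "What is the weather like in Paris today?"}
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city, e.g. Paris"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
      }
    }
  ]
}'

# POST to the local Ollama container's OpenAI-compatible endpoint.
# The Authorization header can be dropped for a local instance; Ollama
# accepts any placeholder token.
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ollama" \
  -d "$PAYLOAD" || echo "request failed (is the Ollama container running?)"
```

If the model decides a call is needed, the response's message should contain a tool_calls entry naming get_current_weather with its JSON arguments instead of a plain-text answer.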