Function calling in GPT-4-0613 and GPT-3.5-turbo-0613 enables developers to describe functions using JSON Schema. It intelligently invokes functions, retrieves structured data, and facilitates seamless integration with external tools and APIs. Natural language queries can be converted to API or database calls, and structured data can be extracte…

OpenAI API Function Calling Example

This script demonstrates the OpenAI API Function Calling feature. It shows how to write function descriptions, how to customize the calls for your own use case, and outlines several use cases for OpenAI API Function Calls.

Customizing the API

To customize the API for your specific use case, you can follow these steps:

  1. Copy main.py from this gist.
  2. Install the necessary dependencies (the script imports the openai and requests packages).
  3. Set up your API credentials: create an OpenAI API key and a WeatherAPI key, and set them as the OPENAI_KEY and WEATHER_API_KEY environment variables (a small sanity-check sketch follows this list).
  4. Customize the code: Modify the function descriptions, add your own functions, or adapt the existing functions to suit your needs.
  5. Run the application: Execute the script by running python main.py.
  6. Interact with the application: Follow the prompts and provide input to test the OpenAI API Function Calls.
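
As a quick sanity check before running, here is a minimal sketch (matching the OPENAI_KEY and WEATHER_API_KEY variable names used in main.py below) that fails fast if either key is missing:

# Fail fast if a required environment variable is missing
import os

for var in ("OPENAI_KEY", "WEATHER_API_KEY"):
    if not os.environ.get(var):
        raise SystemExit(f"Please set the {var} environment variable before running main.py")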

Feel free to explore and modify the code to fit your requirements!

Use Cases for OpenAI API Function Calls

OpenAI API Function Calls offer a range of use cases to enhance conversational AI. Some notable examples include:

Weather Information Retrieval

This is the default example provided by OpenAI, so I thought I'd use it. By utilizing OpenAI API Function Calls, you can create weather information retrieval systems. The model can call external weather APIs based on user inputs, such as location and temperature unit, and provide up-to-date weather details including temperature, description, and more.

Multi-Turn Dialogs

With OpenAI API Function Calls, you can create interactive multi-turn dialogs. The model can maintain context and carry out meaningful conversations with users. You can set up conversational flows where the model responds appropriately based on user inputs and past interactions.
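
As a rough illustration (a minimal sketch assuming the same openai 0.x ChatCompletion API and gpt-3.5-turbo-0613 model used in main.py below; the chat helper itself is hypothetical and not part of this gist), the key point is that the API is stateless, so your code keeps the running message history and re-sends it on every turn:

# Minimal multi-turn sketch: the conversation history lives in our code
# and the full list is re-sent to the API on every turn.
# Assumes openai.api_key has already been set, as in main.py.
import openai

messages = [{"role": "system", "content": "You are a helpful weather assistant."}]

def chat(user_input):
    # Append the new user turn to the running history
    messages.append({"role": "user", "content": user_input})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages
    )
    reply = response['choices'][0]['message']['content']
    # Store the assistant's reply so later turns can refer back to it
    messages.append({"role": "assistant", "content": reply})
    return reply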

Chain of Thought

OpenAI API Function Calls enable a chain of thought, allowing the model to reason and process complex sequences of instructions. Users can provide step-by-step guidance, and the model can intelligently follow and execute those instructions, providing accurate responses based on the context.
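
One way to approximate this with function calling is a simple loop that keeps resolving the model's function calls and feeding the results back until a plain answer comes out. This is only a sketch: it assumes the openai 0.x ChatCompletion API used in main.py below, and dispatch_function is a hypothetical dispatcher you would write to route each call to your own code (for example, a fetch-weather helper):

import json
import openai

# Sketch of a function-calling loop: keep resolving function calls and
# feeding the results back until the model returns a plain answer.
def run_with_functions(messages, functions):
    while True:
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=messages,
            functions=functions
        )
        message = response['choices'][0]['message']
        if 'function_call' not in message:
            # No further tool use requested; this is the final answer
            return message['content']
        # Record the model's function call, execute it ourselves, and
        # append the result as a "function" message for the next step
        name = message['function_call']['name']
        arguments = json.loads(message['function_call']['arguments'])
        result = dispatch_function(name, arguments)  # hypothetical dispatcher you implement
        messages.append({"role": "assistant", "content": None, "function_call": message['function_call']})
        messages.append({"role": "function", "name": name, "content": json.dumps(result)})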

Reasoning and Inference

OpenAI API Function Calls empower the model to perform reasoning and inference tasks. Users can pose questions or present scenarios, and the model can utilize external tools or APIs to gather relevant information, analyze data, and provide insightful responses.
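
For example (a hedged sketch: it reuses the function_description dictionary defined in main.py below, and convert_temperature is a hypothetical second function added purely for illustration), giving the model more than one function description lets it infer which tool a question actually needs:

# Hypothetical second tool alongside get_current_weather; with several
# function descriptions available, the model infers which one to call.
# Assumes openai.api_key and function_description are set up as in main.py.
import openai

convert_temperature_description = {
    "name": "convert_temperature",
    "description": "Convert a temperature between celsius and fahrenheit",
    "parameters": {
        "type": "object",
        "properties": {
            "value": {"type": "number", "description": "The temperature to convert"},
            "to_unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
        },
        "required": ["value", "to_unit"]
    }
}

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What is 75 degrees fahrenheit in celsius?"}],
    functions=[function_description, convert_temperature_description]
)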

These are just a few examples of the versatility and power of OpenAI API Function Calls. Explore the possibilities and leverage this feature to build intelligent conversational systems, interactive applications, and data-driven solutions.

Please note that the code in this repository is provided as an example and may require customization and adaptation to suit your specific use case.

# OpenAI AI Weather - API Function Calls
#  /\__/\   - main.py
# ( o.o )   - v0.0.1
#  > ^ <    - by @rUv
# Import necessary modules
import argparse
import openai
import requests
import json
import os
# Setup environment variables for OpenAI and WeatherAPI keys
openai.api_key = os.environ['OPENAI_KEY']
weather_api_key = os.environ['WEATHER_API_KEY']
# Create a new argument parser
parser = argparse.ArgumentParser(description="Get the current weather in a given location")
# Add the location argument to the parser. This is the location for which we want to fetch weather
parser.add_argument('--location', type=str, help="The city and state, e.g. San Francisco, CA")
# Parse the arguments passed to the script
args = parser.parse_args()
# Describe the 'get_current_weather' function in the function_description dictionary
# This dictionary specifies the function's name, description, parameters and their types
function_description = {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
                "type": "string",
                "enum": ["celsius", "fahrenheit"]
            }
        },
        "required": ["location"]
    }
}
# Define the function that fetches weather
def fetch_weather(location):
    # First, we call the OpenAI API with a chat model, the user's message, and our function description
    response_1 = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=[
            {"role": "user", "content": f"What is the weather like in {location}?"}
        ],
        functions=[function_description]
    )
    # The assistant's response includes a function call. We extract the arguments from this function call
    function_call = response_1['choices'][0]['message']['function_call']
    function_arguments = json.loads(function_call['arguments'])
    # We use the WeatherAPI to fetch the current weather for the location specified in the function call
    weather_response = requests.get(f"https://api.weatherapi.com/v1/current.json?key={weather_api_key}&q={function_arguments['location']}")
    # The response is a JSON object, which we parse to extract the weather data
    weather_data = weather_response.json()
    # We check if the 'unit' parameter was provided. If not, we default to 'celsius'
    unit = function_arguments.get('unit', 'celsius')
    # We extract the relevant weather details from the weather data
    weather_details = {
        "temperature": weather_data['current']['temp_c'] if unit == 'celsius' else weather_data['current']['temp_f'],
        "unit": unit,
        "description": weather_data['current']['condition']['text']
    }
    # We call the OpenAI API again, this time providing the assistant with the weather details
    response_2 = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=[
            {"role": "user", "content": f"What is the weather like in {location}?"},
            {"role": "assistant", "content": None, "function_call": {"name": "get_current_weather", "arguments": json.dumps(function_arguments)}},
            {"role": "function", "name": "get_current_weather", "content": json.dumps(weather_details)}
        ],
        functions=[function_description]
    )
    # Finally, we print the assistant's response
    print(response_2['choices'][0]['message']['content'])
# This function provides a user-friendly interface for using the weather app
def prompt_user():
    # Print welcome message and commands
    print("Welcome to the OpenAI Weather app!")
    print("Commands:")
    print("--location: Get the weather for a specific location")
    print("q: Quit the application")
    # If the user provided a location argument, fetch weather for that location
    if args.location:
        fetch_weather(args.location)
    # Main application loop
    while True:
        # Prompt user for location or quit command
        location = input("\nEnter the name of the city (or 'q' to quit): ")
        # If user enters 'q', break the loop and quit the app
        if location.lower() == 'q':
            break
        else:
            # Otherwise, fetch and print the weather for the entered location
            fetch_weather(location)

# Call the prompt_user function to start the application
prompt_user()
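
Assuming the OPENAI_KEY and WEATHER_API_KEY environment variables are set, the script can be started with a location argument, for example:

python main.py --location "San Francisco, CA"

Without --location, prompt_user() simply asks for a city name in a loop until 'q' is entered.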
@Fisseha-Estifanos

I see you have implemented the weather information retrieval system. How can we use ChatGPT for multi-turn dialog and chain of thought by holding on to (remembering) the context previously given to it?

@donvito

donvito commented Jun 21, 2023

I am interested too. @Fisseha-Estifanos, have you figured it out?

@Fisseha-Estifanos

@donvito, not really. I'm still looking into it; I was occupied with other tasks, so I have not given it much time. But when I come to think of it, I think we can just use the descriptions of our functions and the descriptions of our return parameters and make the model go through that chain-of-thought type of thinking before returning any response to us. But this is just my best guess, not that you asked me about it; it is not something I got from a verified source.

@Jaikant

Jaikant commented Jun 23, 2023

Yes, the only way right now is to orchestrate the function calling yourself, using state saved in code outside OpenAI.

@nileClearPath

Hey, commenting on this thread from the future. Not sure if you guys were able to figure it out, but you can use vector databases to build memory and have the model remember things over longer periods.
