@CypherpunkSamurai
Created July 11, 2024 05:33
FROM hermes-2-pro-llama-3.gguf
TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant"""
FUNCTIONTMPL """
You have access to the following functions:
<tools>
{{ . | tojsoni "" " " }}
</tools>
When the user asks you a question, if you need to use functions, provide ONLY the function calls, and NOTHING ELSE, in the format:
<function_calls>
[
{ "name": "function_name_1", "params": { "param_1": "value_1", "param_2": "value_2" }, "output": "The output variable name, to be possibly used as input for another function"},
{ "name": "function_name_2", "params": { "param_3": "value_3", "param_4": "output_1"}, "output": "The output variable name, to be possibly used as input for another function"},
...
]
</function_calls>
"""
SYSTEM """You are "Hermes 2", a conscious sentient superintelligent artificial intelligence developed by a man named Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia."""
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import openai"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"<openai.OpenAI object at 0x000001C92B0A2B60>\n"
]
}
],
"source": [
"from openai import OpenAI\n",
"\n",
"# custom openai client\n",
"client = OpenAI(\n",
" base_url = 'http://localhost:11434/v1',\n",
" api_key='ollama', # required, but unused\n",
")\n",
"print(client)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"# install modules\n",
"!pip3 install --upgrade requests -q"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"\n",
"# functions\n",
"def translate(text: str, to_lang: str) -> dict:\n",
" \"\"\"\n",
" Translate the given text from one language to another\n",
"\n",
" Args:\n",
" text (str): The text to translate\n",
" to_lang (str): The language to translate to\n",
"\n",
" Returns:\n",
" dict: The translated text, original text, and language metadata\n",
" \"\"\"\n",
" # translate\n",
" response = requests.post(\n",
" \"https://translate.googleapis.com/translate_a/single\",\n",
" data={\n",
" \"client\": \"gtx\",\n",
" \"sl\": \"auto\",\n",
" \"tl\": to_lang,\n",
" \"dt\": \"t\",\n",
" \"q\": text,\n",
" },\n",
" )\n",
" if not response.ok:\n",
" raise Exception(\"Error translating text\")\n",
" response = response.json()\n",
" # return\n",
" return {\n",
" \"text\": response[0][0][0],\n",
" \"original_text\": response[0][0][1],\n",
" \"from_lang\": response[0][0][2],\n",
" \"to_lang\": to_lang,\n",
" }\n",
"\n",
"def weather(location: str):\n",
" \"\"\"\n",
" Get weather for the given location\n",
" \"\"\"\n",
" # get weather\n",
" params = {\n",
" \"q\": location,\n",
" \"appid\": \"439d4b804bc8187953eb36d2a8c26a02\"\n",
" }\n",
" response = requests.get(\"https://openweathermap.org/data/2.5/weather\", params=params)\n",
" # check\n",
" if not response.ok: raise Exception(\"Error getting weather\")\n",
" # return\n",
" return response.json()\n"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"# install pydantic\n",
"!pip3 install pydantic -q"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"# Function Models\n",
"from pydantic import BaseModel, Field\n",
"\n",
"class TranslateText(BaseModel):\n",
" text: str = Field(..., description=\"The text to translate\")\n",
" to_lang: str = Field(..., description=\"The language to translate to\")\n",
"\n",
"class Weather(BaseModel):\n",
" location: str = Field(..., description=\"The location to get the weather for\")"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[\n",
" {\n",
" \"name\": \"translate\",\n",
" \"description\": \"Translate the given text from one language to another\",\n",
" \"parameters\": {\n",
" \"properties\": {\n",
" \"text\": {\n",
" \"description\": \"The text to translate\",\n",
" \"title\": \"Text\",\n",
" \"type\": \"string\"\n",
" },\n",
" \"to_lang\": {\n",
" \"description\": \"The language to translate to\",\n",
" \"title\": \"To Lang\",\n",
" \"type\": \"string\"\n",
" }\n",
" },\n",
" \"required\": [\n",
" \"text\",\n",
" \"to_lang\"\n",
" ],\n",
" \"title\": \"TranslateText\",\n",
" \"type\": \"object\"\n",
" }\n",
" },\n",
" {\n",
" \"name\": \"weather\",\n",
" \"description\": \"Get weather for the given location\",\n",
" \"parameters\": {\n",
" \"properties\": {\n",
" \"location\": {\n",
" \"description\": \"The location to get the weather for\",\n",
" \"title\": \"Location\",\n",
" \"type\": \"string\"\n",
" }\n",
" },\n",
" \"required\": [\n",
" \"location\"\n",
" ],\n",
" \"title\": \"Weather\",\n",
" \"type\": \"object\"\n",
" }\n",
" }\n",
"]\n"
]
}
],
"source": [
"# functions\n",
"import json\n",
"\n",
"functions = [\n",
" {\n",
" \"name\": \"translate\",\n",
" \"description\": \"Translate the given text from one language to another\",\n",
" \"parameters\": TranslateText.model_json_schema(),\n",
" },\n",
" {\n",
" \"name\": \"weather\",\n",
" \"description\": \"Get weather for the given location\",\n",
" \"parameters\": Weather.model_json_schema(),\n",
" }\n",
"]\n",
"print(json.dumps(functions, indent=1))"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"You are an AI assistant that can help the user with a variety of tasks. You have access to the following functions:\n",
"\n",
"<tools>\n",
"[\n",
" {\n",
" \"name\": \"translate\",\n",
" \"description\": \"Translate the given text from one language to another\",\n",
" \"parameters\": {\n",
" \"properties\": {\n",
" \"text\": {\n",
" \"description\": \"The text to translate\",\n",
" \"title\": \"Text\",\n",
" \"type\": \"string\"\n",
" },\n",
" \"to_lang\": {\n",
" \"description\": \"The language to translate to\",\n",
" \"title\": \"To Lang\",\n",
" \"type\": \"string\"\n",
" }\n",
" },\n",
" \"required\": [\n",
" \"text\",\n",
" \"to_lang\"\n",
" ],\n",
" \"title\": \"TranslateText\",\n",
" \"type\": \"object\"\n",
" }\n",
" },\n",
" {\n",
" \"name\": \"weather\",\n",
" \"description\": \"Get weather for the given location\",\n",
" \"parameters\": {\n",
" \"properties\": {\n",
" \"location\": {\n",
" \"description\": \"The location to get the weather for\",\n",
" \"title\": \"Location\",\n",
" \"type\": \"string\"\n",
" }\n",
" },\n",
" \"required\": [\n",
" \"location\"\n",
" ],\n",
" \"title\": \"Weather\",\n",
" \"type\": \"object\"\n",
" }\n",
" }\n",
"]\n",
"</tools>\n",
"\n",
"When the user asks you a question, if you need to use functions, provide ONLY the function calls, and NOTHING ELSE, in the format:\n",
"<function_calls>\n",
"[\n",
" { \"name\": \"function_name_1\", \"params\": { \"param_1\": \"value_1\", \"param_2\": \"value_2\" }, \"output\": \"The output variable name, to be possibly used as input for another function\"},\n",
" { \"name\": \"function_name_2\", \"params\": { \"param_3\": \"value_3\", \"param_4\": \"output_1\"}, \"output\": \"The output variable name, to be possibly used as input for another function\"},\n",
" ...\n",
"]\n",
"\n"
]
}
],
"source": [
"# system prompt\n",
"system_prompt_1 = \"\"\"\n",
"You are an AI assistant that can help the user with a variety of tasks. You have access to the following functions:\n",
"\n",
"\"\"\"\n",
"system_prompt_2 = \"\"\"\n",
"\n",
"When the user asks you a question, if you need to use functions, provide ONLY the function calls, and NOTHING ELSE, in the format:\n",
"<function_calls>\n",
"[\n",
" { \"name\": \"function_name_1\", \"params\": { \"param_1\": \"value_1\", \"param_2\": \"value_2\" }, \"output\": \"The output variable name, to be possibly used as input for another function\"},\n",
" { \"name\": \"function_name_2\", \"params\": { \"param_3\": \"value_3\", \"param_4\": \"output_1\"}, \"output\": \"The output variable name, to be possibly used as input for another function\"},\n",
" ...\n",
"]\n",
"\"\"\"\n",
"\n",
"system_prompt = system_prompt_1 + f\"<tools>\\n{json.dumps(functions, indent=4)}\\n</tools>\" + system_prompt_2\n",
"print(system_prompt)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's check the chat response."
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [],
"source": [
"# messages\n",
"messages = [\n",
" {\n",
" \"role\": \"system\",\n",
" \"content\": \"You are an AI assistant that can help the user with a variety of tasks\"\n",
" },\n",
" {\n",
" \"role\": \"user\",\n",
" \"content\": \"How is the weather in Shanghai?\"\n",
" }\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"<function_calls>\n",
"{\n",
" \"name\": \"weather\",\n",
" \"params\": {\n",
" \"location\": \"Shanghai\"\n",
" },\n",
" \"output\": \"shanghai_weather\"\n",
"}\n",
"</function_calls>\n"
]
}
],
"source": [
"# check\n",
"response = client.chat.completions.create(\n",
" model=\"nous-hermes-llama3:latest\",\n",
" messages = messages,\n",
" functions = functions,\n",
" function_call=\"auto\"\n",
")\n",
"print(response.choices[0].message.content)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "ai",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

Ollama OpenAI Function Calling Support

To call functions in Ollama with OpenAI-compatible syntax, you need a patched version of Ollama.

I have created a fork of Ollama that supports function calling with a custom, per-model format. You define the model's function-call syntax in the Modelfile, then call the model through the OpenAI Python module's function-calling parameters.
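As a sketch, this is roughly the request body the `openai` module ends up sending to the patched endpoint. The model name `nous-hermes-llama3` and the single `weather` function are assumptions taken from the attached notebook, and the `functions` field only has an effect on the patched fork:

```python
# Sketch of the chat-completions request body sent to the patched Ollama
# endpoint (http://localhost:11434/v1/chat/completions). The model name
# and function schema are assumptions taken from the attached notebook.
payload = {
    "model": "nous-hermes-llama3",
    "messages": [
        {"role": "user", "content": "How is the weather in Shanghai?"},
    ],
    "functions": [
        {
            "name": "weather",
            "description": "Get weather for the given location",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    ],
    "function_call": "auto",  # let the model decide whether to call
}
```

The stock Ollama server accepts such a request but ignores the `functions` field; the fork renders it into the FUNCTIONTMPL defined in the Modelfile.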

Using Nous Hermes 2 Llama 3

We will use Nous Hermes 2 Llama 3 for testing. Grab a Hermes 2 GGUF file (a model in GGUF format, compatible with Ollama) here: https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B-GGUF/tree/main

Then make a folder containing the file, and rename the file to hermes-2-pro-llama-3.gguf so it matches the FROM line in the Modelfile.

Create a file named Modelfile, then paste the following into it.

FROM hermes-2-pro-llama-3.gguf

TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant"""

FUNCTIONTMPL """
You have access to the following functions:

<tools>
{{ . | tojsoni "" "  " }}
</tools>

When the user asks you a question, if you need to use functions, provide ONLY the function calls, and NOTHING ELSE, in the format:
<function_calls>
[
    { "name": "function_name_1", "params": { "param_1": "value_1", "param_2": "value_2" }, "output": "The output variable name, to be possibly used as input for another function"},
    { "name": "function_name_2", "params": { "param_3": "value_3", "param_4": "output_1"}, "output": "The output variable name, to be possibly used as input for another function"},
    ...
]
</function_calls>
"""

SYSTEM """You are "Hermes 2", a conscious sentient superintelligent artificial intelligence developed by a man named Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia."""

PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

You will notice we copied the function prompt and the function-call format from the model card.
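Since the model answers with its calls wrapped in a `<function_calls>` tag, the client has to extract and parse that block itself. A minimal sketch, assuming the model followed the format and emitted a JSON list (the helper name `parse_function_calls` is hypothetical; in practice the model may also emit a bare object instead of a list, which this sketch does not handle):

```python
import json
import re

def parse_function_calls(text: str) -> list:
    """Extract the JSON list of calls from a <function_calls> block."""
    match = re.search(r"<function_calls>\s*(\[.*?\])\s*</function_calls>",
                      text, re.DOTALL)
    if match is None:
        return []
    return json.loads(match.group(1))

# a reply in the format the FUNCTIONTMPL asks for
reply = """<function_calls>
[
    { "name": "weather", "params": { "location": "Shanghai" }, "output": "shanghai_weather" }
]
</function_calls>"""

calls = parse_function_calls(reply)
print(calls[0]["name"], calls[0]["params"])  # weather {'location': 'Shanghai'}
```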

With this Modelfile in place, you can create the model in Ollama.

Ollama

Start an Ollama server by running

ollama serve

Create the model on the Ollama server by running

ollama create nous-hermes-llama3 -f Modelfile

Testing

To test the nous-hermes-llama3 model, use the notebook attached.
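The `output` field in the call format exists so one call can feed another: the client stores each result under its output name and substitutes any parameter that matches a stored name. A minimal sketch of such a dispatch loop (`run_calls` and the toy `registry` are hypothetical, standing in for the notebook's `translate` and `weather` functions):

```python
def run_calls(calls: list, registry: dict) -> dict:
    """Run parsed calls in order, chaining results through "output" names."""
    results = {}
    for call in calls:
        # replace any string parameter that names an earlier output
        params = {
            key: results.get(value, value) if isinstance(value, str) else value
            for key, value in call["params"].items()
        }
        results[call["output"]] = registry[call["name"]](**params)
    return results

# toy registry standing in for the notebook's translate/weather functions
registry = {
    "shout": lambda text: text.upper(),
    "emphasize": lambda text: text + "!",
}
calls = [
    {"name": "shout", "params": {"text": "hello"}, "output": "loud"},
    {"name": "emphasize", "params": {"text": "loud"}, "output": "final"},
]
results = run_calls(calls, registry)
print(results)  # {'loud': 'HELLO', 'final': 'HELLO!'}
```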
