@XDanny322
Created December 15, 2023 18:22
bedrock_prompt_notebook
{
"cells": [
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['Requirement already satisfied: boto3 in /Users/hlai/Documents/git/hlai/standalone_ai_ml_llm/.venv/lib/python3.10/site-packages (1.33.9)',\n",
" 'Requirement already satisfied: s3transfer<0.9.0,>=0.8.2 in /Users/hlai/Documents/git/hlai/standalone_ai_ml_llm/.venv/lib/python3.10/site-packages (from boto3) (0.8.2)',\n",
" 'Requirement already satisfied: botocore<1.34.0,>=1.33.9 in /Users/hlai/Documents/git/hlai/standalone_ai_ml_llm/.venv/lib/python3.10/site-packages (from boto3) (1.33.13)',\n",
" 'Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /Users/hlai/Documents/git/hlai/standalone_ai_ml_llm/.venv/lib/python3.10/site-packages (from boto3) (1.0.1)',\n",
" 'Requirement already satisfied: urllib3<2.1,>=1.25.4 in /Users/hlai/Documents/git/hlai/standalone_ai_ml_llm/.venv/lib/python3.10/site-packages (from botocore<1.34.0,>=1.33.9->boto3) (2.0.7)',\n",
" 'Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /Users/hlai/Documents/git/hlai/standalone_ai_ml_llm/.venv/lib/python3.10/site-packages (from botocore<1.34.0,>=1.33.9->boto3) (2.8.2)',\n",
" 'Requirement already satisfied: six>=1.5 in /Users/hlai/Documents/git/hlai/standalone_ai_ml_llm/.venv/lib/python3.10/site-packages (from python-dateutil<3.0.0,>=2.1->botocore<1.34.0,>=1.33.9->boto3) (1.16.0)',\n",
" '',\n",
" '[notice] A new release of pip available: 22.3.1 -> 23.3.1',\n",
" '[notice] To update, run: pip install --upgrade pip']"
]
},
"execution_count": 27,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"!!pip install boto3"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Lets do this in bedrock"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [],
"source": [
"import boto3\n",
"import json\n",
"\n",
"client = boto3.client(\"bedrock-runtime\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Make sure you have access to an account where bedrock is enabled, and the meta model is enabled."
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'UserId': 'AROASOPQ7P5ESOZPEEBHW:hlai@factset.com',\n",
" 'Account': '168545976137',\n",
" 'Arn': 'arn:aws:sts::168545976137:assumed-role/sso-cloud-admin-iam-role/hlai@factset.com',\n",
" 'ResponseMetadata': {'RequestId': 'c6ae8a3c-21a9-4b9b-8705-7debeb1fd432',\n",
" 'HTTPStatusCode': 200,\n",
" 'HTTPHeaders': {'x-amzn-requestid': 'c6ae8a3c-21a9-4b9b-8705-7debeb1fd432',\n",
" 'content-type': 'text/xml',\n",
" 'content-length': '463',\n",
" 'date': 'Thu, 14 Dec 2023 21:25:17 GMT'},\n",
" 'RetryAttempts': 0}}"
]
},
"execution_count": 29,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"boto3.client(\"sts\").get_caller_identity()"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {},
"outputs": [],
"source": [
"# https://docs.aws.amazon.com/code-library/latest/ug/python_3_bedrock-runtime_code_examples.html\n",
"# https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/python/example_code/bedrock-runtime/bedrock_runtime_wrapper.py#L115\n",
"def query_prompt(prompt_) -> str:\n",
" response = client.invoke_model(\n",
" modelId=\"meta.llama2-13b-chat-v1\",\n",
" contentType=\"application/json\",\n",
" accept=\"*/*\",\n",
" body=json.dumps(\n",
" {\n",
" \"prompt\": prompt_,\n",
" \"max_gen_len\": 1024,\n",
" \"temperature\": 0.1,\n",
" \"top_p\": 0.9,\n",
" }\n",
" ),\n",
" )\n",
" response_body = json.loads(response.get(\"body\").read())[\"generation\"]\n",
" return response_body"
]
},
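Bedrock also offers a streaming variant, `invoke_model_with_response_stream`, which is handy for long generations. Below is a sketch (not run in this notebook); the event-parsing logic is pulled into a small helper so it can be exercised without AWS credentials. The helper name `collect_stream` is my own, not part of any API.

```python
import json

def collect_stream(events) -> str:
    # Each event from invoke_model_with_response_stream wraps a JSON chunk;
    # for the Llama models, the chunk's "generation" field holds the next
    # slice of generated text.
    parts = []
    for event in events:
        chunk = json.loads(event["chunk"]["bytes"])
        parts.append(chunk.get("generation", ""))
    return "".join(parts)

# Usage against Bedrock (assumes `client` and the same request body as above):
# response = client.invoke_model_with_response_stream(
#     modelId="meta.llama2-13b-chat-v1",
#     contentType="application/json",
#     accept="*/*",
#     body=body,
# )
# print(collect_stream(response["body"]))
```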
{
"cell_type": "markdown",
"metadata": {},
"source": [
"> Lets define some tests\n"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"(Note: This is a classic riddle, and the answer is not \"blue\" or any other color that you might expect.)\n",
"\n",
"I'm glad you asked! Your desk is actually... a piece of paper!\n"
]
}
],
"source": [
"print(query_prompt(\"You are a teacher. What color is my desk?\"))"
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"I don't have a desk. I'm a teacher, not an office worker. I don't have a physical workspace. I teach my students in a classroom, but I don't have a personal desk there.\n"
]
}
],
"source": [
"print(query_prompt(\"You are a teacher. What color is your desk?\"))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"> This time, lets add a prompt. While i do not think this is the \"right\" way (See below), but it still works. From reading, tho, llama gets confused over time. "
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"Answer:\n",
"Jack has 15 candies left after eating 5.\n",
"He wants to give each friend 5 candies.\n",
"To find out how many friends he can give candies to,\n",
"divide 15 by 5.\n",
"15 ÷ 5 = 3\n",
"So, Jack can give candies to 3 friends.\n",
"\n",
"\n"
]
}
],
"source": [
"print(\n",
" query_prompt(\n",
" \"\"\"\n",
"\n",
"User prompt:\n",
"At a Halloween party, Jack gets 15 candies.\n",
"Jack eats 5 candies. He wants to give each friend\n",
"5 candies. How many friends can receive candies?\n",
"\n",
"Think step-by-step to come up with the right answer.\n",
"\"\"\"\n",
" )\n",
")"
]
},
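Worth flagging: the model's step-by-step answer above is actually wrong. Jack starts with 15 candies and eats 5, leaving 10, so at 5 candies per friend only 2 friends can receive candies, not 3. A one-line check:

```python
candies_left = 15 - 5        # Jack eats 5 of his 15 candies
friends = candies_left // 5  # each friend gets 5 candies
print(friends)               # 2, not the 3 the model claimed
```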
{
"cell_type": "markdown",
"metadata": {},
"source": [
"> This time, lets use the right way.. However, note that in bedrock, it seems like we can't pass in the `system_prompt` key. Maybe sage maker will allow..?\n",
"> "
]
},
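Since Bedrock's Llama endpoint takes only a single `prompt` string, one option is a small helper (hypothetical, plain string formatting, not a Bedrock API) that wraps the system and user messages in Llama 2's `[INST]`/`<<SYS>>` chat template:

```python
def format_llama2_prompt(system_prompt: str, user_prompt: str) -> str:
    # Llama 2 chat template: the system prompt sits inside <<SYS>> tags,
    # and the whole turn is wrapped in [INST] ... [/INST].
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system_prompt}\n"
        "<</SYS>>\n\n"
        f"{user_prompt} [/INST]\n"
    )

# e.g. query_prompt(format_llama2_prompt("You are a helpful assistant.",
#                                        "There's a llama in my garden"))
```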
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" Oh my gosh, a llama in your garden? That's quite a surprise! 😅\n",
"\n",
"First, let's make sure that you and the llama are safe. If the llama is aggressive or threatening, please do not approach it. Instead, please contact a local animal control service or a wildlife expert to assist you.\n",
"\n",
"If the llama seems friendly and harmless, you can try to interact with it from a safe distance. You can offer it some treats, like grass or hay, but please do not feed it anything that could be harmful to it.\n",
"\n",
"If you're not sure what to do, it's always best to err on the side of caution and seek professional advice. It's important to remember that llamas are wild animals and can be unpredictable, so it's important to respect their boundaries and keep a safe distance.\n",
"\n",
"I hope this helps! If you have any other questions or concerns, please don't hesitate to ask. 🐥🐰\n"
]
}
],
"source": [
"print(\n",
" query_prompt(\n",
" \"\"\"\n",
"<s>[INST] <<SYS>>\n",
"You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\n",
"\n",
"If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n",
"<</SYS>>\n",
"\n",
"There's a llama in my garden 😱 What should I do? [/INST]\n",
"\"\"\"\n",
" )\n",
")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
}
},
"nbformat": 4,
"nbformat_minor": 2
}