@ruvnet
Last active April 9, 2024 23:05
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/ruvnet/c9331b8ce93c4fcf1bbeec864b51a938/notebook.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"id": "0c3fad6a",
"metadata": {
"id": "0c3fad6a"
},
"source": [
"# Google Generative AI Python Client Tutorial\n",
"\n",
"This Jupyter notebook provides a step-by-step guide to using the Google Generative AI Python client to access powerful generative AI models. You'll learn how to install the client, configure authentication, make API requests to generate text and images, and apply best practices for prompt engineering.\n",
"\n",
"## Introduction\n",
"\n",
"Google's Generative AI platform allows developers to easily integrate state-of-the-art generative models into their applications. With the Python client, you can access these capabilities with just a few lines of code. Some key benefits:\n",
"\n",
"- Access a variety of pre-trained models for text, image, and audio generation\n",
"- Fine-tune models on your own data for custom applications\n",
"- Scale seamlessly with Google's infrastructure\n",
"- Responsible AI tools for safety and transparency\n",
"\n",
"Whether you're building a text summarization tool, an image editing app, or an AI assistant, the Google Generative AI client makes it easy to leverage the power of generative AI. Let's get started!\n",
"\n",
"## Setup and Installation\n",
"\n",
"### Prerequisites\n",
"\n",
"Before you can use the Google Generative AI client, you'll need to:\n",
"\n",
"1. Set up a Google Cloud project and enable the Vertex AI API\n",
"2. Install the Google Cloud SDK and authenticate your account\n",
"3. Create a service account with the necessary permissions\n",
"\n",
"Refer to the [Vertex AI documentation](https://cloud.google.com/vertex-ai/docs/start/client-libraries) for detailed instructions.\n",
"\n",
"### Installing the Client Library\n",
"\n",
"Once your Google Cloud environment is set up, install the Generative AI client library with pip:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "d60d937b",
"metadata": {
"id": "d60d937b",
"outputId": "6c674fd3-b7fb-49f4-cd69-4f364c4a8322",
"colab": {
"base_uri": "https://localhost:8080/"
}
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Requirement already satisfied: google-generativeai in /usr/local/lib/python3.10/dist-packages (0.3.2)\n",
"Requirement already satisfied: google-ai-generativelanguage==0.4.0 in /usr/local/lib/python3.10/dist-packages (from google-generativeai) (0.4.0)\n",
"Requirement already satisfied: google-auth in /usr/local/lib/python3.10/dist-packages (from google-generativeai) (2.27.0)\n",
"Requirement already satisfied: google-api-core in /usr/local/lib/python3.10/dist-packages (from google-generativeai) (2.11.1)\n",
"Requirement already satisfied: typing-extensions in /usr/local/lib/python3.10/dist-packages (from google-generativeai) (4.10.0)\n",
"Requirement already satisfied: protobuf in /usr/local/lib/python3.10/dist-packages (from google-generativeai) (3.20.3)\n",
"Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from google-generativeai) (4.66.2)\n",
"Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in /usr/local/lib/python3.10/dist-packages (from google-ai-generativelanguage==0.4.0->google-generativeai) (1.23.0)\n",
"Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in /usr/local/lib/python3.10/dist-packages (from google-api-core->google-generativeai) (1.63.0)\n",
"Requirement already satisfied: requests<3.0.0.dev0,>=2.18.0 in /usr/local/lib/python3.10/dist-packages (from google-api-core->google-generativeai) (2.31.0)\n",
"Requirement already satisfied: cachetools<6.0,>=2.0.0 in /usr/local/lib/python3.10/dist-packages (from google-auth->google-generativeai) (5.3.3)\n",
"Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.10/dist-packages (from google-auth->google-generativeai) (0.4.0)\n",
"Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.10/dist-packages (from google-auth->google-generativeai) (4.9)\n",
"Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in /usr/local/lib/python3.10/dist-packages (from google-api-core->google-generativeai) (1.62.1)\n",
"Requirement already satisfied: grpcio-status<2.0.dev0,>=1.33.2 in /usr/local/lib/python3.10/dist-packages (from google-api-core->google-generativeai) (1.48.2)\n",
"Requirement already satisfied: pyasn1<0.7.0,>=0.4.6 in /usr/local/lib/python3.10/dist-packages (from pyasn1-modules>=0.2.1->google-auth->google-generativeai) (0.6.0)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai) (3.3.2)\n",
"Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai) (3.6)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai) (2.0.7)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0.dev0,>=2.18.0->google-api-core->google-generativeai) (2024.2.2)\n"
]
}
],
"source": [
"!pip install google-generativeai"
]
},
{
"cell_type": "markdown",
"source": [
"## 🦄 Install LiteLLM for generative AI superpowers"
],
"metadata": {
"id": "zRam5KvwDArm"
},
"id": "zRam5KvwDArm"
},
{
"cell_type": "code",
"source": [
"import os\n",
"import pkg_resources\n",
"\n",
"def is_package_installed(package_name):\n",
"    try:\n",
"        pkg_resources.get_distribution(package_name)\n",
"        return True\n",
"    except pkg_resources.DistributionNotFound:\n",
"        return False\n",
"\n",
"def install_package(package_name):\n",
"    os.system(f\"pip install {package_name}\")\n",
"\n",
"# Check if litellm is installed\n",
"if not is_package_installed(\"litellm\"):\n",
"    print(\"litellm package not found. Installing...\")\n",
"    install_package(\"litellm\")\n",
"else:\n",
"    print(\"litellm package is already installed.\")\n",
"\n",
"# Import litellm after ensuring it's installed\n",
"from litellm import completion\n",
"\n",
"# litellm's completion() interface is now available\n",
"print(\"All required packages are installed and ready to use!\")"
],
"metadata": {
"id": "wZwShGfnCZcf",
"outputId": "e2b5caa0-be9a-4f78-fe31-33a01fcb70d7",
"colab": {
"base_uri": "https://localhost:8080/"
}
},
"id": "wZwShGfnCZcf",
"execution_count": 4,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"litellm package not found. Installing...\n",
"All required packages are installed and ready to use!\n"
]
}
]
},
{
"cell_type": "markdown",
"source": [
"## Set Gemini API Key"
],
"metadata": {
"id": "YGnc3g7dC7GV"
},
"id": "YGnc3g7dC7GV"
},
{
"cell_type": "code",
"execution_count": 7,
"id": "675bbf17",
"metadata": {
"id": "675bbf17",
"outputId": "939899fa-e717-496e-e6cd-769ac9a93355",
"colab": {
"base_uri": "https://localhost:8080/"
}
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"✅ GEMINI_API_KEY is set as an environment variable. Ready to proceed!\n"
]
}
],
"source": [
"import os\n",
"from google.colab import userdata\n",
"\n",
"# Read the Gemini API key stored in Colab secrets\n",
"api_key = userdata.get('GEMINI_API_KEY')\n",
"\n",
"# Set the environment variable\n",
"os.environ['GEMINI_API_KEY'] = api_key\n",
"\n",
"# Verify the API key is set\n",
"if 'GEMINI_API_KEY' in os.environ:\n",
"    print(\"✅ GEMINI_API_KEY is set as an environment variable. Ready to proceed!\")\n",
"else:\n",
"    print(\"⚠️ GEMINI_API_KEY is not set as an environment variable. Please check your setup.\")"
]
},
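{
"cell_type": "markdown",
"id": "litellm-usage-md",
"metadata": {
"id": "litellm-usage-md"
},
"source": [
"### Using LiteLLM with Gemini\n",
"\n",
"With `GEMINI_API_KEY` set, the `litellm` package installed earlier can call Gemini through its unified `completion` interface. A minimal sketch: litellm reads the key from the environment and addresses Gemini models with the `gemini/` model prefix."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "litellm-usage-code",
"metadata": {
"id": "litellm-usage-code"
},
"outputs": [],
"source": [
"from litellm import completion\n",
"\n",
"# litellm picks up GEMINI_API_KEY from the environment set above\n",
"response = completion(\n",
"    model='gemini/gemini-pro',\n",
"    messages=[{'role': 'user', 'content': 'Say hello in one short sentence.'}]\n",
")\n",
"\n",
"# litellm returns an OpenAI-style response object\n",
"print(response.choices[0].message.content)"
]
},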
{
"cell_type": "markdown",
"id": "1d4102b5",
"metadata": {
"id": "1d4102b5"
},
"source": [
"## Text Generation\n",
"\n",
"### Instantiating the Model\n",
"\n",
"To generate text, we'll use the `genai.GenerativeModel` class. First import the library and set your API key:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "e8e7049f",
"metadata": {
"id": "e8e7049f"
},
"outputs": [],
"source": [
"import google.generativeai as genai\n",
"import os\n",
"\n",
"genai.configure(api_key=os.environ['GEMINI_API_KEY'])"
]
},
{
"cell_type": "markdown",
"source": [
"## Choose a Model"
],
"metadata": {
"id": "yLlefOVHEzT5"
},
"id": "yLlefOVHEzT5"
},
{
"cell_type": "code",
"source": [
"# gradio is not used below, but is handy if you later want to wrap a model in a demo UI\n",
"!pip install gradio"
],
"metadata": {
"id": "ZR2ubbfWFuXt"
},
"id": "ZR2ubbfWFuXt",
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"import google.generativeai as genai\n",
"import pprint\n",
"\n",
"for model in genai.list_models():\n",
"    pprint.pprint(model)"
],
"metadata": {
"id": "ewYdXZ2AEzE0"
},
"id": "ewYdXZ2AEzE0",
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"id": "a2a3ee89",
"metadata": {
"id": "a2a3ee89"
},
"source": [
"Now we can instantiate a model. Let's use the `gemini-pro` model, which is well suited to open-ended text generation:"
]
},
{
"cell_type": "code",
"execution_count": 29,
"id": "aba23c3b",
"metadata": {
"id": "aba23c3b"
},
"outputs": [],
"source": [
"model = genai.GenerativeModel('gemini-pro')"
]
},
{
"cell_type": "markdown",
"id": "a40a53bf",
"metadata": {
"id": "a40a53bf"
},
"source": [
"### Basic Prompting\n",
"\n",
"We can now generate text by providing a prompt to the model's `generate_content` method:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c2c3a6a1",
"metadata": {
"id": "c2c3a6a1"
},
"outputs": [],
"source": [
"prompt = \"\"\"Write a haiku about a cat sleeping in the sun.\"\"\"\n",
"\n",
"response = model.generate_content(prompt)\n",
"\n",
"print(response.text)"
]
},
{
"cell_type": "markdown",
"id": "5c7d5b9f",
"metadata": {
"id": "5c7d5b9f"
},
"source": [
"The `generate_content` method returns a `GenerateContentResponse` object. The generated text is accessible via the `text` attribute.\n",
"\n",
"### Configuring Generation Parameters\n",
"\n",
"We can control the generated text by specifying additional parameters:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e84dc0c5",
"metadata": {
"id": "e84dc0c5"
},
"outputs": [],
"source": [
"response = model.generate_content(\n",
"    prompt,\n",
"    generation_config=genai.types.GenerationConfig(\n",
"        temperature=0.7,\n",
"        max_output_tokens=100,\n",
"        top_p=0.9,\n",
"        top_k=40\n",
"    )\n",
")\n",
"\n",
"print(response.text)"
]
},
{
"cell_type": "markdown",
"id": "8ff2b528",
"metadata": {
"id": "8ff2b528"
},
"source": [
"- `temperature`: Controls randomness. Lower values produce more focused, deterministic output.\n",
"- `max_output_tokens`: The maximum number of tokens to generate.\n",
"- `top_p`: Nucleus sampling. Set to 1.0 to disable.\n",
"- `top_k`: Limits sampling to the top K most likely tokens.\n",
"\n",
"Experiment with different settings to get a feel for how they impact the generated text.\n",
"\n",
"### Prompt Engineering\n",
"\n",
"The quality of your generated text depends heavily on the prompt you provide. Some tips:\n",
"\n",
"- Be specific. The more context you give the model, the better it can generate relevant output.\n",
"- Use delimiters. Surround your prompt with unique identifiers like XML tags to help the model understand the prompt's structure.\n",
"- Give examples. Provide examples of the desired output format to guide the model.\n",
"- Experiment and iterate. Try different prompts and refine based on the results.\n",
"\n",
"Here's an example of a well-structured prompt for generating product names:"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "5d3d2281",
"metadata": {
"id": "5d3d2281",
"outputId": "30387b73-1285-48c8-cd7f-7ec1ae57dc22",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 35
}
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"VigilantEye X2\n"
]
}
],
"source": [
"prompt = \"\"\"<product>\n",
"A home security camera with night vision and facial recognition.\n",
"</product>\n",
"\n",
"<product_name_examples>\n",
"GuardianEye Pro\n",
"SentryVision Elite\n",
"HomeWatch AI\n",
"</product_name_examples>\n",
"\n",
"<product_name>\"\"\"\n",
"\n",
"response = model.generate_content(prompt)\n",
"\n",
"print(response.text)"
]
},
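{
"cell_type": "markdown",
"id": "streaming-md",
"metadata": {
"id": "streaming-md"
},
"source": [
"### Streaming Responses\n",
"\n",
"For longer generations you don't have to wait for the full response: `generate_content` accepts `stream=True` and returns an iterable of partial chunks. A quick sketch:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "streaming-code",
"metadata": {
"id": "streaming-code"
},
"outputs": [],
"source": [
"# Stream the response chunk by chunk as it is generated\n",
"response = model.generate_content(\n",
"    'Write a short story about a robot learning to paint.',\n",
"    stream=True\n",
")\n",
"\n",
"for chunk in response:\n",
"    print(chunk.text, end='')"
]
},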
{
"cell_type": "markdown",
"id": "8f2c8c9e",
"metadata": {
"id": "8f2c8c9e"
},
"source": [
"## Image Generation\n",
"\n",
"Google also offers image-generation models such as Imagen, but they are not exposed through the `google-generativeai` client used in this notebook; Imagen is served via Vertex AI. Conceptually, the flow mirrors text generation:\n",
"\n",
"1. Instantiate an image model\n",
"2. Provide a text prompt describing the desired image\n",
"3. Specify output parameters like size and format\n",
"4. Generate the image\n",
"\n",
"The cell below sketches that flow, but as its error output shows, this client's `GenerateContentRequest` does not accept image parameters such as `height`:"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "c66d7a48",
"metadata": {
"id": "c66d7a48",
"outputId": "a19d6665-56ab-4c70-b8dd-486d2045f5af",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 375
}
},
"outputs": [
{
"output_type": "error",
"ename": "ValueError",
"evalue": "Unknown field for GenerateContentRequest: height",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)",
"\u001b[0;32m<ipython-input-14-237818af2521>\u001b[0m in \u001b[0;36m<cell line: 5>\u001b[0;34m()\u001b[0m\n\u001b[1;32m 3\u001b[0m \u001b[0mprompt\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m\"A cute corgi puppy wearing sunglasses at the beach\"\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 4\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 5\u001b[0;31m response = imagen.generate_content(\n\u001b[0m\u001b[1;32m 6\u001b[0m \u001b[0mprompt\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 7\u001b[0m \u001b[0mheight\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m512\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/usr/local/lib/python3.10/dist-packages/google/generativeai/generative_models.py\u001b[0m in \u001b[0;36mgenerate_content\u001b[0;34m(self, contents, generation_config, safety_settings, stream, **kwargs)\u001b[0m\n\u001b[1;32m 232\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0mkwargs\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 233\u001b[0m ) -> generation_types.GenerateContentResponse:\n\u001b[0;32m--> 234\u001b[0;31m request = self._prepare_request(\n\u001b[0m\u001b[1;32m 235\u001b[0m \u001b[0mcontents\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcontents\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 236\u001b[0m \u001b[0mgeneration_config\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mgeneration_config\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/usr/local/lib/python3.10/dist-packages/google/generativeai/generative_models.py\u001b[0m in \u001b[0;36m_prepare_request\u001b[0;34m(self, contents, generation_config, safety_settings, **kwargs)\u001b[0m\n\u001b[1;32m 213\u001b[0m \u001b[0mmerged_ss\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msafety_types\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnormalize_safety_settings\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmerged_ss\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mharm_category_set\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m\"new\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 214\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 215\u001b[0;31m return glm.GenerateContentRequest(\n\u001b[0m\u001b[1;32m 216\u001b[0m \u001b[0mmodel\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_model_name\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 217\u001b[0m \u001b[0mcontents\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcontents\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m/usr/local/lib/python3.10/dist-packages/proto/message.py\u001b[0m in \u001b[0;36m__init__\u001b[0;34m(self, mapping, ignore_unknown_fields, **kwargs)\u001b[0m\n\u001b[1;32m 574\u001b[0m \u001b[0;32mcontinue\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 575\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 576\u001b[0;31m raise ValueError(\n\u001b[0m\u001b[1;32m 577\u001b[0m \u001b[0;34m\"Unknown field for {}: {}\"\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mformat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__class__\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m__name__\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 578\u001b[0m )\n",
"\u001b[0;31mValueError\u001b[0m: Unknown field for GenerateContentRequest: height"
]
}
],
"source": [
"imagen = genai.GenerativeModel('imagen')\n",
"\n",
"prompt = \"A cute corgi puppy wearing sunglasses at the beach\"\n",
"\n",
"response = imagen.generate_content(\n",
" prompt,\n",
" height=512,\n",
" width=512,\n",
" num_images=1,\n",
" format='PNG'\n",
")\n",
"\n",
"with open('corgi.png', 'wb') as f:\n",
" f.write(response.image)\n",
"\n",
"print(\"Image saved as corgi.png\")"
]
},
{
"cell_type": "markdown",
"id": "4f2c8d33",
"metadata": {
"id": "4f2c8d33"
},
"source": [
"As the `ValueError` above shows, this client version rejects image parameters, so no image bytes are returned. To generate images with Imagen, use the Vertex AI SDK instead; the pattern there is the same idea: a text prompt in, image bytes out, which you can write straight to a file.\n",
"\n",
"Prompt engineering is just as important for image generation as it is for text. Provide as much detail as possible about the desired contents, style, composition, etc.\n",
"\n",
"## Next Steps\n",
"\n",
"This notebook covered the basics of using the Google Generative AI Python client for text and image generation. Some next steps to explore:\n",
"\n",
"- Experiment with multi-turn conversations via `model.start_chat()` for conversational AI\n",
"- Fine-tune models on your own data for custom applications\n",
"- Explore the safety and transparency tools for analyzing and filtering generated content\n",
"- Build a complete application that integrates generative AI capabilities\n",
"\n",
"Refer to the [Google Generative AI documentation](https://ai.google.dev/api/python/google/generativeai) for the complete API reference and more advanced guides. You can also find pre-built solutions and interactive tools in [Google AI Studio](https://cloud.google.com/ai-studio).\n",
"\n",
"Go build something amazing with generative AI!"
]
}
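,
{
"cell_type": "markdown",
"id": "chat-demo-md",
"metadata": {
"id": "chat-demo-md"
},
"source": [
"## Bonus: Multi-turn Chat\n",
"\n",
"As a starting point for the conversational experiments suggested above, `GenerativeModel` supports multi-turn chat via `start_chat`, which keeps the conversation history for you. A minimal sketch using the `gemini-pro` model instantiated earlier:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "chat-demo-code",
"metadata": {
"id": "chat-demo-code"
},
"outputs": [],
"source": [
"chat = model.start_chat(history=[])\n",
"\n",
"response = chat.send_message('What is the tallest mountain on Earth?')\n",
"print(response.text)\n",
"\n",
"# Follow-up messages can rely on the stored conversation history\n",
"response = chat.send_message('How tall is it in meters?')\n",
"print(response.text)"
]
}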
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.8"
},
"colab": {
"provenance": [],
"include_colab_link": true
}
},
"nbformat": 4,
"nbformat_minor": 5
}