@mattppal
Last active December 20, 2023 20:51
langchain-yt-demo
{
"cells": [
{
"cell_type": "code",
"execution_count": 18,
"id": "0b31cddd",
"metadata": {},
"outputs": [],
"source": [
"# Load API keys from local .env files kept outside the repo\n",
"import os\n",
"from dotenv import dotenv_values\n",
"\n",
"secrets_dir = os.path.expanduser(\"~/Documents/secrets\")\n",
"\n",
"config = {\n",
" **dotenv_values(secrets_dir + \"/.env.openai\"),\n",
" **dotenv_values(secrets_dir + \"/.env.langchain\")\n",
"}\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = config.get(\"OPENAI_API_KEY\")\n",
"os.environ[\"LANGCHAIN_API_KEY\"] = config.get(\"LANGCHAIN_API_KEY\")\n",
"os.environ[\"LANGCHAIN_TRACING_V2\"] = \"true\"\n",
"os.environ[\"LANGCHAIN_ENDPOINT\"] = \"https://api.smith.langchain.com\""
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "6d837531",
"metadata": {},
"outputs": [],
"source": [
"from langchain import hub\n",
"from langchain.chat_models import ChatOpenAI\n",
"\n",
"llm = ChatOpenAI()\n",
"\n",
"# Fetches the latest version of this prompt\n",
"prompt = hub.pull(\"muhsinbashir/youtube-transcript-to-article\")"
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "db546a16",
"metadata": {},
"outputs": [],
"source": [
"from langchain.schema import StrOutputParser\n",
"\n",
"# Compose an LCEL pipeline: prompt -> chat model -> plain-string output\n",
"chain1 = (\n",
" prompt\n",
" | llm\n",
" | StrOutputParser()\n",
")"
]
},
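{
"cell_type": "markdown",
"id": "usage-sketch-md",
"metadata": {},
"source": [
"A minimal single-input sketch (hypothetical sample text; assumes the hub prompt expects a `transcript` variable, as the batch call below does):\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "usage-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"# Run one transcript string through the chain; StrOutputParser returns a plain str\n",
"sample = \"Hello everyone, today we'll take a quick look at LangChain templates.\"\n",
"article = chain1.invoke({\"transcript\": sample})\n",
"print(article)"
]
},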
{
"cell_type": "code",
"execution_count": 21,
"id": "b96c4326",
"metadata": {},
"outputs": [],
"source": [
"from langchain.document_loaders import DirectoryLoader\n",
"\n",
"# Load transcript file(s) matching the glob from the local transcripts folder\n",
"loader = DirectoryLoader(os.path.expanduser(\"~/Documents/yt-transcripts\"), glob=\"**/langchain.txt\")\n",
"docs = loader.load()"
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "3e652aac",
"metadata": {},
"outputs": [],
"source": [
"# Pass the raw transcript text (not the whole Document, whose repr includes metadata)\n",
"out = chain1.batch([{\"transcript\": t.page_content} for t in docs])"
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "f0842830",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Introducing Lang Chain Templates: The Easiest Way to Build Production-Ready LLM Applications\n",
"\n",
"Lang Chain, co-founded by Harrison, is excited to announce the launch of a new feature called Lang Chain Templates. These templates are designed to simplify and expedite the process of building production-ready LLM (Language Model) applications. They serve as reference architectures for various LLM use cases, all presented in a standardized format that seamlessly integrates with Lang Serve. In this article, we will explore how to utilize Lang Chain Templates, discuss their features and benefits, and guide you through the deployment process.\n",
"\n",
"To get started with Lang Chain Templates, the first step is to install the Lang Chain CLI. You can find detailed instructions on the \"Getting Started\" page on the Lang Chain Templates website. Once the CLI is successfully installed, you can create a new app by running the provided command. This will generate a new app named \"my app.\" Upon inspecting the app directory, you will notice two main packages: \"app\" and \"packages.\" The \"app\" folder contains all the necessary logic for running the Lang Serve app, while the \"packages\" folder is where you will add the ready-to-use LLM application templates.\n",
"\n",
"To add a template, execute the \"add\" command followed by the desired template name. For example, let's add the \"pirate speak\" template, which converts any input text into pirate speak using an LLM. The command will prompt you to install the template as a dependency. Confirm the installation by selecting \"yes.\" Additionally, you will be asked if you want to generate route code for these packages. Opting for \"yes\" will generate the required code, which you can then insert into your Lang Serve app's server.py file.\n",
"\n",
"Before proceeding further, it is essential to set up Lang Smith tracing and your Open AI API key. Ensure you have completed this step for a seamless experience. Once everything is set up, you can start up the application by running the \"Lang Train Start\" command. This will initiate the deployment process.\n",
"\n",
"Upon successful deployment, you will gain access to the Lang Chain Templates playground. This interactive playground allows you to explore and test the deployed LLM chain. By visiting the provided link, you can input text and witness the magic of LLM translation in action. For instance, entering \"Hi\" will be translated into the pirate speak phrase, \"Ho there!\" Additionally, Lang Smith integration enables you to view the translation history and make changes effortlessly.\n",
"\n",
"One of the most remarkable aspects of Lang Chain Templates is its flexibility. You can easily modify the templates to suit your specific needs. Instead of merely installing a pre-built LLM chain, the code is copied into a folder within the packages directory. This allows for seamless updates when you make changes to the code within that folder. For example, you can edit the code to translate user input into Italian. Upon saving the changes, the playground will reflect the update, and subsequent translations will be in Italian.\n",
"\n",
"In addition to the playground, Lang Serve also provides comprehensive documentation. This documentation covers various endpoints, such as the invoke endpoint, batch endpoint, stream endpoint, and stream log endpoint. These endpoints allow you to interact with the deployed LLM chain programmatically, outside of the playground. While the playground serves as an internal development tool and a means to showcase the LLM chain to team members, you have the freedom to create your own playground using these production-ready endpoints.\n",
"\n",
"Lastly, Lang Chain Templates offers a wide array of templates to choose from. Lang Chain has collaborated with numerous partners to curate an extensive collection of templates catering to different use cases. Some popular use cases include advanced retrieval techniques, open source models, extraction, and more. The intention is to provide users with a seamless experience of browsing, downloading, remixing, and customizing these templates to fit their specific requirements. With Lang Serve, deploying your customized templates is a breeze.\n",
"\n",
"Give Lang Chain Templates a try and let us know what you think. We welcome your contributions and suggestions for new templates. Lang Chain is committed to making the process of building LLM applications as straightforward and efficient as possible. With Lang Chain Templates, you can save time, enhance productivity, and unlock the full potential of LLM technology. Get started today and revolutionize your language-based applications!\n"
]
}
],
"source": [
"print(out[0])"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "py3-default",
"language": "python",
"name": "pyenv_py3-default"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.0"
}
},
"nbformat": 4,
"nbformat_minor": 5
}