morteymike / README.md (created June 16, 2023)
LangChain Streaming using Python Generators

# LangChain Streaming Generator

## Background

For most chat applications, we want to stream each token back to the client as it is generated. LangChain's callback support is great for async WebSockets via FastAPI, and it handles this out of the box.
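For context, here is a minimal sketch of that callback pattern, assuming a FastAPI WebSocket endpoint and LangChain's `AsyncIteratorCallbackHandler`; the route name and message handling are illustrative, not part of this gist:

```python
import asyncio

from fastapi import FastAPI, WebSocket
from langchain.callbacks import AsyncIteratorCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

app = FastAPI()


@app.websocket("/chat")
async def chat(websocket: WebSocket):
    await websocket.accept()
    prompt = await websocket.receive_text()

    # The callback handler exposes streamed tokens as an async iterator.
    handler = AsyncIteratorCallbackHandler()
    llm = ChatOpenAI(streaming=True, callbacks=[handler])

    # Run the LLM call concurrently, forwarding each token to the client as it arrives.
    task = asyncio.create_task(llm.agenerate([[HumanMessage(content=prompt)]]))
    async for token in handler.aiter():
        await websocket.send_text(token)
    await task
```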

However, developers migrating from OpenAI's Python library may have difficulty implementing a Python generator along the same lines as the OpenAI library's approach.

## OpenAI Streaming Example

Here's an example of the OpenAI library's streaming generator, from the OpenAI Cookbook:
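The shape of that generator is roughly as follows; this is a sketch of the pre-1.0 `openai` API pattern, not the Cookbook's exact code, and the function name and prompt are illustrative:

```python
import openai


def stream_chat_completion(prompt: str):
    """Yield response tokens from an OpenAI chat completion as they arrive."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # ask the API to stream partial results
    )
    for chunk in response:
        # Not every chunk carries content (e.g. the final stop chunk), so guard for it.
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]


# Usage: iterate the generator and print tokens as they stream back.
for token in stream_chat_completion("Tell me a joke"):
    print(token, end="", flush=True)
```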