LangChain Example
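Two scripts that use ClickHouse as a LangChain.js vector store: the first loads the LangSmith user guide, splits it into chunks, and writes the embedded chunks to a local ClickHouse server; the second builds a retrieval chain over the same store and asks it a question.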
// Ingestion: load a web page, split it into chunks, and store the embedded
// chunks in ClickHouse.
import { CheerioWebBaseLoader } from "langchain/document_loaders/web/cheerio";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { ClickHouseStore } from "@langchain/community/vectorstores/clickhouse";
import { OpenAIEmbeddings } from "@langchain/openai";

const openAIApiKey = "sk-xxx";

// Connect to a locally running ClickHouse server, using OpenAI to embed the documents.
const vectorStore = await ClickHouseStore.fromExistingIndex(
  new OpenAIEmbeddings({ openAIApiKey }),
  {
    host: "localhost",
    port: 8123,
    protocol: "http://",
    indexType: "hypothesis"
  }
);

// Load the LangSmith user guide and split it into smaller documents.
const loader = new CheerioWebBaseLoader(
  "https://docs.smith.langchain.com/user_guide"
);
const docs = await loader.load();

const splitter = new RecursiveCharacterTextSplitter();
const splitDocs = await splitter.splitDocuments(docs);

// Wait for the documents (and their embeddings) to be written to ClickHouse.
await vectorStore.addDocuments(splitDocs);
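The retrieval step below can be run as a separate script once ingestion has finished; it reconnects to the same ClickHouse store (the config omits a database and table, so the store's defaults are reused) and wires the retriever into a question-answering chain.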
// Retrieval: reconnect to the ClickHouse vector store and answer a question
// using the retrieved documents as context.
import { ClickHouseStore } from "@langchain/community/vectorstores/clickhouse";
import { createRetrievalChain } from "langchain/chains/retrieval";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";

const openAIApiKey = "sk-xxx";

const vectorStore = await ClickHouseStore.fromExistingIndex(
  new OpenAIEmbeddings({ openAIApiKey }),
  {
    host: "localhost",
    port: 8123,
    protocol: "http://",
    indexType: "hypothesis"
  }
);

// gpt-3.5-turbo-0125 is a chat model, so use the ChatOpenAI class.
const model = new ChatOpenAI({
  temperature: 0,
  openAIApiKey,
  modelName: "gpt-3.5-turbo-0125"
});

const retriever = vectorStore.asRetriever();

const prompt = ChatPromptTemplate.fromTemplate(
  `Answer the following question based only on the provided context:

<context>
{context}
</context>

Question: {input}`
);

// Stuff the retrieved documents into the prompt and send it to the model.
const documentChain = await createStuffDocumentsChain({
  llm: model,
  prompt
});

const retrievalChain = await createRetrievalChain({
  retriever,
  combineDocsChain: documentChain
});

const response = await retrievalChain.invoke({ input: "what is LangSmith?" });
console.log(response);
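// The chain's output should include the original input, the retrieved context
// documents, and the generated answer (assuming the standard createRetrievalChain
// output shape), so you can log just the answer instead of the whole object:
console.log(response.answer);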