@samkeen
Created July 21, 2023 21:10
# Supply a custom prompt to a LangChain QA Chain
from langchain.chains.question_answering import load_qa_chain
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

# System prompt: instructs the model to answer only from the supplied context.
system_template = """Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Question: {question}
Helpful Answer:"""

messages = [
    SystemMessagePromptTemplate.from_template(system_template),
    HumanMessagePromptTemplate.from_template("{question}"),
]
chat_prompt = ChatPromptTemplate.from_messages(messages)

# Instantiate a Question/Answer LangChain chain with the custom prompt.
# The "stuff" chain type inserts all supplied documents into {context} in a single call.
# `llm` is assumed to be a chat model (e.g. ChatOpenAI) instantiated elsewhere.
chain = load_qa_chain(llm, chain_type="stuff", verbose=True, prompt=chat_prompt)
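
# Usage sketch (an assumption, not part of the original gist): it shows how the chain
# above is typically invoked. In practice `docs` would come from a retriever or vector
# store; here a Document is constructed inline to keep the example self-contained.
from langchain.schema import Document

docs = [
    Document(page_content="LangChain chains compose prompts and LLM calls into reusable pipelines.")
]
# The chain fills {context} with the documents' text and {question} with the query.
answer = chain.run(input_documents=docs, question="What do LangChain chains do?")
print(answer)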