@heathermiller
Created January 11, 2024 18:57
LCEL
# Imports follow the LangChain package layout in use when this gist was written (early 2024).
from langchain import hub
from langchain.agents import AgentExecutor, Tool
from langchain.agents.format_scratchpad import format_log_to_str
from langchain.agents.output_parsers import SelfAskOutputParser
from langchain_community.utilities import SerpAPIWrapper
from langchain_openai import OpenAI

# The prompt is pulled from the LangSmith Hub, which hosts many different prompts.
prompt = hub.pull("hwchase17/self-ask-with-search")
llm = OpenAI(temperature=0)

# Provide the LM with a useful search tool; the self-ask prompt expects it to be named "Intermediate Answer".
search = SerpAPIWrapper()
tools = [Tool(name="Intermediate Answer", func=search.run,
              description="useful for when you need to ask with search")]
# Stop generation once the model asks for an intermediate answer, so the tool can supply it.
llm_with_stop = llm.bind(stop=["\nIntermediate answer:"])

# Compose the agent as an LCEL pipeline: input mapping | prompt | LLM | output parser.
agent = (
    {
        "input": lambda x: x["input"],
        # Use a custom observation_prefix/llm_prefix for formatting the scratchpad.
        "agent_scratchpad": lambda x: format_log_to_str(
            x["intermediate_steps"], observation_prefix="\nIntermediate answer: ", llm_prefix=""),
    }
    | prompt
    | llm_with_stop
    | SelfAskOutputParser()
)

# Wrap the agent in an executor that runs the tool-calling loop.
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "What is the hometown of the reigning men's U.S. Open champion?"})
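
For reference: OpenAI() and SerpAPIWrapper() pick up their credentials from the OPENAI_API_KEY and SERPAPI_API_KEY environment variables, so both need to be set before the objects above are constructed. AgentExecutor.invoke returns a dict, and the final answer lands under its "output" key. A minimal sketch of capturing and printing that answer (the result variable name is illustrative):

# Capture the executor's return value instead of discarding it.
result = agent_executor.invoke(
    {"input": "What is the hometown of the reigning men's U.S. Open champion?"}
)
print(result["output"])  # the agent's final answer as a string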