@makispl
Created December 16, 2023 14:17
import os

from llmware.prompts import Prompt

# Set the API key, either via the env var or by editing the placeholder directly here:
openai_api_key = os.environ.get("OPENAI_API_KEY", "<YOUR_OPENAI_KEY>")
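# NOTE: `query_res` below is assumed to come from an earlier retrieval step that is
# not included in this gist. A minimal sketch of one way to produce it with llmware's
# Query API follows; the library name "agreements" and the query text "criticality"
# are placeholders, not taken from the original snippet:
from llmware.library import Library
from llmware.retrieval import Query

lib = Library().load_library("agreements")
query_res = Query(lib).text_query("criticality", result_count=20)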
# Create a new prompter using any desired model (e.g., GPT-3.5) and add the query results
prompt_text = "Summarize the criticality provisions"
print (f"\n > Prompting LLM with '{prompt_text}'")
prompter = Prompt().load_model("gpt-3.5-turbo", api_key=openai_api_key)
sources = prompter.add_source_query_results(query_res)
# Prompt the LLM with the sources and query string
responses = prompter.prompt_with_source(prompt_text, prompt_name="summarize_with_bullets")
for response in responses:
    print("\n > LLM response\n" + response["llm_response"])