@harshsinghal
Last active July 26, 2023 08:37
Accompanying code for a series of posts on Retrieval Augmented Generation, published on LinkedIn by Harsh Singhal: https://www.linkedin.com/in/harshsinghal/
#          ___
#         / ()\
#       _|_____|_
#      | | === | |
#      |_|  O  |_|
#       ||  O  ||
#       ||__*__||
#      |~ \___/ ~|
#      /=\ /=\ /=\
#______[_]_[_]_[_]_______
import argparse
import os

import openai
from langchain.chat_models import ChatOpenAI
from llama_index import (
    GPTVectorStoreIndex,
    SimpleDirectoryReader,
    ServiceContext,
    StorageContext,
    LLMPredictor,
    load_index_from_storage,
)


def main():
    # Initialize the argument parser.
    parser = argparse.ArgumentParser(description='Query the LLM with a given question.')
    # Add the question argument so users can pass their question on the command line.
    parser.add_argument('question', type=str, help='The question you want to ask the LLM.')
    # Parse the arguments provided by the user.
    args = parser.parse_args()

    # Set the OpenAI API key for authentication.
    # NOTE: Keep your API keys private and secure. Avoid hardcoding them in scripts.
    os.environ['OPENAI_API_KEY'] = 'YOUR_API_KEY'
    openai.api_key = 'YOUR_API_KEY'

    # Paths for the documents and the persisted index.
    # Adjust these according to your directory structure.
    documents_folder = "./aws_case_documents/"
    index_name = "./aws_case_documents_index"

    # Initialize the LLM predictor with the desired model and temperature setting.
    llm_predictor = LLMPredictor(llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0))
    # Set up the service context using the initialized LLM predictor.
    service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)
    # Define the storage context, pointing it at the directory where the index is stored.
    storage_context = StorageContext.from_defaults(persist_dir=index_name)
    # Load the previously built index from storage; it will be used to query the documents.
    index = load_index_from_storage(storage_context=storage_context, service_context=service_context)

    # Query the index with the user's question and print the response.
    response = index.as_query_engine().query(args.question)
    print(response)


# Run main() only when the script is executed directly, not when imported as a module.
if __name__ == '__main__':
    main()
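The script loads an index that is assumed to already exist under ./aws_case_documents_index. A minimal sketch of how such an index might be built and persisted with the same llama_index interfaces imported above (run once before querying; assumes the package is installed and a valid OpenAI API key is configured, since embedding calls go to the OpenAI API):

```python
from llama_index import GPTVectorStoreIndex, SimpleDirectoryReader

# Read every file in the documents folder into Document objects.
documents = SimpleDirectoryReader("./aws_case_documents/").load_data()

# Embed the documents and build a vector store index over them.
index = GPTVectorStoreIndex.from_documents(documents)

# Persist the index so the query script can load it with load_index_from_storage.
index.storage_context.persist(persist_dir="./aws_case_documents_index")
```

This is why SimpleDirectoryReader appears in the query script's imports even though it is not called there: the build step and the query step share the same persisted index directory.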
@harshsinghal (Author):

Before running:

  • Replace 'YOUR_API_KEY' with your actual OpenAI API key.
  • Install the required packages (llama-index, langchain, openai).
  • Adjust the documents_folder and index_name paths to match your directory structure.

Run the script using:

python script_name.py "Your question here"
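The quoted question is consumed as a single positional argument. The parsing step can be exercised in isolation with the standard library alone, no API key required (the sample question string is illustrative):

```python
import argparse

# Same parser setup as in the script.
parser = argparse.ArgumentParser(description='Query the LLM with a given question.')
parser.add_argument('question', type=str, help='The question you want to ask the LLM.')

# Simulate: python script_name.py "What services does AWS offer?"
args = parser.parse_args(['What services does AWS offer?'])
print(args.question)  # → What services does AWS offer?
```

Because the argument is positional and typed as a single str, the question must be quoted on the shell command line; otherwise each word would be treated as a separate argument and argparse would exit with an error.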
