@nirajpandkar
Created June 26, 2023 04:26
A simple LlamaIndex demo to query local files using the OpenAI API.
# Import necessary packages
import os
from llama_index import GPTSimpleVectorIndex, download_loader
os.environ['OPENAI_API_KEY'] = '<openai_api_key>'
# Load the custom data source. Here it is read from the ./data directory.
SimpleDirectoryReader = download_loader("SimpleDirectoryReader")
loader = SimpleDirectoryReader('./data', recursive=True, exclude_hidden=True)
documents = loader.load_data()
# Create an index of the documents in the data directory
index = GPTSimpleVectorIndex.from_documents(documents)
# Query the index in a loop for relevant answers
while True:
    prompt = input("Type prompt... ")
    response = index.query(prompt)
    print(response)
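
Rebuilding the index re-embeds every document (and calls the OpenAI API again) on each run. Below is a minimal persistence sketch, assuming the same pre-0.6 llama_index API used above and a hypothetical index.json save path; save_to_disk and load_from_disk are that API's old-style persistence helpers, and `documents` is the list loaded earlier.

# Optional: persist the index so later runs can skip re-embedding (pre-0.6 API).
import os.path

INDEX_PATH = "index.json"  # hypothetical save location

if os.path.exists(INDEX_PATH):
    # Reuse the previously built index instead of re-embedding the documents.
    index = GPTSimpleVectorIndex.load_from_disk(INDEX_PATH)
else:
    # First run: build the index from the loaded documents and save it to disk.
    index = GPTSimpleVectorIndex.from_documents(documents)
    index.save_to_disk(INDEX_PATH)

On this older API, index.query() behaves the same on a loaded index, so the prompt loop above needs no changes.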