
A simple Retrieval-Augmented Generation (RAG) app. It reads a PDF and answers your questions using a vector store and an LLM.
How it works:
1) Split data/story.pdf into chunks.
2) Embed the chunks and store them in a local Chroma DB.
3) Retrieve the chunks most similar to your question.
4) Ask the LLM to answer using that retrieved context.

Prerequisites: Python 3.13+ and uv installed.
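Step 1 (chunking) can be sketched in plain Python. The window and overlap sizes below are illustrative assumptions, not the app's actual settings:

```python
def split_into_chunks(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Slide a window of chunk_size characters, stepping by chunk_size - overlap.

    A minimal sketch; the real app may split on pages or sentences instead.
    """
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, max(len(text) - overlap, 1), step)]

chunks = split_into_chunks("a" * 500, chunk_size=200, overlap=50)
print(len(chunks))  # → 3 overlapping chunks
```

Overlap keeps a sentence that straddles a chunk boundary fully visible in at least one chunk, which tends to improve retrieval.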
Create a .env file with your key:
GOOGLE_API_KEY=your-google-api-key
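Loading that key might look like the stdlib-only sketch below. This is an assumption about what src/config.py does; the real code may use python-dotenv instead:

```python
import os

def load_env(path: str = ".env") -> None:
    """Parse simple KEY=VALUE lines from a .env file into os.environ.

    Hypothetical sketch: skips blanks and comments, does not handle quoting.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

# Usage (requires a .env file to exist):
# load_env()
# api_key = os.environ["GOOGLE_API_KEY"]
```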
Put your PDF at data/story.pdf (or change the path in src/main.py). Install dependencies:

uv sync
Ask a question from the terminal:
uv run -m src.main "What is SHE doing in India?"
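Under the hood, step 4 stuffs the retrieved chunks and your question into a single prompt. A hypothetical sketch (the function name and template are assumptions, not taken from src/rag.py):

```python
def build_prompt(question: str, chunks: list[str]) -> str:
    """Assemble retrieved chunks and the user's question into one LLM prompt."""
    context = "\n\n".join(chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_prompt(
    "What is SHE doing in India?",
    ["She travels to India.", "Her trip through Delhi."],
)
```

The "using only the context below" instruction is what grounds the answer in the PDF rather than the model's general knowledge.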
The vector store is persisted to chroma_langchain_db/. Source files: src/main.py, src/vectorstore.py, src/rag.py, src/config.py.
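Step 3 (retrieval) boils down to nearest-neighbour search over embeddings. A toy, stdlib-only illustration with hand-written 2-D vectors; Chroma does this for real with proper embeddings and indexing:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def retrieve(query_vec: list[float], store: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """store holds (chunk_text, embedding) pairs; return the k most similar chunks."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Toy data: embeddings are made up for illustration.
store = [
    ("She travels to India.", [0.9, 0.1]),
    ("The weather was cold.", [0.1, 0.9]),
    ("Her trip through Delhi.", [0.8, 0.3]),
]
print(retrieve([1.0, 0.0], store, k=2))
# → ['She travels to India.', 'Her trip through Delhi.']
```

The two chunks closest to the query vector come back first, which is exactly what the app feeds to the LLM as context.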