Visium: Building a RAG System Locally with Ollama, LlamaIndex, and Chroma DB
Past Event
About Event
This workshop will introduce how Retrieval-Augmented Generation (RAG) works and how to set up a RAG system on your own device using Ollama, LlamaIndex, and Chroma DB.
You’ll explore how RAG improves AI-generated responses by retrieving relevant information from a vector database.
We’ll guide you through installing and configuring the necessary tools and demonstrate how to store and query your data.
By the end, you’ll be able to efficiently retrieve and generate answers based on your local documents!
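To give a feel for what happens under the hood, here is a minimal, self-contained sketch of the RAG retrieval step. It is a toy illustration only: the bag-of-words `embed` function stands in for a real embedding model, and the in-memory list stands in for Chroma DB. In the actual workshop setup, Chroma DB stores the embeddings, LlamaIndex orchestrates retrieval, and a model served by Ollama generates the final answer.

```python
import math

def embed(text: str) -> dict[str, float]:
    """Toy stand-in for a real embedding model: a bag-of-words count vector."""
    vec: dict[str, float] = {}
    for word in text.lower().split():
        word = word.strip(".,?!")
        vec[word] = vec.get(word, 0.0) + 1.0
    return vec

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# 1. "Index" local documents (a real system would store these vectors in Chroma DB).
documents = [
    "Ollama runs large language models locally on your own device.",
    "Chroma DB is a vector database for storing embeddings.",
    "LlamaIndex connects language models to external data sources.",
]
index = [(doc, embed(doc)) for doc in documents]

# 2. Retrieve the document most similar to the query.
query = "Which tool stores embeddings in a vector database?"
q_vec = embed(query)
best_doc, _ = max(index, key=lambda item: cosine(q_vec, item[1]))

# 3. Augment the prompt: the retrieved context grounds the model's answer,
#    which is what lets RAG answer questions about your own documents.
prompt = f"Context: {best_doc}\n\nQuestion: {query}\nAnswer:"
```

The key design point this sketch shows is that the language model never searches your documents itself; the retriever picks the most relevant chunk first, and only that chunk is placed into the prompt.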
If you don’t have a device with you, we encourage you to go through the setup in groups.
IMPORTANT: Basic experience with Python and Jupyter Notebook is required to attend this workshop.