Train and Deploy Your Own SLM 2.0
Workshop: Train and Deploy Your Own Small Language Model (SLM) with distil labs
Join us for an interactive workshop with distil labs, where you'll learn to train your own small language model (SLM) on their platform and deploy it in a local RAG system.
What you’ll need:
This is a technical workshop, so make sure you meet the minimum technical prerequisites:
A laptop
Familiarity with Jupyter notebooks and Python code
Basic understanding of large language models (LLMs)
Basic understanding of RAG systems
We recommend joining only if you can answer most of the following questions:
What is a token?
What does attention do in an LLM?
What does recall mean for a RAG system?
Why do RAG systems often use reranking?
Agenda:
1. Introduction: Brief overview of distil labs and their methodology
2. Hands-On Training: Guided session on training an SLM for question answering
3. Deployment Walkthrough: Step-by-step guide to deploying an SLM in a RAG system on your laptop
This workshop is perfect for anyone eager to explore practical applications of NLP hands-on. See you there!
LIMITED SEATS AVAILABLE: Register now