The Next Token: How LLMs Predict
In this event, we’ll discuss how exactly an LLM predicts the next token.
In other words, we’ll answer “How does a model predict the next token during generation?”
Assuming a naive, greedy implementation in which any token in the vocabulary is a candidate, the answer is simple: the next token is just the most likely one.
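To make the greedy case concrete, here is a minimal sketch (the toy vocabulary and logit values below are invented for illustration) of turning a model's output logits into a probability distribution and selecting the argmax:

```python
import numpy as np

# Hypothetical logits over a tiny toy vocabulary -- a real LLM produces
# one logit per token in a vocabulary of tens of thousands of tokens.
vocab = ["the", "cat", "sat", "on", "mat"]
logits = np.array([2.0, 0.5, 1.2, 0.1, 3.1])

# Softmax turns raw logits into a probability distribution over the vocabulary.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Greedy decoding: simply pick the single most likely next token.
next_token = vocab[int(np.argmax(probs))]
print(next_token, probs.max())
```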
Join us live to learn how to think about choosing tokens from distributions of tokens that best represent our data, and learn about loss functions and their critical importance within our models.
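As a preview of what "choosing from a distribution" and "loss function" mean in practice, here is a hedged sketch (toy vocabulary, invented logits, and a hypothetical "true" next token, all chosen for illustration): temperature sampling draws the next token from the softmax distribution instead of always taking the argmax, and cross-entropy scores how much probability the model placed on the token that actually followed in the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and invented logits, for illustration only.
vocab = ["the", "cat", "sat", "on", "mat"]
logits = np.array([2.0, 0.5, 1.2, 0.1, 3.1])

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    z = np.exp(x - np.max(x))
    return z / z.sum()

# Sampling: draw the next token from a temperature-scaled distribution.
# Higher temperature flattens the distribution; lower sharpens it toward greedy.
temperature = 0.8
sampled_token = rng.choice(vocab, p=softmax(logits / temperature))

# Loss: cross-entropy is the negative log-probability the model assigned
# to the token that actually came next (hypothetical here).
true_next = "mat"
loss = -np.log(softmax(logits)[vocab.index(true_next)])

print(sampled_token, loss)
```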
We will do our best to root our discussion in the fundamentals of machine learning and LLM application building.
Upon this fundamental action of next-token prediction, we can grow our understanding upward (e.g., toward constraining and optimizing LLM output) and downward (e.g., toward unsupervised pretraining).
This should be fun!
Speakers:
“Dr. Greg” Loughnane is the Co-Founder & CEO of AI Makerspace, where he is an instructor for their AI Engineering Bootcamp. Since 2021 he has built and led industry-leading Machine Learning education programs. Previously, he worked as an AI product manager, a university professor teaching AI, an AI consultant and startup advisor, and an ML researcher. He loves trail running and is based in Dayton, Ohio.
Chris “The Wiz” Alexiuk is the Co-Founder & CTO at AI Makerspace, where he is an instructor for their AI Engineering Bootcamp. During the day, he is also a Developer Advocate at NVIDIA. Previously, he was a Founding Machine Learning Engineer, Data Scientist, and ML curriculum developer and instructor. He’s a YouTube content creator whose motto is “Build, build, build!” He loves Dungeons & Dragons and is based in Toronto, Canada.
Follow AI Makerspace on LinkedIn and YouTube to stay updated about workshops, new courses, and corporate training opportunities.