LLM Engineering Foundations: The Transformer

About Event

Join us for a live workshop on LLM Engineering Foundations: The Transformer!

Many AI practitioners do not have a clear understanding of how Large Language Models actually process text.

During this event, you’ll develop an intuition for how transformers work, from attention heads to tokens, embeddings, and parameters. You’ll also learn to differentiate between BART, BERT, and GPT-style models, and to choose the right style of model for the right use case. Finally, we’ll demonstrate exactly how to leverage and evaluate each style of transformer model directly in Python code.
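To give a flavor of what leveraging each style of transformer in Python can look like, here is a minimal sketch using the Hugging Face transformers library. The checkpoints shown (bert-base-uncased, gpt2, facebook/bart-large-cnn) are illustrative choices, not necessarily the ones used in the event notebook.

```python
from transformers import pipeline

# Encoder-only (BERT-style): masked-token prediction and other "understanding" tasks.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
print(fill_mask("Transformers process text as a sequence of [MASK].")[0]["token_str"])

# Decoder-only (GPT-style): open-ended text generation.
generate = pipeline("text-generation", model="gpt2")
print(generate("Attention lets a transformer", max_new_tokens=20)[0]["generated_text"])

# Encoder-decoder (BART-style): sequence-to-sequence tasks such as summarization.
summarize = pipeline("summarization", model="facebook/bart-large-cnn")
article = (
    "Large Language Models are built from transformer blocks that combine "
    "self-attention with feed-forward layers. Encoder-only, decoder-only, and "
    "encoder-decoder variants trade off understanding and generation differently."
)
print(summarize(article, max_length=30, min_length=5)[0]["summary_text"])
```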

Special thanks to Coding Temple for partnering with us on this event!

All attendees can follow along with us directly in Google Colab during the event.


Speakers:
- Dr. Greg Loughnane is the Founder & CEO of AI Makerspace, where he serves as an instructor for their LLM Engineering and LLM Ops: LLMs in Production courses. Since 2021 he has built and led industry-leading Machine Learning & AI boot camp programs. Previously, he worked as an AI product manager, a university professor teaching AI, an AI consultant and startup advisor, and an ML researcher. He loves trail running and is based in Dayton, Ohio.

- Chris Alexiuk is the Co-Founder & CTO of AI Makerspace, where he serves as an instructor for their LLM Engineering and LLM Ops: LLMs in Production courses. A former Data Scientist, he also works as the Founding Machine Learning Engineer at Ox. As an experienced online instructor, curriculum developer, and YouTube creator, he’s always learning, building, shipping, and sharing his work! He loves Dungeons & Dragons and is based in Toronto, Canada.