

MiniMax-M1 Technical Seminar
The world's first open-weight, large-scale hybrid-attention reasoning model
Join us for the inaugural technical seminar on MiniMax-M1, the world's leading open-source LLM for long-context reasoning, featuring a 1 million-token input context and an 80K-token output length. This session offers an exclusive opportunity to connect with top minds in AI and gain deep insight into M1's architecture, performance, and use cases.
What to Expect
This online seminar will be the first official deep dive into the core innovations behind M1. Members of the development team will present technical overviews covering model design, inference performance, and practical applications. The event will also include panel discussions and a live Q&A session to foster open exchange within the global AI community.
📢 Speakers & Panelists
Junjie Yan – CEO/CTO, MiniMax
Jun Qing – Technical Staff, MiniMax
Jiayuan Song – Technical Staff, MiniMax
Shunyu Yao – Research Scientist, Anthropic
Junxian He – Assistant Professor, HKUST
Wenhu Chen – Assistant Professor, University of Waterloo
Songlin Yang – PhD Student, MIT CSAIL
Arthur Zucker – Head of Transformers, Hugging Face
Kaichao You – Core Developer, vLLM
Akim Tsvigun – Senior ML Solutions Architect, Nebius
Arjun Krishna – AI Researcher, Writer
Liangsheng Yin – Core Developer, SGLang
🧠 Agenda
Technical Keynote
Model Architecture & Algorithm Design – 20 min
Inference Capabilities & Real-World Applications – 20 min
Panel Discussions
Model & Algorithm – 20 min
Inference – 20 min
Q&A Session
Open Discussion and Audience Questions – 15–20 min
📅 Date & Time
Thursday, July 10, 2025
4:00 PM (PDT)
7:00 PM (EDT)
7:00 AM (CST/Beijing Time, July 11)
Format: Zoom (Online Only) / Limited seats
Get to know MiniMax-M1 on Hugging Face and GitHub, or read the Tech Report.