

Ragas Paper Club #1 – The Illusion of Thinking
Large Reasoning Models look smart, right up until problems get hard. The Apple ML paper we’re unpacking shows LRMs sailing through easy puzzles, improving with a bit more “thinking,” then collapsing once complexity crosses a threshold.
On July 3 @ 09:30 AM PT we’ll dig into:
• The puzzle sandbox that lets the authors dial problem complexity with surgical precision (see the quick sketch after this list).
• Why extra chain-of-thought can hurt on simple tasks and still fail on hard ones.
• What “accuracy collapse” means for eval-driven development, benchmark design, and agents in prod.
• A quick look at the response paper The Illusion of the Illusion of Thinking and the open critiques around it.
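For a taste of that complexity dial, here’s a minimal Python sketch (our illustration, not the authors’ actual harness) using Tower of Hanoi, one of the puzzles in the paper: a single knob, the number of disks, doubles the minimum solution length every time you turn it.

# Minimal sketch (not the paper's code) of how a puzzle sandbox "dials" complexity:
# in Tower of Hanoi, one parameter -- the number of disks n -- fixes the
# minimum solution length at 2**n - 1 moves.

def hanoi_moves(n: int, src: str = "A", aux: str = "B", dst: str = "C") -> list[tuple[str, str]]:
    """Return the optimal move sequence for n disks as (from, to) pairs."""
    if n == 0:
        return []
    return (
        hanoi_moves(n - 1, src, dst, aux)   # move n-1 disks out of the way
        + [(src, dst)]                      # move the largest disk
        + hanoi_moves(n - 1, aux, src, dst) # stack the n-1 disks back on top
    )

if __name__ == "__main__":
    for n in range(1, 11):
        # Each extra disk doubles the required reasoning steps -- this is the
        # knob the paper turns until model accuracy collapses.
        print(f"disks={n:2d}  minimum moves={len(hanoi_moves(n)):4d}")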
Speakers
• Shahul – Founder, Ragas
• Nandu – CTO, Deepmost
20 min walkthrough → 15 min live Q&A.
Live and free, on Zoom.
Slides, notes, and code links shared afterward.
See you in the chat!