

MLOps Reading Group May - Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory
How do we make LLM agents truly production-ready—capable of holding consistent, useful conversations over time, across multiple sessions?
That’s exactly what we’ll explore in the next MLOps Community Reading Group session, featuring the paper:
📄 “Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory”
And this time, we’re bringing in someone very special:
🎙️ Prateek Chhikara, Founding AI Engineer @ Mem0 and co-author of the paper, will be joining us live to share insights, challenges, and lessons learned from building Mem0 in the real world.
🧠 Why Mem0 matters:
Memory is one of the thorniest challenges in deploying LLMs. Mem0 introduces a scalable long-term memory architecture that dynamically extracts, consolidates, and retrieves key information from conversations; an extended variant, Mem0^g, adds graph-based structures to model relationships between conversational elements. The result is AI agents that are more accurate, coherent, and production-ready.
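For a concrete feel for the extract-consolidate-retrieve loop described above, here is a minimal sketch using the open-source mem0 Python package. It assumes the package's Memory.add / Memory.search interface and a default OpenAI-backed configuration; the exact return shapes vary between releases, so treat this as illustrative rather than the paper's implementation and check the Mem0 docs for the current API.

```python
# Minimal sketch of the extract -> consolidate -> retrieve workflow, assuming
# the mem0 Python package (pip install mem0ai) and its Memory.add / Memory.search
# interface. By default it expects an LLM provider key (e.g. OPENAI_API_KEY).
from mem0 import Memory

memory = Memory()  # default config: local vector store + LLM-based extraction

# Session 1: key facts are extracted from the conversation and consolidated.
memory.add(
    [
        {"role": "user", "content": "I'm planning a trip to Tokyo in October."},
        {"role": "assistant", "content": "Great! October is mild in Tokyo."},
    ],
    user_id="alice",
)

# Session 2 (days later): retrieve relevant memories to ground the new reply.
results = memory.search("Where is the user planning to travel?", user_id="alice")

# Return shape varies by version: newer releases wrap hits in {"results": [...]}.
hits = results.get("results", []) if isinstance(results, dict) else results
for hit in hits:
    print(hit.get("memory"))
```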
✅ What you’ll get from this session:
🔹 A deeper understanding of persistent memory design for LLM agents
🔹 Insights into how Mem0 outperforms traditional memory systems in real-world scenarios
🔹 A chance to ask the paper’s co-author your burning questions
🔹 A front-row seat to where LLM agent design is headed next
📅 DATE: Thursday, May 29
🕚 TIME: 11 AM ET
Join the #reading-group channel in the MLOps Community Slack to connect before and after the session. We meet every month—don’t miss this one.