

Ensuring Accurate AI Agent Paths through Tracing and Evaluation
Modern AI agents don’t always follow a straight path, and they don’t always get it right. As agent-based applications move from prototype to production, observability becomes critical to ensuring correct, efficient, and trustworthy behavior.
Join Arize AI, AWS, and CrewAI for an evening focused on tracing and evaluating AI agent workflows. We’ll explore real-world techniques for gaining visibility into agent decisions, verifying tool usage and parameters, and implementing feedback loops that drive continual improvement. Learn how to debug, optimize, and monitor agents in dynamic, non-deterministic environments with the help of modern observability tools and agent frameworks.
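To make the idea of tracing tool usage concrete, here is a minimal, framework-agnostic sketch: a decorator that records each tool call as a span (name, parameters, duration) and flags parameters outside an expected schema. This is illustrative only; it is not the Arize, AWS, or CrewAI API, and `traced_tool`, `SPANS`, and `get_weather` are hypothetical names. Real agent observability stacks export spans to a backend rather than an in-memory list.

```python
import functools
import time

# In-memory trace buffer; a production system would export spans to an
# observability backend instead (this is a simplified, generic sketch).
SPANS = []

def traced_tool(expected_params):
    """Wrap a tool function, recording each call as a span and flagging
    keyword arguments that fall outside the expected parameter schema."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(**kwargs):
            start = time.perf_counter()
            unexpected = sorted(set(kwargs) - set(expected_params))
            result = fn(**kwargs)
            SPANS.append({
                "tool": fn.__name__,
                "params": kwargs,
                "unexpected_params": unexpected,
                "duration_s": time.perf_counter() - start,
            })
            return result
        return wrapper
    return decorator

@traced_tool(expected_params={"city"})
def get_weather(**kwargs):
    # Stand-in tool; a real agent would call an external API here.
    return f"sunny in {kwargs['city']}"

print(get_weather(city="Berlin", units="metric"))
print(SPANS[0]["tool"], SPANS[0]["unexpected_params"])  # flags "units"
```

Feeding spans like these into an evaluation step, and then back into prompt or tool-schema changes, is the kind of feedback loop the session covers.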
Whether you work with Amazon Bedrock, CrewAI, or open-source frameworks like Strands, this event is for developers building the next generation of agentic systems.
5:30 PM – 6:30 PM
Registration & Networking
6:30 PM – 6:45 PM
The Case for Agent Observability
Jason Lopatecki, CEO of Arize
6:45 PM – 7:00 PM
Deploying Enterprise Agents Into Production
Joao Moura, Co-Founder of CrewAI
7:00 PM – 7:30 PM
Instrumenting and Evaluating AI Agents at Scale
AWS + Arize + CrewAI
7:30 PM – 8:30 PM
Networking Reception