Turning AI into Action: why LLMs + Iceberg + AWS = ❤️ and how to build AI products at scale
Join us for an immersive session at the San Francisco AWS Loft, where you will:
1. Pick up some tricks on using LLMs in end-to-end data workflows with Iceberg and AWS S3
2. Hear from an industry leader on lessons learned while building AI products at scale at Snowflake
👉 Please register on Luma and bring a valid photo ID upon arrival.
Agenda
5:30PM: Doors open, mingling over light refreshments
6:30PM: James Cha-Earley, "AI Data Stack in Action: An OSS Advantage with AWS"
7:00PM: Fireside chat with Anupam Datta, research leader @ Snowflake AI
7:30PM onwards: more mingling
Session Details
Speaker: James Cha-Earley
Title: AI Data Stack in Action: An OSS Advantage with AWS
Snowflake Developer Advocate James Cha-Earley will share cutting-edge techniques for transforming your AWS S3 data repositories into actionable insights using Snowflake Cortex LLMs and Apache Iceberg tables.
This session is for data engineers and practitioners looking to level up their AI data stack with real-world, scalable LLM capabilities.
You’ll explore how to:
Automate classification and entity extraction directly within your data workflows
Generate natural language insights from complex trends and anomalies
Build responsive, conversational interfaces on top of your warehouse
Apply AI-powered data quality checks at scale
Use multiple LLMs with Iceberg tables for high-performance querying across large datasets
Speakers: James Cha-Earley & Anupam Datta
Title: Fireside chat on building AI products at scale