


Databricks DevConnect | Chicago, IL
Calling all data engineers!
Join us for Databricks DevConnect Chicago on Tuesday, August 19, from 5:00 PM to 9:00 PM CT!
Databricks DevConnect is a technical meetup designed for data engineers to experience an immersive evening of insights, collaborative learning, and community, hosted by the Databricks Developer Relations team. Bringing together data enthusiasts and innovators, we'll explore the latest advancements in Databricks data engineering, AI, and analytics. You'll have the chance to dive deep into technical discussions, share real-world experiences, and discover new ways to leverage Databricks technologies. Whether you're a seasoned data engineer or just starting your journey, this event is designed to spark creativity and foster growth in our community. Let's connect, learn, and push the boundaries of what's possible with data, together!
Why You Can't Miss This Event:
Unlock the Power of Databricks: Discover how our unified platform can power your data engineering and AI initiatives.
Learn from Experts: Gain actionable insights from Databricks Engineers, Product Managers, DevRel, and MVPs.
Engaging Technical Sessions: Deepen your technical expertise through interactive demos of the Databricks platform and tools.
Network with the Community: Connect and build relationships with peers, MVPs, Databricks product managers, developer advocates, and engineers.
Access to Exclusive Content and Goodies: Swag will be raffled off throughout the event, and attendees will get access to hands-on training and guided labs through Databricks Academy Labs after the event!
AGENDA
5:00 PM: Registration & Mingling
6:00 PM: Welcome Remarks
6:15 PM: Session #1: Achieving the Lakehouse Vision: Format Interoperability with Delta Lake UniForm and Unified Governance with Unity Catalog
Spencer Cook, Lead Solutions Architect, Databricks
6:45 PM: Session #2: Agent Bricks: Auto Optimized from Pilot to Production
Luke Gardner, Solutions Architect, Databricks
7:15 PM: Session #3: Building Intelligent Data Pipelines with Lakeflow and AI
Franco Patano, Principal Specialist Solutions Architect, Databricks
7:55 PM: Closing Remarks
8:00 PM: Networking Reception
9:00 PM: Good Night
SESSION DESCRIPTIONS
โโAchieving the Lakehouse Vison: Format Interoperability with Delta Lake Uniform and Unified Governance with Unity Catalog: This technical session demonstrates how Databricks' Unity Catalog simplifies data integration and governance across Lakehouse architecture by removing silos between table formats and compute engines. Learn how Unity Catalog enables seamless connectivity to diverse data sources and sinks, providing full format interoperability, while empowering data engineers to implement robust governance and security measures. We will delve into leveraging open standards for data management, implementing fine-grained access controls, utilizing advanced lineage tracking, and establishing secure collaboration workflows across catalogs and compute engines. Discover how Unified Governance leads to successful AI strategy by unlocking the value of your data.
Agent Bricks: Auto Optimized from Pilot to Production: Agent Bricks is a new way to build AI agents: domain- and task-specific, auto-optimized for quality, cost, and production readiness. Developed by Mosaic AI Research, it removes the heavy lifting of tuning and evaluation so you can quickly go from prototype to production. We'll walk through four types of agents: Information Extraction, Knowledge Assistant, Multi-Agent Supervisors, and Custom LLM Agents. Expect a live build and a look at how Agent Bricks makes deploying agents fast, practical, and low-effort. Worth checking out if you're thinking about getting real-world value from agents without all the trial and error.
Building Intelligent Data Pipelines with Lakeflow and AI: Transform your data engineering workflow with AI-native pipeline development on the Databricks Data Intelligence Platform. Discover how artificial intelligence is revolutionizing data pipeline creation, enabling both technical and non-technical users to build sophisticated ETL processes through natural language interactions and intelligent automation. In this hands-on session, we'll demonstrate Lakeflow Designer's drag-and-drop interface, explore the new AI-assisted IDE for data engineering, and showcase real-world examples of organizations processing gigabytes of data per minute with minimal manual intervention. Whether you're a seasoned data engineer looking to supercharge productivity or a business analyst building your first pipeline, you'll learn how AI makes enterprise-grade data engineering accessible to everyone while maintaining production reliability and governance through Unity Catalog integration.
SPEAKERS
Franco Patano, Principal Specialist Solutions Architect, Databricks
Spencer Cook, Lead Solutions Architect, Databricks
Luke Gardner, Solutions Architect, Databricks