Cover Image for Databricks DevConnect | Chicago, IL
Presented by
DevConnect
72 Going
Registration
Approval Required
Your registration is subject to approval by the host.
Welcome! To join the event, please register below.
About Event

🔊🌟 Calling all data engineers! 🌟🔊

Join us for Databricks DevConnect Chicago on Tuesday, August 19, from 5:00 PM to 9:00 PM CT!


Databricks DevConnect is a technical meetup designed for data engineers to experience an immersive evening of insights, collaborative learning, and community – hosted by the Databricks Developer Relations team. Bringing together data enthusiasts and innovators, we’ll explore the latest advancements in Databricks data engineering, AI, and analytics. You’ll have the chance to dive deep into technical discussions, share real-world experiences, and discover new ways to leverage Databricks technologies. Whether you’re a seasoned data engineer or just starting your journey, this event is designed to spark creativity and foster growth in our community. Let’s connect, learn, and push the boundaries of what’s possible with data – together!


Why You Can't Miss This Event:

  • 🔓 Unlock the Power of Databricks: Discover how our unified platform can power your data engineering and AI initiatives.

  • 🧠 Learn from Experts: Gain actionable insights from Databricks Engineers, Product Managers, DevRel, and MVPs.

  • 💻 Engaging Technical Sessions: Deepen your technical expertise through interactive demos of the Databricks platform and tools.

  • 🤝 Network with the Community: Connect and build relationships with peers, MVPs, Databricks product managers, developer advocates, and engineers.

  • 👕 Access Exclusive Content and Goodies: Swag will be raffled off throughout the event, and attendees will get access to hands-on training and guided labs through Databricks Academy Labs after the event!


AGENDA

  • 5:00 PM: Registration & Mingling 

  • 6:00 PM: Welcome Remarks

  • 6:15 PM ➡️ Session #1: Achieving the Lakehouse Vision: Format Interoperability with Delta Lake Uniform and Unified Governance with Unity Catalog

  • 6:45 PM ➡️ Session #2: Agent Bricks: Auto Optimized from Pilot to Production

  • 7:15 PM ➡️ Session #3: Building Intelligent Data Pipelines with Lakeflow and AI

    • Franco Patano, Principal Specialist Solutions Architect, Databricks

  • 7:55 PM: Closing Remarks

  • 8:00 PM: Networking Reception

  • 9:00 PM: Good night


SESSION DESCRIPTIONS

  • Achieving the Lakehouse Vision: Format Interoperability with Delta Lake UniForm and Unified Governance with Unity Catalog: This technical session demonstrates how Databricks' Unity Catalog simplifies data integration and governance across the lakehouse architecture by removing silos between table formats and compute engines. Learn how Unity Catalog enables seamless connectivity to diverse data sources and sinks, providing full format interoperability, while empowering data engineers to implement robust governance and security measures. We will delve into leveraging open standards for data management, implementing fine-grained access controls, utilizing advanced lineage tracking, and establishing secure collaboration workflows across catalogs and compute engines. Discover how unified governance leads to a successful AI strategy by unlocking the value of your data.

  • Agent Bricks: Auto Optimized from Pilot to Production: Agent Bricks is a new way to build AI agents – domain- and task-specific, auto-optimized for quality, cost, and production-readiness. Developed by Mosaic AI Research, it removes the heavy lifting of tuning and evaluation so you can quickly go from prototype to production. We’ll walk through four types of agents: Information Extraction, Knowledge Assistant, Multi-Agent Supervisors, and Custom LLM Agents. Expect a live build and a look at how Agent Bricks makes deploying agents fast, practical, and low-effort. Worth checking out if you’re thinking about getting real-world value from agents without all the trial-and-error.

  • Building Intelligent Data Pipelines with Lakeflow and AI: Transform your data engineering workflow with AI-native pipeline development on the Databricks Data Intelligence Platform. Discover how artificial intelligence is revolutionizing data pipeline creation, enabling both technical and non-technical users to build sophisticated ETL processes through natural language interactions and intelligent automation. In this hands-on session, we’ll demonstrate Lakeflow Designer’s drag-and-drop interface, explore the new AI-assisted IDE for data engineering, and showcase real-world examples of organizations processing gigabytes of data per minute with minimal manual intervention. Whether you’re a seasoned data engineer looking to supercharge productivity or a business analyst building your first pipeline, you’ll learn how AI makes enterprise-grade data engineering accessible to everyone while maintaining production reliability and governance through Unity Catalog integration.



Location
1 E Wacker Dr
Chicago, IL 60601, USA