

How to write data pipelines with Apache Airflow® 3.0
Get started with Airflow 3.0 with Timothy Davis
About the Event
Apache Airflow® is an open-source tool for authoring, scheduling, and monitoring data pipelines in Python. It is the de facto standard for data orchestration, used by 77,000+ organizations worldwide for everything from advanced analytics to production AI + MLOps.
Airflow 3.0 is the biggest release in the project’s history. It includes features such as a new UI that makes Airflow easier to use than ever, stronger security, and greater flexibility to run tasks anywhere, at any time.
In this hands-on webinar, we’ll cover everything you need to know to get started writing data pipelines with Airflow 3.0, including:
Basic Airflow concepts and terminology
How to run Airflow locally with the Astro CLI
How to use new 3.0 features including DAG versioning, backfills, and assets
Key use cases that Airflow 3.0 enables
Tips on upgrade preparedness and the most important breaking changes for existing Airflow users
About the speaker:
Tim is a Developer Advocate with years of experience in infrastructure, operations, DevOps, and data. He has spoken at large global conferences and small community meetups alike, and has published a wide range of written and recorded content across numerous niches of the tech industry.
DataTalks.Club is the place to talk about data. Join our Slack community!
This event is sponsored by Astronomer