
Learn the basics of Apache Airflow and how to run it with Google Cloud Composer, a fully managed orchestration service to run workflows that can span multiple cloud providers and on-premises data centers.

At DoiT International, we are committed to helping customers on their journey through the cloud. One common challenge our customers face is needing to use a particular cloud service without prior experience, and facing the prospect of a steep learning curve.
Google Cloud Composer is one of those services with a bit of a steep learning curve.
Cloud Composer is a fully managed orchestration service for running workflows that span multiple cloud providers and on-premises data centers. It is built on Apache Airflow, and workflows are authored in Python.
If you start from scratch, you will likely spend a lot of time switching between the docs for Cloud Composer and Apache Airflow.
To make it easier to get started with Cloud Composer, I have written a mini-book that provides a short course on the basics of Apache Airflow and how to run it with Cloud Composer on Google Cloud Platform.
Get the mini-book
You can read my mini-book here:
📘 Getting started with Apache Airflow on Cloud Composer
The mini-book covers:
- Apache Airflow and its concepts
- Understanding Google Cloud Composer
- Setting up Airflow
- Building Airflow pipelines
- Writing DAGs
- Writing custom plugins
- Testing Airflow
- Continuous Integration (CI) and Continuous Delivery (CD)
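To give a flavor of the "Writing DAGs" topic, here is a minimal sketch of an Airflow 2.x DAG with a single Python task. The DAG id, schedule, and task are my own illustrative choices and are not taken from the mini-book.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.x); names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    # Trivial task callable; replace with real work.
    print("Hello from Cloud Composer!")


with DAG(
    dag_id="hello_composer",        # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",     # run once per day
    catchup=False,                  # do not backfill past runs
) as dag:
    hello = PythonOperator(
        task_id="say_hello",
        python_callable=say_hello,
    )
```

On Cloud Composer, dropping a file like this into the environment's DAGs bucket is enough for the scheduler to pick it up.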
The mini-book is open source (MIT licensed) and available on GitHub. Contributions are more than welcome: please feel free to open a pull request or a new issue with any suggestions, comments, or questions you have.
I hope this mini-book helps you on your journey with Cloud Composer and the cloud.
Stay tuned for future updates!