apache/airflow
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Defines workflows as directed acyclic graphs (DAGs) in Python code, enabling version control and testing of data pipelines. The scheduler distributes task execution across worker nodes while enforcing dependencies, and a web UI provides pipeline visualization and monitoring. Includes 500+ pre-built operators and hooks for integrating with cloud platforms (AWS, GCP, Azure), databases, data processing frameworks such as Spark, and container platforms such as Kubernetes.
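The core idea above — tasks run only after their dependencies complete — can be sketched in plain Python with the standard library's topological sorter. This is an illustration of dependency-ordered execution, not Airflow's actual API; the task names are made up for the example.

```python
# Plain-Python sketch of dependency-ordered task execution, the core
# scheduling idea behind a DAG (illustrative only, not Airflow's API).
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on:
# extract -> transform -> load
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

# static_order() yields tasks so that every task's
# dependencies appear before the task itself.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In Airflow proper, the same chain would be declared with operators and the `>>` dependency syntax inside a `DAG` context, and the scheduler would dispatch each task to a worker once its upstream tasks succeed.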
44,620 stars and 18,613,997 monthly downloads. Used by 3 other packages. Actively maintained with 910 commits in the last 30 days. Available on PyPI.
Stars
44,620
Forks
16,685
Language
Python
License
Apache-2.0
Category
mlops
Last pushed
Mar 13, 2026
Monthly downloads
18,613,997
Commits (30d)
910
Dependencies
2
Reverse dependents
3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/apache/airflow"
Open to everyone — 100 requests/day with no key; a free key raises the limit to 1,000/day.
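The curl call above can also be made from Python with only the standard library. The helper names below (`quality_url`, `fetch_quality`) are hypothetical conveniences, and the shape of the JSON response is not documented here, so the returned dict is left unspecified.

```python
# Sketch of calling the quality API shown above via the stdlib.
# Helper names are hypothetical; the response schema is an assumption.
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the same endpoint path as the curl example.
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    # Network call; anonymous access is limited to 100 requests/day.
    with urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

print(quality_url("mlops", "apache", "airflow"))
```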
Related tools
mlrun/mlrun
MLRun is an open source MLOps platform for quickly building and managing continuous ML...
clearml/clearml
ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data...
argoproj-labs/hera
Hera makes Python code easy to orchestrate on Argo Workflows through native Python integrations....
polyaxon/haupt
Lineage metadata API, artifacts streams, sandbox, API, and spaces for Polyaxon
argoproj/argo-workflows
Workflow Engine for Kubernetes