airflow and argo-workflows
These are competitors offering different approaches to workflow orchestration: Airflow is a Python-based DAG scheduler that runs on a variety of infrastructure, while Argo Workflows is a Kubernetes-native engine that runs each workflow task as a container directly on a K8s cluster.
About airflow
apache/airflow
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Defines workflows as directed acyclic graphs (DAGs) using Python code, enabling version control and testing of data pipelines. The scheduler distributes task execution across worker nodes while enforcing dependencies, with a web UI for pipeline visualization and monitoring. Includes 500+ pre-built operators and hooks for integrating with cloud platforms (AWS, GCP, Azure), databases, and data processing frameworks like Spark and Kubernetes.
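The core idea above — tasks form a directed acyclic graph, and the scheduler runs a task only after all of its upstream dependencies have finished — can be sketched in plain Python. This is a minimal stand-in using the standard library's `graphlib`, not Airflow's actual scheduler; the four task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline expressed as {task: set of upstream tasks},
# the same dependency structure an Airflow DAG encodes.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

def run_pipeline(dag):
    """Execute tasks in dependency order, as a scheduler would."""
    order = []
    for task in TopologicalSorter(dag).static_order():
        # A real scheduler would dispatch each task to a worker here;
        # static_order() guarantees upstreams come before dependents.
        order.append(task)
    return order

print(run_pipeline(dag))  # → ['extract', 'transform', 'load', 'notify']
```

In Airflow itself the same structure is declared with operators and the `>>` dependency operator inside a `DAG` context; the scheduler then enforces exactly this ordering across distributed workers.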
About argo-workflows
argoproj/argo-workflows
Workflow Engine for Kubernetes
Implemented as a Kubernetes CRD, it enables declarative workflow definition using DAGs or step sequences with native support for artifact management (S3, GCS, Azure Blob Storage), cron scheduling, and parallel job execution. The project integrates deeply with the Kubernetes ecosystem through SDKs (Python Hera, Java, Go, TypeScript), REST/gRPC APIs, and complementary tools like Argo Events, Kubeflow Pipelines, and Metaflow for ML and data processing pipelines.
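Because the workflow is a Kubernetes CRD, a pipeline is just a declarative manifest submitted to the cluster. A minimal sketch of that shape, built as a Python dict for illustration (task names and the container image are placeholders; in practice this is YAML submitted via `argo submit` or the API):

```python
import json

# Sketch of an Argo Workflow manifest with a DAG template:
# each task runs the "step" container template in its own pod,
# and declares its upstream tasks via "dependencies".
workflow = {
    "apiVersion": "argoproj.io/v1alpha1",
    "kind": "Workflow",
    "metadata": {"generateName": "example-dag-"},
    "spec": {
        "entrypoint": "main",
        "templates": [
            {
                "name": "main",
                "dag": {
                    "tasks": [
                        {"name": "extract", "template": "step"},
                        {"name": "transform", "template": "step",
                         "dependencies": ["extract"]},
                        {"name": "load", "template": "step",
                         "dependencies": ["transform"]},
                    ]
                },
            },
            {
                "name": "step",
                "container": {"image": "alpine:3", "command": ["echo", "ok"]},
            },
        ],
    },
}

print(json.dumps(workflow, indent=2))
```

The same structure can be produced programmatically with the Hera Python SDK mentioned above instead of writing the manifest by hand.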