Airflow can be configured to send metrics to a StatsD server. Enable it in the `[scheduler]` section of `airflow.cfg`:

```
[scheduler]
statsd_on = True
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow
```
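With this configuration, Airflow emits plain StatsD datagrams over UDP. As a sketch of what arrives on the wire, the snippet below parses the StatsD line protocol (`<name>:<value>|<type>`, where `c` is a counter, `g` a gauge, and `ms` a timer); the sample packet name assumes the `airflow` prefix configured above, and `parse_statsd_packet` / `listen_once` are illustrative helpers, not part of Airflow or any StatsD library.

```python
import socket

def parse_statsd_packet(packet: str):
    """Split a StatsD line like 'airflow.ti_successes:1|c' into
    (metric name, value, metric type)."""
    name, rest = packet.split(":", 1)
    value, metric_type = rest.split("|")[0], rest.split("|")[1]
    return name, float(value), metric_type

def listen_once(host="localhost", port=8125):
    """Receive and parse a single StatsD datagram, e.g. while the
    scheduler is running and heartbeating."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind((host, port))
        data, _ = sock.recvfrom(1024)
        return parse_statsd_packet(data.decode())

# A scheduler heartbeat counter, as emitted with the "airflow" prefix:
print(parse_statsd_packet("airflow.scheduler_heartbeat:1|c"))
```

Pointing a tool such as `statsd_exporter` or a Graphite/StatsD daemon at the same port is the usual way to collect these in practice.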
The following metrics are emitted as counters:

| Name | Description |
|------|-------------|
| `<job_name>_start` | Number of started `<job_name>` jobs, e.g. `SchedulerJob`, `LocalTaskJob` |
| `<job_name>_end` | Number of ended `<job_name>` jobs, e.g. `SchedulerJob`, `LocalTaskJob` |
| `operator_failures_<operator_name>` | Operator `<operator_name>` failures |
| `operator_successes_<operator_name>` | Operator `<operator_name>` successes |
| `ti_failures` | Overall task instance failures |
| `ti_successes` | Overall task instance successes |
| `zombies_killed` | Zombie tasks killed |
| `scheduler_heartbeat` | Scheduler heartbeats |
The following metrics are emitted as gauges:

| Name | Description |
|------|-------------|
| `collect_dags` | Seconds taken to scan and import DAGs |
| `dagbag_import_errors` | DAG import errors |
| `dagbag_size` | DAG bag size |
The following metrics are emitted as timers:

| Name | Description |
|------|-------------|
| `dagrun.dependency-check.<dag_id>` | Seconds taken to check DAG dependencies |
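Note that the names in the tables are templates: placeholders such as `<operator_name>` and `<dag_id>` are filled in per operator or DAG, and the configured `statsd_prefix` is prepended to every metric. The helper below is a hypothetical illustration of that expansion (it is not part of Airflow), assuming the `airflow` prefix from the configuration above.

```python
# Hypothetical helper: expand a metric-name template from the tables above
# into the full name seen by the StatsD server. Airflow joins the configured
# statsd_prefix and the metric name with a dot.
PREFIX = "airflow"

def full_metric_name(template: str, **params) -> str:
    """Fill placeholders like <operator_name> or <dag_id> and prepend
    the StatsD prefix."""
    name = template
    for key, value in params.items():
        name = name.replace(f"<{key}>", value)
    return f"{PREFIX}.{name}"

print(full_metric_name("operator_failures_<operator_name>",
                       operator_name="BashOperator"))
# airflow.operator_failures_BashOperator
print(full_metric_name("dagrun.dependency-check.<dag_id>", dag_id="example_dag"))
# airflow.dagrun.dependency-check.example_dag
```

These expanded names are what you would match against when building dashboards or alerts on top of the StatsD backend.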