The scenario is a simple Airflow DAG: it starts a cluster, submits a job, and terminates the cluster. The problem is that the cluster must always be terminated, even when a previous task failed. If I set TriggerRule.ALL_DONE on the terminate task, the cluster is guaranteed to be terminated, but the DAG run is then marked as successful. If I use a different trigger rule, the DAG fails, but the cluster keeps running.

So is there a way to tell Airflow: execute this task regardless of whether previous tasks failed or not, but still mark the DAG run as failed?
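For reference, here is a minimal sketch of the setup described above. The DAG id, the placeholder `PythonOperator` callables, and the `schedule=None` argument (Airflow 2.4+) are my own assumptions; the real tasks would use cluster operators instead:

```python
# Hypothetical sketch of the DAG in question; placeholder callables
# stand in for the real start/submit/terminate cluster operators.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.trigger_rule import TriggerRule

with DAG(
    dag_id="cluster_job",          # assumed name
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    start_cluster = PythonOperator(
        task_id="start_cluster", python_callable=lambda: None
    )
    submit_job = PythonOperator(
        task_id="submit_job", python_callable=lambda: None
    )
    # ALL_DONE guarantees this task runs even if submit_job fails,
    # but if it succeeds, the whole DAG run is marked successful.
    terminate_cluster = PythonOperator(
        task_id="terminate_cluster",
        python_callable=lambda: None,
        trigger_rule=TriggerRule.ALL_DONE,
    )

    start_cluster >> submit_job >> terminate_cluster
```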
