
Add trigger_rule to email_user task in data exports selection DAGs #1672

@shelleydoljack

Description


The default trigger_rule in Airflow is all_success. During a DAG run on stage, the email_user task failed, and I believe this caused the fetch_marc_records_from_folio task to fail as well. This was in the log file for that task:

[2025-11-25, 22:01:42 UTC] {local_task_job_runner.py:346} WARNING - State of this instance has been externally set to failed. Terminating instance.
[2025-11-25, 22:01:42 UTC] {local_task_job_runner.py:245} ▲▲▲ Log group end
[2025-11-25, 22:01:42 UTC] {process_utils.py:132} INFO - Sending 15 to group 3800. PIDs of all processes in the group: [3800]
[2025-11-25, 22:01:42 UTC] {process_utils.py:87} INFO - Sending the signal 15 to group 3800
[2025-11-25, 22:01:42 UTC] {taskinstance.py:3117} ERROR - Received SIGTERM. Terminating subprocesses.
[2025-11-25, 22:01:42 UTC] {taskinstance.py:3118} ERROR - Stacktrace: 
  File "/home/airflow/.local/bin/airflow", line 8, in <module>
    sys.exit(main())

Maybe the email_user task is considered an upstream task of fetch_marc_records_from_folio? Or, because the trigger_rule for that task is "all_success", does its failure cause downstream tasks to fail too?

Airflow trigger rule docs
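To illustrate why the default rule propagates failure, here is a minimal pure-Python sketch of how all_success and all_done evaluate upstream states. This is a hypothetical illustration (the function name should_run and the state strings are assumptions), not Airflow's actual scheduler logic, which lives in its trigger-rule dependency checks:

```python
# Hypothetical sketch of trigger-rule evaluation; Airflow's real
# implementation is more involved (it also handles upstream_failed,
# skipped propagation, etc.).
def should_run(trigger_rule: str, upstream_states: list[str]) -> bool:
    if trigger_rule == "all_success":
        # Runs only if every upstream task succeeded; one failure
        # blocks the task (it would be marked upstream_failed).
        return all(s == "success" for s in upstream_states)
    if trigger_rule == "all_done":
        # Runs once every upstream task has finished, regardless of outcome.
        return all(s in ("success", "failed", "skipped") for s in upstream_states)
    raise ValueError(f"unsupported rule: {trigger_rule}")

# With the default all_success, a failed email_user-style upstream
# blocks the downstream task:
print(should_run("all_success", ["success", "failed"]))  # False
# With all_done, the downstream task still runs:
print(should_run("all_done", ["success", "failed"]))  # True
```

So if email_user is in fact upstream of fetch_marc_records_from_folio, setting trigger_rule="all_done" (or decoupling the email from the fetch path) would let the fetch proceed even when the email fails.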
