This page describes different ways to trigger DAGs in Cloud Composer environments.
Airflow provides the following ways to trigger a DAG:
- Trigger on a schedule. When you create a DAG, you specify a schedule for it. Airflow triggers the DAG automatically based on the specified scheduling parameters.
- Trigger manually. You can trigger a DAG manually from the Google Cloud console, from the Airflow UI, or by running an Airflow CLI command from gcloud.
- Trigger in response to events. The standard way to trigger a DAG in response to events is to use a sensor, as shown in the sketch after this list.
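For example, the following minimal sketch waits for an object to appear in a Cloud Storage bucket before letting a downstream task run. It assumes the Google provider package (preinstalled in Cloud Composer images); the bucket and object names are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

with DAG(
    dag_id='example_dag_sensor',
    start_date=datetime(2024, 4, 5),
    schedule_interval='@daily') as dag:

    # Wait until the object exists in the bucket.
    wait_for_file = GCSObjectExistenceSensor(
        task_id='wait_for_file',
        bucket='example-bucket',  # hypothetical bucket name
        object='data/input.csv')  # hypothetical object path

    # Runs only after the sensor succeeds.
    process = BashOperator(
        task_id='process',
        bash_command='echo "file arrived"')

    wait_for_file >> process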
Other ways to trigger DAGs:
- Trigger programmatically. You can trigger a DAG using the Airflow REST API, for example, from a Python script, as shown in the sketch after this list.
- Trigger programmatically in response to events. You can trigger DAGs in response to events by using Cloud Run functions and the Airflow REST API.
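The following minimal sketch triggers a DAG run through the Airflow 2 stable REST API (POST /api/v1/dags/{dag_id}/dagRuns) with an authenticated session. It assumes the google-auth library and an identity that is allowed to access the Airflow web server; the web server URL and DAG ID are placeholders.

import google.auth
from google.auth.transport.requests import AuthorizedSession

# Placeholders: use your environment's Airflow web server URL and your DAG ID.
WEB_SERVER_URL = 'https://example-dot-us-central1.composer.googleusercontent.com'
DAG_ID = 'example_dag_schedule'

# Obtain Application Default Credentials and wrap them in a session
# that attaches the access token to every request.
credentials, _ = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
session = AuthorizedSession(credentials)

# Create a new DAG run; an empty conf object triggers the DAG as-is.
response = session.post(
    f'{WEB_SERVER_URL}/api/v1/dags/{DAG_ID}/dagRuns',
    json={'conf': {}})
response.raise_for_status()
print(response.json())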
Trigger a DAG on a schedule
To trigger a DAG on a schedule:
- Specify the start_date and schedule_interval parameters in the DAG file, as described later in this section.
- Upload the DAG file to your environment.
Specify scheduling parameters
When you define a DAG, in the schedule_interval parameter, you specify how often you want to run the DAG. In the start_date parameter, you specify when you want Airflow to start scheduling your DAG. Tasks in your DAG can have individual start dates, or you can specify a single start date for all tasks. Based on the minimum start date for tasks in your DAG and on the schedule interval, Airflow schedules DAG runs.
Scheduling works in the following way. After the start_date passes, Airflow waits for the next occurrence of schedule_interval. Then it schedules the first DAG run to happen at the end of this schedule interval. For example, if a DAG is scheduled to run every hour (schedule_interval is 1 hour) and the start date is at 12:00 today, the first DAG run happens at 13:00 today.
The following example shows a DAG that runs every hour starting from 15:00 on April 5, 2024. With the parameters used in the example, Airflow schedules the first DAG run to happen at 16:00 on April 5, 2024.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id='example_dag_schedule',
    # At 15:00 on 5 April, 2024
    start_date=datetime(2024, 4, 5, 15, 0),
    # At minute 0 of every hour
    schedule_interval='0 * * * *') as dag:

    # Output the current date and time
    t1 = BashOperator(
        task_id='date',
        bash_command='date')
For more information about the scheduling parameters, see DAG Runs in the Airflow documentation.
More scheduling parameter examples
The following examples illustrate how scheduling works with different combinations of parameters:
- If start_date is datetime(2024, 4, 4, 16, 25) and schedule_interval is 30 16 * * *, then the first DAG run happens at 16:30 on 5 April, 2024.
- If start_date is datetime(2024, 4, 4, 16, 35) and schedule_interval is 30 16 * * *, then the first DAG run happens at 16:30 on 6 April, 2024. Because the start date falls after 16:30 on 4 April, 2024, the first full schedule interval begins at 16:30 on 5 April, 2024 and ends at 16:30 on 6 April, 2024, so the first DAG run is scheduled at the end of that interval.
- If start_date is datetime(2024, 4, 4) and schedule_interval is @daily, then the first DAG run is scheduled for 00:00 on 5 April, 2024.
- If start_date is datetime(2024, 4, 4, 16, 30) and schedule_interval is 0 * * * *, then the first DAG run is scheduled for 18:00 on 4 April, 2024. After the specified date and time passes, Airflow schedules a DAG run to happen at minute 0 of every hour. The nearest point in time when this happens is 17:00. At this time, Airflow schedules a DAG run to happen at the end of the schedule interval, that is, at 18:00.
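To reproduce the interval arithmetic from the second example, the following minimal sketch uses croniter, a scheduling library that Airflow itself depends on. The values in the comments follow from the cron expression; the sketch illustrates the rule that a DAG run executes at the end of its schedule interval, without running Airflow itself.

from datetime import datetime

from croniter import croniter

start_date = datetime(2024, 4, 4, 16, 35)
cron = croniter('30 16 * * *', start_date)

# First occurrence of the cron expression after the start date;
# this is where the first full schedule interval begins.
interval_start = cron.get_next(datetime)  # 2024-04-05 16:30
# The interval ends at the next occurrence, which is when
# Airflow runs the DAG for the first time.
interval_end = cron.get_next(datetime)    # 2024-04-06 16:30

print(interval_start, interval_end)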
Trigger a DAG manually
When you trigger a DAG manually, Airflow performs a DAG run. For example, if you have a DAG that already runs on a schedule, and you trigger this DAG manually, then Airflow executes your DAG once, independently of the schedule specified for the DAG.
Console
DAG UI is supported in Cloud Composer 2.0.1 and later versions.
To trigger a DAG from the Google Cloud console:
- In the Google Cloud console, go to the Environments page.
- Select an environment to view its details.
- On the Environment details page, go to the DAGs tab.
- Click the name of a DAG.
- On the DAG details page, click Trigger DAG. A new DAG run is created.
Airflow UI
To trigger a DAG from the Airflow web interface:
- In the Google Cloud console, go to the Environments page.
- In the Airflow webserver column, follow the Airflow link for your environment.
- Log in with the Google account that has the appropriate permissions.
- In the Airflow web interface, on the DAGs page, in the Links column for your DAG, click the Trigger Dag button.
- (Optional) Specify the DAG run configuration.
- Click Trigger.
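The DAG run configuration is a JSON object that Airflow attaches to the run. As a minimal sketch (the my_param key is hypothetical), a task can read the configuration from the dag_run object in its context, so a trigger configuration such as {"my_param": "hello"} becomes visible to the task:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def print_conf(**context):
    # Read the configuration passed in the Trigger dialog.
    conf = context['dag_run'].conf or {}
    print(conf.get('my_param', 'no value passed'))


with DAG(
    dag_id='example_dag_conf',
    start_date=datetime(2024, 4, 5),
    # Run only when triggered manually.
    schedule_interval=None) as dag:

    t1 = PythonOperator(
        task_id='print_conf',
        python_callable=print_conf)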
gcloud
Run the dags trigger Airflow CLI command:
gcloud composer environments run ENVIRONMENT_NAME \
--location LOCATION \
dags trigger -- DAG_ID
Replace:
- ENVIRONMENT_NAME with the name of the environment.
- LOCATION with the region where the environment is located.
- DAG_ID with the name of the DAG.
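For example, for a hypothetical environment named example-environment in the us-central1 region, the command that triggers the DAG from the earlier example looks like the following:

gcloud composer environments run example-environment \
    --location us-central1 \
    dags trigger -- example_dag_schedule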
For more information about running Airflow CLI commands in Cloud Composer environments, see Running Airflow CLI commands.
For more information about the available Airflow CLI commands, see the gcloud composer environments run command reference.