This page explains how to transfer DAGs, data, and configuration from your existing Cloud Composer 1, Airflow 1 environments to Cloud Composer 2, Airflow 2.
This migration guide uses the Snapshots feature.
Other migration guides
From | To | Method | Guide |
---|---|---|---|
Cloud Composer 1, Airflow 2 | Cloud Composer 2, Airflow 2 | Side-by-side, using snapshots | Migration guide (snapshots) |
Cloud Composer 1, Airflow 1 | Cloud Composer 2, Airflow 2 | Side-by-side, using snapshots | This guide (snapshots) |
Cloud Composer 1, Airflow 2 | Cloud Composer 2, Airflow 2 | Side-by-side, manual transfer | Manual migration guide |
Cloud Composer 1, Airflow 1 | Cloud Composer 2, Airflow 2 | Side-by-side, manual transfer | Manual migration guide |
Airflow 1 | Airflow 2 | Side-by-side, manual transfer | Manual migration guide |
Before you begin
Snapshots are supported in Cloud Composer 2 version 2.0.9 and later. Cloud Composer 1 supports saving environment snapshots starting from version 1.18.5.
Cloud Composer supports side-by-side migration from Cloud Composer 1 to Cloud Composer 2. It is not possible to upgrade from Cloud Composer 1 to Cloud Composer 2 in-place.
Check the list of differences between Cloud Composer 1 and Cloud Composer 2.
The maximum size of the Airflow database that supports snapshots is 20 GB. If your environment's database is larger than 20 GB, reduce the size of the Airflow database before you save a snapshot.
The total number of objects in the /dags, /plugins, and /data folders in the environment's bucket must be less than 100,000 to create snapshots.

If you use the XCom mechanism to transfer files, make sure that you use it according to Airflow's guidelines. Transferring big files or a large number of files using XCom impacts the Airflow database's performance and can lead to failures when loading snapshots or upgrading your environment. Consider using alternatives such as Cloud Storage to transfer large volumes of data.
Because Cloud Composer 2 uses Airflow 2, the migration includes switching your DAGs and environment configuration to Airflow 2. Check the migration guide from Airflow 1 to Airflow 2 for information about the breaking changes between Airflow 1 and Airflow 2 in Cloud Composer.
In this guide, you combine migration to Airflow 2 and migration to Cloud Composer 2 in one migration procedure. In this way, you do not need to migrate to a Cloud Composer 1 environment with Airflow 2 before migrating to Cloud Composer 2.
Step 1: Upgrade to Airflow 1.10.15
If your environment uses an Airflow version earlier than 1.10.15, upgrade your environment to a Cloud Composer version that uses Airflow 1.10.15 and supports snapshots.
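For example, assuming in-place upgrades are available from your current version, an upgrade command might look like the following sketch. COMPOSER_1_ENV and COMPOSER_1_LOCATION are the name and region of your Cloud Composer 1 environment, and the image version shown is only an example; pick a version that includes Airflow 1.10.15 from the list of supported Cloud Composer versions:

gcloud composer environments update COMPOSER_1_ENV \
  --location COMPOSER_1_LOCATION \
  --image-version composer-1.18.5-airflow-1.10.15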
Step 2: Check compatibility with Airflow 2
To check for potential conflicts with Airflow 2, consult the Upgrading to Airflow 2.0+ guide, in the section about upgrading DAGs.
One common issue that you might encounter is related to incompatible import paths. For more information about solving this compatibility issue, in the Upgrading to Airflow 2.0+ guide, see the section about backport providers.
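One hedged way to surface these issues directly in your environment is Airflow's upgrade check. Assuming that the apache-airflow-upgrade-check PyPI package is installed in your Airflow 1.10.15 environment (for example, through the environment's PyPI packages configuration), you can run the check with gcloud:

gcloud composer environments run COMPOSER_1_ENV \
  --location COMPOSER_1_LOCATION \
  upgrade_check

The command prints a report of DAG code and configuration that needs changes before you move to Airflow 2.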
Step 3: Make sure that your DAGs are ready for Airflow 2
Before transferring DAGs to your Cloud Composer 2 environment, make sure that:
Your DAGs run successfully and there are no remaining compatibility issues.
Your DAGs use correct import statements.
For example, the new import statement for BigQueryCreateDataTransferOperator can look like this:

from airflow.providers.google.cloud.operators.bigquery_dts \
  import BigQueryCreateDataTransferOperator
Your DAGs are upgraded for Airflow 2. These changes are compatible with Airflow 1.10.14 and later versions, so you can apply them before the migration. A quick way to scan your code for legacy import paths is shown after this list.
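As a quick, informal check for legacy import paths, you can search a local copy of your DAG and plugin code for the old airflow.contrib modules. The ./dags and ./plugins paths below are placeholders for wherever you keep that copy:

grep -rn "airflow.contrib" ./dags ./plugins

Any matches point to operators, hooks, or sensors that you need to re-import from the corresponding provider packages.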
Step 4: Pause DAGs in your Cloud Composer 1 environment
To avoid duplicate DAG runs, pause all DAGs in your Cloud Composer 1 environment before saving its snapshot.
You can use any of the following options:
In the Airflow web interface, go to DAGs and pause all DAGs manually.
Use the composer_dags script to pause all DAGs:
python3 composer_dags.py --environment COMPOSER_1_ENV \
  --project PROJECT_ID \
  --location COMPOSER_1_LOCATION \
  --operation pause
Replace:
- COMPOSER_1_ENV with the name of your Cloud Composer 1 environment.
- PROJECT_ID with the Project ID.
- COMPOSER_1_LOCATION with the region where the environment is located.
Step 5: Save the snapshot of your Cloud Composer 1 environment
Console
Create a snapshot of your environment:
In Google Cloud console, go to the Environments page.
In the list of environments, click the name of your Cloud Composer 1 environment. The Environment details page opens.
Click Create snapshot.
In the Create snapshot dialog, click Submit. In this guide, you save the snapshot in the Cloud Composer 1 environment's bucket, but you can select a different location, if you want to.
Wait until Cloud Composer creates the snapshot.
gcloud
Get your Cloud Composer 1 environment's bucket URI:
Run the following command:
gcloud composer environments describe COMPOSER_1_ENV \
  --location COMPOSER_1_LOCATION \
  --format="value(config.dagGcsPrefix)"
Replace:
- COMPOSER_1_ENV with the name of your Cloud Composer 1 environment.
- COMPOSER_1_LOCATION with the region where the environment is located.
In the output, remove the /dags folder. The result is the URI of your Cloud Composer 1 environment's bucket.

For example, change gs://us-central1-example-916807e1-bucket/dags to gs://us-central1-example-916807e1-bucket.
Create a snapshot of your Cloud Composer 1 environment:
gcloud composer environments snapshots save \
  COMPOSER_1_ENV \
  --location COMPOSER_1_LOCATION \
  --snapshot-location "COMPOSER_1_SNAPSHOTS_FOLDER"
Replace:
- COMPOSER_1_ENV with the name of your Cloud Composer 1 environment.
- COMPOSER_1_LOCATION with the region where the Cloud Composer 1 environment is located.
- COMPOSER_1_SNAPSHOTS_FOLDER with the URI of your Cloud Composer 1 environment's bucket. In this guide, you save the snapshot in the Cloud Composer 1 environment's bucket, but you can select a different location if you want to. If you specify a custom location, the service accounts of both environments must have read and write permissions for the specified location.
Step 6: Create a Cloud Composer 2 environment
Create a Cloud Composer 2 environment. You can start with an environment preset that matches your expected resource demands, and later scale and optimize your environment further.
You do not need to specify configuration overrides and environment variables, since you replace them later when you load the snapshot of your Cloud Composer 1 environment.
Some configuration options from Airflow 1 use a different name and section in Airflow 2. For more information, see Configuration changes.
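For example, a minimal creation command might look like the following sketch. The image version and environment size shown here are placeholders; choose a Cloud Composer 2 image version and a preset that matches your expected load:

gcloud composer environments create COMPOSER_2_ENV \
  --location COMPOSER_2_LOCATION \
  --image-version composer-2.0.32-airflow-2.2.5 \
  --environment-size small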
Step 7: Load the snapshot to your Cloud Composer 2 environment
Console
To load the snapshot to your Cloud Composer 2 environment:
In Google Cloud console, go to the Environments page.
In the list of environments, click the name of your Cloud Composer 2 environment. The Environment details page opens.
Click Load snapshot.
In the Load snapshot dialog, click Browse.
Select the folder with the snapshot. If you use the default location for this guide, this folder is located in your Cloud Composer 1 environment's bucket in the /snapshots folder, and its name is the timestamp of the snapshot save operation. For example, us-central1-example-916807e1-bucket/snapshots_example-project_us-central1_example-environment/2022-01-05T18-59-00.

Click Load and wait until Cloud Composer loads the snapshot.
gcloud
Load the snapshot of your Cloud Composer 1 environment to your Cloud Composer 2 environment:
gcloud composer environments snapshots load \
COMPOSER_2_ENV \
--location COMPOSER_2_LOCATION \
--snapshot-path "SNAPSHOT_PATH"
Replace:
- COMPOSER_2_ENV with the name of your Cloud Composer 2 environment.
- COMPOSER_2_LOCATION with the region where the Cloud Composer 2 environment is located.
- SNAPSHOT_PATH with the URI of your Cloud Composer 1 environment's bucket, followed by the path to the snapshot. For example, gs://us-central1-example-916807e1-bucket/snapshots/example-project_us-central1_example-environment_2022-01-05T18-59-00.
Step 8: Unpause DAGs in the Cloud Composer 2 environment
You can use any of the following options:
In the Airflow web interface, go to DAGs and unpause all DAGs manually one by one.
Use the composer_dags script to unpause all DAGs:
python3 composer_dags.py --environment COMPOSER_2_ENV \
  --project PROJECT_ID \
  --location COMPOSER_2_LOCATION \
  --operation unpause
Replace:
- COMPOSER_2_ENV with the name of your Cloud Composer 2 environment.
- PROJECT_ID with the Project ID.
- COMPOSER_2_LOCATION with the region where the environment is located.
(Airflow versions 2.9.1 and later) If there are quota errors while unpausing a large number of DAGs, you can use the following Airflow CLI commands to unpause all DAGs at once:
gcloud composer environments run COMPOSER_2_ENV dags unpause \
  --project PROJECT_ID \
  --location COMPOSER_2_LOCATION \
  -- -y --treat-dag-id-as-regex ".*"
(Airflow versions earlier than 2.9.1) If there are quota errors while unpausing a large number of DAGs, it's possible to unpause DAGs using the Airflow REST API. Also see Trying the API in the Airflow documentation.
Step 9: Check for DAG errors
In the Airflow web interface, go to DAGs and check for reported DAG syntax errors.
Check that DAG runs are scheduled at the correct time.
Wait for the DAG runs to happen in the Cloud Composer 2 environment and check if they were successful. If a DAG run was successful, do not unpause the same DAG in the Cloud Composer 1 environment; otherwise, a DAG run for the same time and date also happens in your Cloud Composer 1 environment.

If a specific DAG run fails, troubleshoot the DAG until it runs successfully in Cloud Composer 2.
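Besides the Airflow web interface, you can inspect DAGs and their recent runs from the command line. The following sketch assumes the Airflow 2 CLI; EXAMPLE_DAG_ID is a placeholder for one of your DAG IDs:

gcloud composer environments run COMPOSER_2_ENV \
  --location COMPOSER_2_LOCATION \
  dags list

gcloud composer environments run COMPOSER_2_ENV \
  --location COMPOSER_2_LOCATION \
  dags list-runs -- -d EXAMPLE_DAG_ID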
Step 10: Monitor your Cloud Composer 2 environment
After you transfer all DAGs and configuration to the Cloud Composer 2 environment, monitor it for potential issues, failed DAG runs, and overall environment health.
If the Cloud Composer 2 environment runs without problems for a sufficient period of time, consider deleting the Cloud Composer 1 environment.
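When you decide to delete the Cloud Composer 1 environment, you can do so in the Google Cloud console or with a command similar to the following. The environment's bucket, where you saved the snapshot in this guide, is not deleted together with the environment:

gcloud composer environments delete COMPOSER_1_ENV \
  --location COMPOSER_1_LOCATION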
What's next
- Troubleshooting DAGs
- Troubleshooting environment creation
- Troubleshooting environment updates
- Using backport packages