On September 15, 2026, all Cloud Composer 1 and Cloud Composer 2 version 2.0.x environments will reach their planned end of life, and you will no longer be able to use them. We recommend planning a migration to Cloud Composer 3.
This page explains how to update an environment.
About update operations
When you change parameters of your environment, such as specifying new scaling and performance parameters, or installing custom PyPI packages, your environment updates.
After this operation is completed, the changes become available in your environment.
For a single Cloud Composer environment, you can start only one update operation at a time. You must wait for an update operation to complete before starting another environment operation.
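Because only one update operation can run at a time, automation should wait for the previous operation to finish before starting the next one. The following is a minimal polling sketch in plain Python; the `get_operation_state` callable and the `"DONE"`/`"RUNNING"` state strings are hypothetical stand-ins for whatever operations API you use, not Cloud Composer's actual interface:

```python
import time
from typing import Callable

def wait_until_done(get_operation_state: Callable[[], str],
                    poll_seconds: float = 1.0,
                    timeout_seconds: float = 300.0) -> None:
    """Block until the in-flight environment operation reports 'DONE'."""
    deadline = time.monotonic() + timeout_seconds
    while get_operation_state() != "DONE":
        if time.monotonic() >= deadline:
            raise TimeoutError("update operation did not finish in time")
        time.sleep(poll_seconds)

# Fake operation that finishes after two polls, for illustration only.
states = iter(["RUNNING", "RUNNING", "DONE"])
wait_until_done(lambda: next(states), poll_seconds=0.01)
print("previous operation finished; safe to start the next one")
```

The same wait-then-start pattern applies whether you poll with the API, the `gcloud` CLI, or the console: start the next environment operation only after the current one reports completion.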
How updates affect running Airflow tasks
When you run an update operation, the Airflow schedulers and workers in your environment might require a restart. In this case, all currently running tasks are terminated. After the update operation is completed, Airflow schedules these tasks for a retry, depending on how you configured retries for your DAGs.
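Whether a terminated task is rescheduled follows the retry settings of your DAG. As a rough sketch (plain Python with hypothetical names, not Airflow's internal logic): with `retries=2`, a task gets at most three attempts in total, so a task terminated on its first or second attempt is retried, while one terminated on its third is not.

```python
def reschedules_after_termination(try_number: int, retries: int) -> bool:
    """A task terminated on attempt `try_number` (1-based) is rescheduled
    while the configured number of retries is not yet exhausted."""
    return try_number <= retries

# With retries=2 a task has at most 1 + 2 = 3 attempts.
print(reschedules_after_termination(1, 2))  # True: retries remain
print(reschedules_after_termination(3, 2))  # False: all retries used
```

In practice this means that tasks in DAGs with `retries=0` are not rescheduled after a restart-induced termination.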
The following changes cause Airflow task termination:
Upgrading your environment to a new version.
Adding, changing, or deleting custom PyPI packages.
Changing Cloud Composer environment variables.
Adding or removing Airflow configuration option overrides, or changing their values.
Changing Airflow workers' CPU, memory, or storage.
Reducing the maximum number of Airflow workers, if the new value is lower than the number of currently running workers. For example, if an environment currently runs three workers and the maximum is reduced to two.
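The worker-reduction rule above can be expressed as a small check. This is an illustrative sketch with hypothetical names, not a Cloud Composer API: reducing the maximum worker count terminates tasks only when the new maximum falls below the number of workers currently running.

```python
def max_worker_reduction_terminates_tasks(running_workers: int,
                                          new_max_workers: int) -> bool:
    """True when lowering the maximum below the number of currently
    running workers forces some workers (and their tasks) to shut down."""
    return new_max_workers < running_workers

# Three running workers, maximum reduced to two: running tasks are terminated.
print(max_worker_reduction_terminates_tasks(3, 2))  # True
# Two running workers, maximum reduced to three: no termination.
print(max_worker_reduction_terminates_tasks(2, 3))  # False
```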
The following changes don't cause Airflow task termination:
Creating, updating, or deleting a DAG (not an update operation).
Pausing or unpausing DAGs (not an update operation).
Changing Airflow variables (not an update operation).
Changing Airflow connections (not an update operation).
Enabling or disabling Dataplex Universal Catalog data lineage integration.
Changing the environment's size.
Changing the number of schedulers.
Changing Airflow schedulers' CPU, memory, or storage.
Changing the number of triggerers.
Changing Airflow triggerers' CPU, memory, or storage.
Changing the Airflow web server's CPU, memory, or storage.
Increasing or decreasing the minimum number of workers.
Reducing the maximum number of Airflow workers, as long as the new maximum is not lower than the number of currently running workers. For example, if an environment currently runs two workers and the maximum is reduced to three.
Changing maintenance windows.
Changing scheduled snapshot settings.
Changing environment labels.
Updating with Terraform
If you attempt to change a configuration parameter that cannot be updated, Terraform deletes your environment and creates a new one with the new parameter value. Run terraform plan before terraform apply to see whether Terraform creates a new environment instead of updating it.
Before you begin
Check that your account, your environment's service account, and the Cloud Composer Service Agent account in your project have the required permissions.
The gcloud composer environments update command terminates when the operation is finished. You can use the --async flag to avoid waiting for the operation to complete.
Update environments
For more information about updating your environment, see the documentation pages about specific update operations. For example:
Overriding Airflow configuration options
Setting environment variables
Installing Python dependencies
Scaling environments
Rolling back update changes
In some rare situations, an update operation might be interrupted (for example, because of a timeout), and the requested changes might not be rolled back in all environment components (such as the Airflow web server).
For example, an update operation might be installing or removing additional PyPI modules, redefining or defining a new Airflow or Cloud Composer environment variable, or changing some Airflow-related parameters.
Such a situation might occur if an update operation is triggered while other operations are in progress, for example the Cloud Composer cluster's autoscaling or a maintenance operation.
In such a situation, we recommend repeating the operation.
Duration of update or upgrade operations
The duration of update and upgrade operations is affected by the following factors:
Most update or upgrade operations require restarting Airflow components such as the Airflow schedulers, workers, and web server. After a component is restarted, it must be initialized. During initialization, the Airflow schedulers and workers download the contents of the /dags and /plugins folders from the environment's bucket. Syncing files to the Airflow schedulers and workers isn't instantaneous and depends on the total size and number of all objects in these folders.
We recommend keeping only DAG and plugin files in the /dags and /plugins folders (respectively) and removing all other files. Too much data in the /dags and /plugins folders might slow down the initialization of Airflow components and, in certain cases, might make initialization impossible.
We recommend keeping less than 30 MB of data in the /dags and /plugins folders, and definitely not exceeding 100 MB. For more information, see also Handling a large number of DAGs and plugins.
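Before uploading, you can check a local copy of these folders against the limits above. The following is a minimal sketch; the threshold values come from the recommendation above, and the helper names are illustrative rather than part of any Cloud Composer tooling:

```python
from pathlib import Path

SOFT_LIMIT_BYTES = 30 * 1024 * 1024   # recommended: stay below 30 MB
HARD_LIMIT_BYTES = 100 * 1024 * 1024  # never exceed 100 MB

def folder_size_bytes(folder: Path) -> int:
    """Total size of all files under the folder, e.g. a local copy
    of the /dags or /plugins folder before uploading to the bucket."""
    return sum(p.stat().st_size for p in folder.rglob("*") if p.is_file())

def check_folder(folder: Path) -> str:
    size = folder_size_bytes(folder)
    if size > HARD_LIMIT_BYTES:
        return "over 100 MB: reduce the folder before updating"
    if size > SOFT_LIMIT_BYTES:
        return "over 30 MB: consider cleaning up unused files"
    return "ok"
```

Running such a check in CI before syncing DAGs to the bucket helps keep component initialization, and therefore update operations, fast.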
The size of the Airflow database can significantly increase the duration of upgrade operations. We recommend keeping the Airflow database size in check by configuring a database retention policy.
Last updated on 2025-08-29 UTC.