Manage version upgrades for instances and pipelines
This page describes how to upgrade the version of your Cloud Data Fusion instances or batch pipelines.

Upgrade your Cloud Data Fusion instances and batch pipelines to the latest platform and plugin versions to get the latest features, bug fixes, and performance improvements.
Before you begin

Caution: Before you upgrade, stop all running pipelines, suspend all pipeline schedules, and disable all upstream triggers, such as Cloud Composer triggers. Upgrading an instance that has running pipelines can have unpredictable results and affect instance availability.

Plan a scheduled downtime for the upgrade. The process takes up to an hour.
In the Google Cloud console, activate Cloud Shell.
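Optionally, before you schedule the downtime window, you can confirm which version an instance is currently running. The following is a minimal sketch using the gcloud CLI; the instance ID and location are placeholders:

    # Show the instance's current version and state before planning the upgrade.
    gcloud beta data-fusion instances describe INSTANCE_ID \
        --location=LOCATION_NAME \
        --format="value(version,state)"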
Limitations

After you create a Cloud Data Fusion instance, you can't change its edition, even through an upgrade operation.

Don't trigger an upgrade with Terraform, because Terraform deletes and recreates the instance instead of performing an in-place upgrade. This results in the loss of any existing data within the instance.

Cloud Data Fusion doesn't restart pipelines that stop as a result of the upgrade operation.

When you upgrade an instance from a version earlier than 6.11.0, expect longer downtime for the upgrade, especially if the instance handles a lot of data.

Upgrading real-time pipelines isn't supported, except for pipelines created in version 6.8.0 with a Kafka real-time source. For a workaround, see Upgrade real-time pipelines.
Upgrade Cloud Data Fusion instances

To upgrade a Cloud Data Fusion instance to a new Cloud Data Fusion version, go to the Instance details page:

1. In the Google Cloud console, go to the Cloud Data Fusion page.

2. Click Instances, and then click the instance's name to open the Instance details page.

Then perform the upgrade using either the Google Cloud console or the gcloud CLI.

Console

1. Click Upgrade for a list of available versions.

2. Select a version, and then click Upgrade.

3. Verify that the upgrade succeeded: refresh the Instance details page, click View instance to open the upgraded instance in the Cloud Data Fusion web interface, and then click System admin in the menu bar. The new version number appears at the top of the page.

gcloud

1. To upgrade to a new Cloud Data Fusion version, run the following gcloud CLI command from a local terminal or a Cloud Shell session:

    gcloud beta data-fusion instances update INSTANCE_ID \
        --project=PROJECT_ID \
        --location=LOCATION_NAME \
        --version=AVAILABLE_INSTANCE_VERSION

    Optional: if applicable for your instance, add the --enable_stackdriver_logging, --enable_stackdriver_monitoring, and --labels flags. You can also pass CDAP properties, such as enable.unrecoverable.reset, with --options.

2. Verify that the upgrade succeeded in the web interface, as described in the Console steps.

To prevent your pipelines from getting stuck when you run them in the new version, grant the required roles in your upgraded instance.
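If you script the upgrade, you may want to wait for it to finish before continuing. The following is a hypothetical wrapper, assuming the instance's state field returns to ACTIVE once the upgrade completes:

    # Trigger the upgrade, then poll until the instance reports ACTIVE again.
    gcloud beta data-fusion instances update INSTANCE_ID \
        --location=LOCATION_NAME --version=AVAILABLE_INSTANCE_VERSION

    until [ "$(gcloud beta data-fusion instances describe INSTANCE_ID \
        --location=LOCATION_NAME --format='value(state)')" = "ACTIVE" ]; do
      echo "Upgrade still in progress..."
      sleep 60
    done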
Upgrade batch pipelines

To upgrade your Cloud Data Fusion batch pipelines to use the latest plugin versions, first set the environment variables used by the commands on this page, such as CDAP_ENDPOINT.

Recommended: back up all pipelines before you upgrade. You can back up pipelines in one of two ways:
Download a zip file by following these steps:

1. To trigger a zip file download, back up all pipelines with the following command:

    echo $CDAP_ENDPOINT/v3/export/apps

2. Copy the output URL into your browser.

3. Extract the downloaded file, then confirm that all pipelines were exported. The pipelines are organized by namespace. (For a command-line alternative to the browser download, see the sketch after this list.)
Back up pipelines using Source Control Management (SCM), available in version 6.9 and later. SCM provides GitHub integration, which you can use to back up pipelines.
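As a command-line alternative to pasting the export URL into a browser, the following minimal sketch downloads and inspects the backup archive directly. It assumes the same $CDAP_ENDPOINT variable and gcloud credentials as the other commands on this page, and that the endpoint serves the same archive to an authenticated request:

    # Download the export archive with an authenticated request instead of a browser.
    curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        "${CDAP_ENDPOINT}/v3/export/apps" -o pipelines_backup.zip

    # List the archive contents; pipelines are grouped by namespace.
    unzip -l pipelines_backup.zip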
Upgrade pipelines by following these steps:

1. Create a variable that points to the pipeline_upgrade.json file that you create in the next step to store the list of pipelines:

    export PIPELINE_LIST=PATH/pipeline_upgrade.json

    Replace PATH with the path to the file.
2. Create a list of all pipelines for an instance and namespace using the following command. The result is stored in the $PIPELINE_LIST file in JSON format. You can edit the list to remove pipelines that don't need an upgrade (a hedged filtering sketch follows these steps).

    curl -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json" ${CDAP_ENDPOINT}/v3/namespaces/NAMESPACE_ID/apps -o $PIPELINE_LIST

    Replace NAMESPACE_ID with the namespace where you want the upgrade to happen.
3. Upgrade the pipelines listed in pipeline_upgrade.json. Insert the NAMESPACE_ID of the pipelines to be upgraded. The command displays the list of upgraded pipelines with their upgrade status (a status-filtering sketch also follows these steps).

    curl -N -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json" ${CDAP_ENDPOINT}/v3/namespaces/NAMESPACE_ID/upgrade --data @$PIPELINE_LIST

    Replace NAMESPACE_ID with the namespace ID of the pipelines that are being upgraded.
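One hedged way to trim the pipeline list from step 2 is with jq. This sketch assumes each entry in the response carries an artifact.version field; verify the field names against your own $PIPELINE_LIST before relying on it:

    # Keep only pipelines that are not already on the target artifact version.
    # "6.10.1" is a placeholder; substitute the version you are upgrading to.
    jq '[ .[] | select(.artifact.version != "6.10.1") ]' "$PIPELINE_LIST" > filtered.json
    mv filtered.json "$PIPELINE_LIST"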
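To review the upgrade status afterwards, you can save the response from step 3 to a file and filter it. The statusCode field name below is an assumption; inspect the raw response from your instance first:

    # Save the upgrade response, then surface entries that did not report success.
    curl -s -N -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -H "Content-Type: application/json" \
        ${CDAP_ENDPOINT}/v3/namespaces/NAMESPACE_ID/upgrade --data @$PIPELINE_LIST \
        -o upgrade_status.json
    jq '.[] | select(.statusCode != 200)' upgrade_status.json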
To prevent your pipelines from getting stuck when you run them in the new version, grant the required roles in your upgraded instance (see Grant roles for upgraded instances).
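For reference, here is a hypothetical gcloud sketch of granting those roles to the Dataproc service account. It assumes your instance uses the default Compute Engine service account; substitute your own project ID, project number, or custom service account:

    # Grant the Cloud Data Fusion Runner and Cloud Storage Admin roles to the
    # service account that Dataproc uses (default Compute Engine SA assumed).
    PROJECT_ID=my-project            # placeholder
    PROJECT_NUMBER=123456789012      # placeholder
    SA="${PROJECT_NUMBER}-compute@developer.gserviceaccount.com"

    gcloud projects add-iam-policy-binding "$PROJECT_ID" \
        --member="serviceAccount:${SA}" --role="roles/datafusion.runner"
    gcloud projects add-iam-policy-binding "$PROJECT_ID" \
        --member="serviceAccount:${SA}" --role="roles/storage.admin"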
Upgrade real-time pipelines

Upgrading real-time pipelines isn't supported, except for pipelines created in version 6.8.0 with a Kafka real-time source.

For all other real-time pipelines, do the following instead (a hedged sketch of this round trip follows the list):

1. Stop and export the pipelines.
2. Upgrade the instance.
3. Import the real-time pipelines into your upgraded instance.
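The endpoints below follow the CDAP REST conventions used elsewhere on this page, but the pipeline name, the response shape, and the reshaping step are assumptions; treat this as a sketch to adapt, not a verified procedure:

    AUTH="Authorization: Bearer $(gcloud auth print-access-token)"
    NS=default                     # placeholder namespace
    APP=my_realtime_pipeline       # hypothetical pipeline name

    # 1. Before the upgrade: save the pipeline's app detail.
    curl -H "$AUTH" "${CDAP_ENDPOINT}/v3/namespaces/${NS}/apps/${APP}" -o "${APP}.json"

    # The deploy call expects only the artifact and configuration, so reshape the
    # saved detail (field names are assumptions -- verify against your response).
    jq '{artifact: .artifact, config: (.configuration | fromjson)}' \
        "${APP}.json" > "${APP}_deploy.json"

    # 3. After the upgrade: redeploy the pipeline from the saved configuration.
    curl -X PUT -H "$AUTH" -H "Content-Type: application/json" \
        "${CDAP_ENDPOINT}/v3/namespaces/${NS}/apps/${APP}" --data @"${APP}_deploy.json"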
Upgrade to enable Replication

Replication can be enabled in Cloud Data Fusion environments in version 6.3.0 or later. If you have version 6.2.3, first upgrade to 6.3.0, and then upgrade to the latest version. You can then enable Replication.
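As an illustration of the two-step path, assuming the gcloud upgrade command shown earlier on this page:

    # Hypothetical two-step path for a 6.2.3 instance (version strings are
    # placeholders; wait for the instance to return to ACTIVE between steps).
    gcloud beta data-fusion instances update INSTANCE_ID \
        --location=LOCATION_NAME --version=6.3.0
    # ...wait for ACTIVE (see the polling sketch earlier on this page), then:
    gcloud beta data-fusion instances update INSTANCE_ID \
        --location=LOCATION_NAME --version=LATEST_AVAILABLE_VERSION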
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-09-04 UTC."],[[["\u003cp\u003eThis guide explains how to upgrade Cloud Data Fusion instances and batch pipelines to the latest versions for improved features, bug fixes, and performance.\u003c/p\u003e\n"],["\u003cp\u003eBefore initiating an upgrade, it is mandatory to stop all running pipelines, suspend scheduled pipelines, and disable upstream triggers to avoid unpredictable issues and ensure instance availability.\u003c/p\u003e\n"],["\u003cp\u003eUpgrading a Cloud Data Fusion instance involves selecting a new version through the Google Cloud console or gcloud CLI, followed by verification of the successful upgrade within the instance's web interface.\u003c/p\u003e\n"],["\u003cp\u003eBatch pipeline upgrades require backing up pipelines, either by downloading a zip file or using Source Control Management, followed by using a specific command-line process to upgrade the pipelines listed in a JSON file.\u003c/p\u003e\n"],["\u003cp\u003eUpgrading real-time pipelines is generally not supported, but for real-time pipelines that were created with Kafka in version 6.8.0 it is, and a workaround for other real-time pipelines is to export the pipeline, upgrade the instance, then import the real-time pipeline into the new instance.\u003c/p\u003e\n"]]],[],null,["# Manage version upgrades for instances and pipelines\n\nThis page describes upgrading the version of your instances or batch\npipelines.\n\nUpgrade your Cloud Data Fusion instances and batch pipelines to the latest\nplatform and plugin versions for the latest features, bug fixes, and performance\nimprovements.\n\nBefore you begin\n----------------\n\n| **Caution:** Before you upgrade, stop all running pipelines, suspend all pipeline schedules and disable all upstream triggers, such as Cloud Composer triggers. Upgrading an instance that has running pipelines can have unpredictable results and affect instance availability.\n\n- **Plan a scheduled downtime for the upgrade.** The process takes up to an hour.\n- In the Google Cloud console, activate Cloud Shell.\n\n [Activate Cloud Shell](https://console.cloud.google.com/?cloudshell=true)\n\n\u003cbr /\u003e\n\nLimitations\n-----------\n\n- After you create a Cloud Data Fusion instance, you cannot change its\n edition, even through an upgrade operation.\n\n- Don't trigger an upgrade with Terraform, as it deletes and recreates the\n instance, instead of performing an in-place upgrade. This issue results\n in the loss of any existing data within the instance.\n\n- Cloud Data Fusion doesn't restart pipelines that stop as a result of\n the upgrade operation.\n\n- When you upgrade an instance from versions prior to 6.11.0, expect greater\n downtime for the upgrade, especially if the instance handles a lot of data.\n\n- Upgrading real-time pipelines isn't supported, except in pipelines created\n in version 6.8.0 with a Kafka real-time source. 
For a workaround, see\n [Upgrade real-time pipelines](#upgrade-real-time-pipelines).\n\nUpgrade Cloud Data Fusion instances\n-----------------------------------\n\nTo upgrade a Cloud Data Fusion instance to a new Cloud Data Fusion\nversion, go to the **Instance details** page:\n\n1. In the Google Cloud console, go to the Cloud Data Fusion page.\n\n2. Click **Instances** , and then click the instance's name to go to the\n **Instance details** page.\n\n [Go to Instances](https://console.cloud.google.com/data-fusion/locations/-/instances)\n\nThen perform the upgrade using either the Google Cloud console or\ngcloud CLI: \n\n### Console\n\n1. Click **Upgrade** for a list of available versions.\n\n2. Select a version.\n\n3. Click **Upgrade**.\n\n4. Verify that the upgrade was successful:\n\n 1. Refresh the **Instance details** page.\n\n 2. Click **View instance** to access the upgraded instance in the\n Cloud Data Fusion web interface.\n\n 3. Click **System admin** in the menu bar.\n\n The new version number appears at the top of the page.\n5. To prevent your pipelines from getting stuck when you run them in the\n new version, [grant the required roles](#grant-roles) in your upgraded\n instance.\n\n### gcloud\n\n1. To upgrade to a new Cloud Data Fusion version, run the following\n gcloud CLI command from a local terminal\n [Cloud Shell](https://console.cloud.google.com/?cloudshell=true) session:\n\n gcloud beta data-fusion instances update \u003cvar translate=\"no\"\u003eINSTANCE_ID\u003c/var\u003e \\\n --project=\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e \\\n --location=\u003cvar translate=\"no\"\u003eLOCATION_NAME\u003c/var\u003e \\\n --version=\u003cvar translate=\"no\"\u003eAVAILABLE_INSTANCE_VERSION\u003c/var\u003e\n\n - Optional: If applicable for your instance, add the\n [`--enable_stackdriver_logging`](/sdk/gcloud/reference/beta/data-fusion/instances/update#--enable_stackdriver_logging),\n [`--enable_stackdriver_monitoring`](/sdk/gcloud/reference/beta/data-fusion/instances/update#--enable_stackdriver_monitoring), and\n [`--labels`](/sdk/gcloud/reference/beta/data-fusion/instances/update#--labels) flags.\n\n - Optional: You can pass the CDAP properties, such as\n `enable.unrecoverable.reset`, as\n [`--options`](/sdk/gcloud/reference/beta/data-fusion/instances/update#--options).\n\n2. Verify that the upgrade was successful by following these steps:\n\n 1. In the Google Cloud console, go to the Cloud Data Fusion\n **Instances** page.\n\n 2. Click **View instance** to access the upgraded instance in the\n Cloud Data Fusion web interface.\n\n 3. Click **System Admin** in the menu bar.\n\n The new version number appears at the top of the page.\n3. To prevent your pipelines from getting stuck when you run them in the\n new version, [grant the required roles](#grant-roles) in your upgraded\n instance.\n\nUpgrade batch pipelines\n-----------------------\n\nTo upgrade your Cloud Data Fusion batch pipelines to use the latest\nplugin versions:\n\n1. [Set environment variables](/data-fusion/docs/reference/cdap-reference#set-up).\n\n2. **Recommended:** Back up all pipelines. You can back up pipelines in one of\n two ways:\n\n - Download the zip file by following these steps:\n\n 1. To trigger a zip file download, back up all pipelines with the following command:\n\n echo $CDAP_ENDPOINT/v3/export/apps\n\n 1. Copy the URL output to your browser.\n 2. Extract the downloaded file, then confirm that all pipelines were exported. 
The pipelines are organized by namespace.\n - Back up pipelines using [Source Control Management](/data-fusion/docs/how-to/source-control-management)\n (SCM), available in version 6.9 and later. SCM provides GitHub\n integration, which you can use to back up pipelines.\n\n3. Upgrade pipelines by following these steps:\n\n 1. Create a variable that points to the `pipeline_upgrade.json` file that\n you will create in the next step to save a list of pipelines.\n\n export PIPELINE_LIST=\u003cvar translate=\"no\"\u003ePATH\u003c/var\u003e/pipeline_upgrade.json\n\n Replace \u003cvar translate=\"no\"\u003ePATH\u003c/var\u003e with the path to the file.\n 2. Create a list of all pipelines for an instance and namespace using\n the following command. The result is stored in the `$PIPELINE_LIST` file\n in `JSON` format. You can edit the list to remove pipelines that don't\n need upgrades.\n\n curl -H \"Authorization: Bearer $(gcloud auth print-access-token)\" -H \"Content-Type: application/json\" ${CDAP_ENDPOINT}/v3/namespaces/\u003cvar translate=\"no\"\u003eNAMESPACE_ID\u003c/var\u003e/apps -o $PIPELINE_LIST\n\n Replace \u003cvar translate=\"no\"\u003eNAMESPACE_ID\u003c/var\u003e with the namespace where you want the\n upgrade to happen.\n 3. Upgrade the pipelines listed in `pipeline_upgrade.json`.\n Insert the \u003cvar translate=\"no\"\u003eNAMESPACE_ID\u003c/var\u003e of pipelines to be upgraded.\n The command displays a list of upgraded pipelines with their upgrade\n status.\n\n curl -N -H \"Authorization: Bearer $(gcloud auth print-access-token)\" -H \"Content-Type: application/json\" ${CDAP_ENDPOINT}/v3/namespaces/\u003cvar translate=\"no\"\u003eNAMESPACE_ID\u003c/var\u003e/upgrade --data @$PIPELINE_LIST\n\n Replace \u003cvar translate=\"no\"\u003eNAMESPACE_ID\u003c/var\u003e with the namespace ID of the pipelines\n that are getting upgraded.\n4. To prevent your pipelines from getting stuck when you run them in the new\n version, [grant the required roles](#grant-roles) in your upgraded instance.\n\nUpgrade real-time pipelines\n---------------------------\n\nUpgrading real-time pipelines is not supported, except in pipelines created in\nversion 6.8.0 with a Kafka real-time source.\n\nFor everything else, you instead do the following:\n\n1. Stop and export the pipelines.\n2. Upgrade the instance.\n3. Import the real-time pipelines into your upgraded instance.\n\nUpgrade to enable Replication\n-----------------------------\n\nReplication can be enabled in Cloud Data Fusion\nenvironments in version 6.3.0 or later. If you have version 6.2.3, upgrade to\n6.3.0, then upgrade to the latest version. 
You can then [enable Replication](/data-fusion/docs/how-to/enable-replication).\n\nGrant roles for upgraded instances\n----------------------------------\n\nAfter the upgrade completes, grant the\n[Cloud Data Fusion Runner role](/data-fusion/docs/how-to/granting-service-account-permission#runtime-permission)\n(`roles/datafusion.runner`) and\n[Cloud Storage Admin role](/data-fusion/docs/how-to/granting-service-account-permission#admin-permission)\n(`roles/storage.admin`) to the Dataproc service account in your\n[project](/data-fusion/docs/concepts/security#projects).\n\nWhat's next\n-----------\n\n- [Manage patch revisions](/data-fusion/docs/how-to/upgrade-to-patch-revision) for Cloud Data Fusion instances.\n- Learn about [versioning in Cloud Data Fusion](/data-fusion/docs/concepts/versioning).\n- Refer to the [available version and patch revision upgrades](/data-fusion/docs/concepts/available-upgrades).\n- [Troubleshoot upgrades](/data-fusion/docs/support/troubleshoot-upgrades)."]]