Quotas and limits
This document lists the current content limits and rate quotas for the Cloud Data Fusion API.
Rate quotas
The following Cloud Data Fusion API rate quotas apply to Cloud Data Fusion control plane API requests, such as requests to create, update, or delete Cloud Data Fusion instances. They don't apply to requests for other Google Cloud resources, such as Compute Engine or Cloud Storage resources, used to run data processing pipelines on Cloud Data Fusion instances (other quota limits might apply to requests for pipeline resources).
Quotas are subject to change.
Quota | Value
Requests per minute per user per region | 600
These limits apply to each Google Cloud console project and are shared across all applications and IP addresses that use that project.
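For illustration, the following minimal Python sketch issues one control-plane request of the kind this quota covers: listing the Cloud Data Fusion instances in a project and region. The region value and the use of Application Default Credentials through the google-auth library are assumptions made for this sketch, not requirements.

```python
# Minimal sketch: a Cloud Data Fusion control-plane request that counts
# toward the per-user, per-region rate quota. Assumes Application Default
# Credentials are configured with a project; the region is only an example.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

region = "us-central1"  # example region (assumption)

# Listing instances is a control-plane call to the Cloud Data Fusion API,
# so it is subject to the 600-requests-per-minute-per-user-per-region quota.
response = session.get(
    f"https://datafusion.googleapis.com/v1/projects/{project_id}"
    f"/locations/{region}/instances"
)
response.raise_for_status()
for instance in response.json().get("instances", []):
    print(instance["name"])
```

Because the quota is shared across all applications and IP addresses in the project, requests from automation like this and requests made through the console or CLI count against the same limit.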
Other Google Cloud quotas
Your Cloud Data Fusion instances can use other Google Cloud products to run pipelines. These products have project-level quotas, which include quotas that apply to Cloud Data Fusion usage.
Dataproc
Cloud Data Fusion uses Dataproc as the execution environment for pipelines. The Dataproc quota applies to all pipelines that you run.
Cloud Logging
Cloud Data Fusion stores logs in Logging. The Logging quota applies to your Cloud Data Fusion instances.
Additional quotas
Depending on the Google Cloud services that your pipelines use, additional quotas might apply. For example:
BigQuery: When reading or writing data in BigQuery, the BigQuery quota applies.
Bigtable: When reading or writing data in Bigtable, the Bigtable quota applies.
Cloud Storage: When reading or writing data in Cloud Storage, the Cloud Storage quota applies.
Compute Engine: When reading or writing data in Compute Engine, the Compute Engine quota applies.
Spanner: When reading or writing data in Spanner, the Spanner quota applies.
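As a hedged sketch rather than an official example, you can also inspect the current quota metrics for the services a pipeline uses through the Service Usage API. The v1beta1 consumerQuotaMetrics endpoint, the sample list of services, and the google-auth setup below are assumptions made for illustration; adjust them to match the services your pipelines actually call.

```python
# Hedged sketch: list consumer quota metrics for services a pipeline might
# use, via the Service Usage API (v1beta1). The service names below are
# only examples of services mentioned on this page.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

services = [
    "dataproc.googleapis.com",   # pipeline execution environment
    "bigquery.googleapis.com",   # example source or sink
    "storage.googleapis.com",    # example source or sink
]

for service in services:
    url = (
        "https://serviceusage.googleapis.com/v1beta1/"
        f"projects/{project_id}/services/{service}/consumerQuotaMetrics"
    )
    response = session.get(url)
    response.raise_for_status()
    for metric in response.json().get("metrics", []):
        print(service, "-", metric.get("displayName"))
```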
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-08-12 UTC."],[[["\u003cp\u003eThis document outlines the request quotas for the Cloud Data Fusion API, specifically for control plane operations like creating or modifying instances.\u003c/p\u003e\n"],["\u003cp\u003eThe Cloud Data Fusion API has a quota of 600 requests per minute per user per region, which applies across all applications and IP addresses using a single project.\u003c/p\u003e\n"],["\u003cp\u003eCloud Data Fusion pipelines use other Google Cloud services, such as Dataproc and Logging, which each have their own separate project-level quotas.\u003c/p\u003e\n"],["\u003cp\u003eDepending on the services your pipelines interact with, additional quotas from BigQuery, Bigtable, Cloud Storage, Compute Engine, or Spanner may also apply.\u003c/p\u003e\n"],["\u003cp\u003eThe request quotas apply specifically to the Cloud Data Fusion API and not other Google Cloud resources that are used for pipeline resources.\u003c/p\u003e\n"]]],[],null,["# Quotas and limits\n\nThis document contains current content limits and rate quotas for the\nCloud Data Fusion API.\n\nRate quotas\n-----------\n\nThe following [Cloud Data Fusion API](/data-fusion/docs/reference/rest)\nrate quotas apply to Cloud Data Fusion control plane API requests,\nsuch as requests to create, update, or delete Cloud Data Fusion\ninstances. They do not apply to requests for other Google Cloud\nresources, such as Compute Engine or Cloud Storage\nresources, to run data processing pipelines on Cloud Data Fusion\ninstances (other [quota limits](https://console.cloud.google.com/iam-admin/quotas) may apply to\nrequests for pipeline resources).\n\nQuotas are subject to change.\n\nThese limits apply to each Google Cloud console project and are shared across\nall applications and IP addresses using that project.\n\nOther Google Cloud quotas\n-------------------------\n\nYour Cloud Data Fusion instances can use other Google Cloud\nproducts to run pipelines. These products have project-level quotas, which\ninclude quotas that apply to Cloud Data Fusion use.\n\n### Dataproc\n\nCloud Data Fusion uses Dataproc as the execution\nenvironment for pipelines.\n[Dataproc quota](/dataproc/quotas \"Cloud Dataproc quota\")\napplies to all pipelines that you execute.\n\n### Cloud Logging\n\nCloud Data Fusion saves logs in\n[Logging](/logging \"Stackdriver Logging\"). 
The\n[Logging quota](/logging/quota-policy \"Stackdriver Logging quota\")\napplies to your Cloud Data Fusion instances.\n\n### Additional quotas\n\nDepending on the Google Cloud services your pipelines use,\nadditional quotas might apply, for example:\n\n- BigQuery---When reading or writing data into BigQuery, the [BigQuery quota](/bigquery/quota-policy) applies.\n- Bigtable---When reading or writing data into Bigtable, the [Bigtable quota](/bigtable/quotas) applies.\n- Cloud Storage---When reading or writing data into Cloud Storage, the [Cloud Storage quota](/storage/quotas) applies.\n- Compute Engine---When reading or writing data into Compute Engine, the [Compute Engine quota](/compute/quotas) applies.\n- Spanner---When reading or writing data into Spanner, the [Spanner quota](/spanner/quotas) applies."]]