For a full list of limitations that apply to BigLake tables based on Amazon S3 and Blob Storage, see Limitations.
Before you begin
Ensure that you have the following resources:
A connection to access your Blob Storage. Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the Microsoft.Storage/storageAccounts/blobServices/containers/write permission.
A Blob Storage BigLake table.
If you are on the capacity-based pricing model, ensure that you have enabled the BigQuery Reservation API for your project. For information about pricing, see BigQuery Omni pricing.
Export query results

BigQuery Omni writes to the specified Blob Storage location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Blob Storage container.
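1. In the Google Cloud console, go to the BigQuery page.
2. In the Query editor field, enter a GoogleSQL export query:

```sql
EXPORT DATA WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
OPTIONS(
  uri="azure://AZURE_STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME/FILE_PATH/*",
  format="FORMAT"
)
AS QUERY
```

Replace the following:

CONNECTION_REGION: the region where the connection was created.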
CONNECTION_NAME: the name of the connection that you created with the necessary permission to write to the container.
AZURE_STORAGE_ACCOUNT_NAME: the name of the Blob Storage account to which you want to write the query result.
CONTAINER_NAME: the name of the container to which you want to write the query result.
FILE_PATH: the path where you want to write the exported files. It must contain exactly one * wildcard anywhere in the leaf directory of the path string, for example, ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*. BigQuery replaces * with 0000..N, depending on the number of files exported.
BigQuery determines the file count and sizes. If BigQuery decides to export two files, then the * in the first file's name is replaced by 000000000000, and the * in the second file's name is replaced by 000000000001. (See the worked example after this list.)
FORMAT: supported formats are JSON, AVRO, CSV, and PARQUET.
QUERY: the query to analyze the data that is stored in a BigLake table.
Note: To override the default project, use the --project_id=PROJECT_ID parameter. Replace PROJECT_ID with the ID of your Google Cloud project.
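For instance, the following export writes CSV results to a container path with the wildcard in the leaf directory. This is a minimal sketch of the template above: the connection azure-eastus2.my-azure-connection, the storage account mystorageaccount, the container exports, and the table mydataset.sales are hypothetical names, not values from this guide.

```sql
-- Hypothetical names throughout: adjust the connection, storage account,
-- container, path, and source table to match your environment.
EXPORT DATA WITH CONNECTION `azure-eastus2.my-azure-connection`
OPTIONS(
  -- Exactly one * wildcard, placed in the leaf directory of the path.
  uri="azure://mystorageaccount.blob.core.windows.net/exports/sales/*.csv",
  format="CSV"
)
AS SELECT id, region, amount FROM mydataset.sales;
```

If BigQuery decides to split the output into two files, they are written as exports/sales/000000000000.csv and exports/sales/000000000001.csv.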
Troubleshooting
If you get an error related to quota failure, check whether you have reserved capacity for your queries. For more information about slot reservations, see Before you begin in this document.
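One way to reserve capacity is with the BigQuery reservation DDL. The following is a minimal sketch, assuming an Enterprise edition reservation: the project ID, location qualifier, reservation and assignment names, and slot count are placeholders, and the exact location string for a BigQuery Omni region depends on your setup.

```sql
-- Placeholder project, location, names, and slot count; verify the
-- location qualifier and edition against your own reservation setup.
CREATE RESERVATION `my-project.region-azure-eastus2.omni-exports`
OPTIONS (
  slot_capacity = 100,
  edition = 'ENTERPRISE'
);

-- Route query jobs from the project to the reservation.
CREATE ASSIGNMENT `my-project.region-azure-eastus2.omni-exports.query-assignment`
OPTIONS (
  assignee = 'projects/my-project',
  job_type = 'QUERY'
);
```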
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-08-17 UTC."],[[["\u003cp\u003eThis guide outlines the process of exporting query results from a BigLake table to Azure Blob Storage.\u003c/p\u003e\n"],["\u003cp\u003eBefore exporting, you must establish a connection to your Blob Storage with appropriate write permissions and have a BigLake table in place.\u003c/p\u003e\n"],["\u003cp\u003eThe export process involves using a specific GoogleSQL \u003ccode\u003eEXPORT DATA\u003c/code\u003e query with connection details, target URI, desired format, and the source query.\u003c/p\u003e\n"],["\u003cp\u003eIt is recommended to export query results to an empty Blob Storage container, because the export query can overwrite or mix with existing data.\u003c/p\u003e\n"],["\u003cp\u003eIf you are receiving \u003ccode\u003equota failure\u003c/code\u003e error, check if you have reserved capacity for your queries.\u003c/p\u003e\n"]]],[],null,["# Export query results to Blob Storage\n====================================\n\nThis document describes how to export the result of a query that runs against a\n[BigLake table](/bigquery/docs/biglake-intro) to your\nAzure Blob Storage.\n\nFor information about how data flows between BigQuery and\nAzure Blob Storage,\nsee [Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).\n\nLimitations\n-----------\n\nFor a full list of limitations that apply to BigLake tables\nbased on Amazon S3 and Blob Storage, see [Limitations](/bigquery/docs/omni-introduction#limitations).\n\nBefore you begin\n----------------\n\nEnsure that you have the following resources:\n\n\n- A [connection to access your Blob Storage](/bigquery/docs/omni-azure-create-connection). Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the `Microsoft.Storage/storageAccounts/blobServices/containers/write` permission.\n- An [Blob Storage BigLake table](/bigquery/docs/omni-azure-create-external-table).\n\n\u003c!-- --\u003e\n\n- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing), then ensure that you have enabled the [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com) for your project. For information about pricing, see [BigQuery Omni pricing](/bigquery/pricing#bqomni).\n\nExport query results\n--------------------\n\nBigQuery Omni writes to the specified Blob Storage location regardless of any existing\ncontent. The export query can overwrite existing data or mix the query result\nwith existing data. We recommend that you export the query result to an empty\nBlob Storage container.\n\n1. In the Google Cloud console, go to the **BigQuery** page.\n\n [Go to BigQuery](https://console.cloud.google.com/bigquery)\n2. 
In the **Query editor** field, enter a GoogleSQL export query:\n\n ```bash\n EXPORT DATA WITH CONNECTION \\`CONNECTION_REGION.CONNECTION_NAME\\`\n OPTIONS(\n uri=\"azure://\u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e.blob.core.windows.net/\u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e/\u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e/*\",\n format=\"\u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e\"\n )\n AS QUERY\n ```\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e: the region where the connection was created.\n - \u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e: the connection name that you created with the necessary permission to write to the container.\n - \u003cvar translate=\"no\"\u003eAZURE_STORAGE_ACCOUNT_NAME\u003c/var\u003e: the name of the Blob Storage account to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eCONTAINER_NAME\u003c/var\u003e: the name of the container to which you want to write the query result.\n - \u003cvar translate=\"no\"\u003eFILE_PATH\u003c/var\u003e: the path where you want to write the exported file to. It must contain exactly one wildcard `*` anywhere in the leaf directory of the path string, for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's filename is replaced by `000000000000`, and `*` in the second file's filename is replaced by `000000000001`.\n - \u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.\n - \u003cvar translate=\"no\"\u003eQUERY\u003c/var\u003e: the query to analyze the data that is stored in a BigLake table.\n\n| **Note:** To override the default project, use the `--project_id=`\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e parameter. Replace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with the ID of your Google Cloud project.\n\nTroubleshooting\n---------------\n\nIf you get an error related to `quota failure`, then check if you have reserved\ncapacity for your queries. For more information about slot reservations, see\n[Before you begin](#before_you_begin) in this document.\n\nWhat's next\n-----------\n\n- Learn about [BigQuery Omni](/bigquery/docs/omni-introduction).\n- Learn how to [export table data](/bigquery/docs/exporting-data).\n- Learn how to [query data stored in Blob Storage](/bigquery/docs/query-azure-data).\n- Learn how to [set up VPC Service Controls for BigQuery Omni](/bigquery/docs/omni-vpc-sc)."]]