# Export metadata from Dataproc Metastore

Last updated (UTC): 2025-08-27.

Summary:

- Dataproc Metastore lets you export metadata as Avro files (for Hive 2.3.6 and 3.1.2) or as a MySQL dump file (for all Hive versions) to a specified Cloud Storage location.
- Exporting metadata requires specific IAM roles, including Dataproc Metastore Editor, Administrator, or Metadata Operator, plus the Storage Object Creator role for Cloud Storage access.
- While a metadata export is in progress, the Dataproc Metastore service can't be updated, but it remains usable for normal operations, such as accessing metadata from attached clusters.
- You can start an export from the Google Cloud console, the gcloud CLI, or the REST API; the service returns to the active state when the export completes, whether or not it succeeded.
- The export feature only exports metadata, not the data in Apache Hive internal tables, and the export history is deleted when the Dataproc Metastore service is deleted.

This page explains how to export metadata from Dataproc Metastore.

The export metadata feature lets you save your metadata in a portable storage format.

After you export your data, you can then [import the metadata](/dataproc-metastore/docs/import-metadata) into another Dataproc Metastore service or a self-managed Hive Metastore (HMS).

About exporting metadata
------------------------

When you export metadata from
Dataproc Metastore, the service stores the data in one of the following file formats:

- A set of Avro files stored in a folder.
- A single MySQL dump file stored in a Cloud Storage folder.

### Avro

Avro-based exports are only supported for Hive versions 2.3.6 and 3.1.2. When you export Avro files, Dataproc Metastore creates a `<table-name>.avro` file for each table in your database.

To export Avro files, your Dataproc Metastore service can use the MySQL or Spanner database type.

### MySQL

MySQL-based exports are supported for all versions of Hive. When you export MySQL files, Dataproc Metastore creates a single SQL file that contains all your table information.

To export MySQL files, your Dataproc Metastore service must use the MySQL database type. The Spanner database type doesn't support MySQL exports.

Before you begin
----------------

- [Enable Dataproc Metastore](/dataproc-metastore/docs/enable-service) in your project.
- [Understand networking requirements](/dataproc-metastore/docs/access-service) specific to your project.
- [Create a Dataproc Metastore service](/dataproc-metastore/docs/create-service).

### Required roles

To get the permissions that you need to export metadata from Dataproc Metastore, ask your administrator to grant you the following IAM roles:

- To export metadata, either:
  - [Dataproc Metastore Editor](/iam/docs/roles-permissions/metastore#metastore.editor) (`roles/metastore.editor`) on the Dataproc Metastore service
  - [Dataproc Metastore Administrator](/iam/docs/roles-permissions/metastore#metastore.admin) (`roles/metastore.admin`) on the Dataproc Metastore service
  - [Dataproc Metastore Metadata Operator](/iam/docs/roles-permissions/metastore#metastore.metadataOperator) (`roles/metastore.metadataOperator`) on the Dataproc Metastore service
- For MySQL and Avro, to use the Cloud Storage object for export: [grant your user account and the Dataproc Metastore service agent
the Storage Object Creator role](/iam/docs/roles-permissions/storage#storage.objectCreator) (`roles/storage.objectCreator`) on the Cloud Storage bucket

For more information about granting roles, see [Manage access to projects, folders, and organizations](/iam/docs/granting-changing-revoking-access).

These predefined roles contain the permissions required to export metadata from Dataproc Metastore. To see the exact permissions that are required, expand the **Required permissions** section:

#### Required permissions

The following permissions are required to export metadata from Dataproc Metastore:

- To export metadata: `metastore.services.export` on the metastore service
- For MySQL and Avro, to use the Cloud Storage object for export, grant your user account and the Dataproc Metastore service agent: `storage.objects.create` on the Cloud Storage bucket

You might also be able to get these permissions with [custom roles](/iam/docs/creating-custom-roles) or other [predefined roles](/iam/docs/roles-overview#predefined). For more information about specific Dataproc Metastore roles and permissions, see [Dataproc Metastore IAM overview](/dataproc-metastore/docs/iam-and-access-control).

Export metadata
---------------

Before exporting your metadata, note the following considerations:

- While an export is running, you can't update a Dataproc Metastore service (for example, by changing configuration settings). However, you can still use the service for normal operations, such as accessing its metadata from attached Dataproc or self-managed clusters.
- The metadata export feature only exports metadata. Data that's created by Apache Hive in internal tables isn't replicated in the export.

To export metadata from a Dataproc Metastore service, perform the following steps.

### Console

1. 
In the Google Cloud console, open the **Dataproc Metastore** page:

   [Open Dataproc Metastore](https://console.cloud.google.com/dataproc/metastore/services)

2. On the **Dataproc Metastore** page, click the name of the service you want to export metadata from.

   The **Service detail** page opens.

3. In the navigation bar, click **Export**.

   The **Export metadata** page opens.

4. In the **Destination** section, choose either **MySQL** or **Avro**.

5. In the **Destination URI** field, click **Browse** and select the **Cloud Storage URI** where you want to export your files to.

   You can also enter your bucket location in the provided text field. Use the following format: `bucket/object` or `bucket/folder/object`.

   **Note:** To store your exported metadata, the export job creates a new subfolder in your **Cloud Storage URI** destination.

6. To start the export, click **Submit**.

   When finished, your export appears in a table on the **Service detail** page on the **Import/Export** tab.

   When the export completes, Dataproc Metastore automatically returns to the [active state](/dataproc-metastore/docs/reference/rest/v1/projects.locations.services#state), regardless of whether or not the export succeeded.

### gcloud CLI

1. 
To export metadata from a service, run the following [`gcloud metastore services export gcs`](/sdk/gcloud/reference/metastore/services/export/gcs) command:

   ```
   gcloud metastore services export gcs SERVICE \
       --location=LOCATION \
       --destination-folder=gs://bucket-name/path/to/folder \
       --dump-type=DUMP_TYPE
   ```

   Replace the following:

   - SERVICE: the name of your Dataproc Metastore service.
   - LOCATION: the Google Cloud region in which your Dataproc Metastore service resides.
   - bucket-name/path/to/folder: the Cloud Storage destination folder where you want to store your export.
   - DUMP_TYPE: the type of database dump generated by the export. Accepted values are `mysql` and `avro`. The default value is `mysql`.

2. Verify that the export was successful.

   When the export completes, Dataproc Metastore automatically returns to the [active state](/dataproc-metastore/docs/reference/rest/v1/projects.locations.services#state), regardless of whether or not the export succeeded.

### REST

Follow the API instructions to [export metadata from a service](/dataproc-metastore/docs/reference/rest/v1/projects.locations.services/exportMetadata) by using the APIs Explorer.

When the export completes, the service automatically returns to the [active state](/dataproc-metastore/docs/reference/rest/v1/projects.locations.services#state), regardless of whether or not it succeeded.

### View export history

To view the export history of a Dataproc Metastore service in the Google Cloud console, complete the following steps:

1. In the Google Cloud console, open the [**Dataproc Metastore**](https://console.cloud.google.com/dataproc/metastore/services) page.
2. 
In the navigation bar, click **Import/Export**.

   Your export history appears in the **Export history** table.

   The history displays up to the last 25 exports.

Deleting a Dataproc Metastore service also deletes all associated export history.

Troubleshoot common issues
--------------------------

Some common issues include the following:

- [The service agent or user account doesn't have the necessary permissions](/dataproc-metastore/docs/troubleshooting#service-agent-or-user-permissions).
- [The job fails because the database file is too large](/dataproc-metastore/docs/troubleshooting#job-failed-database-file-size).

For more help with common troubleshooting issues, see [Import and export error scenarios](/dataproc-metastore/docs/troubleshooting#import-and-export-error-scenarios).

What's next
-----------

- [Import metadata into a service](/dataproc-metastore/docs/import-metadata)
- [Update and delete a service](/dataproc-metastore/docs/manage-service)
- [Hive Metastore](/dataproc-metastore/docs/hive-metastore)
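As a supplement to the REST instructions on this page, the following is a minimal sketch of how a `services.exportMetadata` request can be assembled before sending it with any HTTP client. The project, region, service, and bucket names are placeholders, and the helper function is hypothetical; the URL pattern and body fields (`destinationGcsFolder`, `databaseDumpType`) follow the v1 `exportMetadata` reference linked above.

```python
import json

# Hypothetical helper: assembles the URL and JSON body for a
# projects.locations.services.exportMetadata call (v1 API).
def build_export_request(project, location, service, gcs_folder,
                         dump_type="MYSQL"):
    name = f"projects/{project}/locations/{location}/services/{service}"
    url = f"https://metastore.googleapis.com/v1/{name}:exportMetadata"
    body = {
        "destinationGcsFolder": gcs_folder,  # where the dump files land
        "databaseDumpType": dump_type,       # MYSQL (default) or AVRO
    }
    return url, json.dumps(body)

# Placeholder values for illustration only.
url, body = build_export_request(
    "my-project", "us-central1", "my-metastore", "gs://my-bucket/exports")
print(url)
print(body)
```

Sending this request (for example, with an OAuth 2.0 bearer token) starts a long-running operation; as noted above, the service returns to the active state when the export completes, regardless of whether it succeeded.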