# Import a DMP file

This page describes how to import into an AlloyDB database a DMP
file created by the `pg_dump` tool using the `custom` or `directory` format.

To import a file created by the `pg_dump` tool using the `plain` format, see
[Import a SQL file](/alloydb/docs/import-sql-file).

The procedure to perform the import involves these tasks:

1. [Upload the DMP file](#upload) to a Cloud Storage bucket.
2. [Prepare a client host](#prepare-host) to perform the import operation.
3. [Import the DMP file](#import) into the database.
4. [Clean up the resources](#clean-up) created to perform the procedure.

Before you begin
----------------

- You must have the Owner (`roles/owner`) or Editor (`roles/editor`) basic IAM
  role in the Google Cloud project you are using, or you must have these
  predefined IAM roles:
  - AlloyDB Admin (`roles/alloydb.admin`) or AlloyDB Viewer
    (`roles/alloydb.viewer`)
  - Storage Admin (`roles/storage.admin`)
  - Compute Instance Admin (v1) (`roles/compute.instanceAdmin.v1`)

Upload the DMP file
-------------------

To upload the DMP file, you create a Cloud Storage bucket and then
upload the DMP file to that bucket.

1. [Create a standard storage, regional storage bucket](/storage/docs/creating-buckets)
   in the project and region where your AlloyDB database is located.

2. [Upload the DMP file](/storage/docs/uploading-objects) to the storage
   bucket you created.

Prepare a client host
---------------------
To prepare a client host to perform the import operation, you create a
Compute Engine VM that can connect to the AlloyDB primary
instance where your database is located, and install the `pg_restore` tool and
the Google Cloud CLI on that VM.
1. Follow the instructions in
   [Connect a psql client to an instance](/alloydb/docs/connect-psql)
   to create a Compute Engine VM with the proper connectivity and the
   `pg_restore` tool installed. When following these instructions, make sure to
   allocate enough local storage to the Compute Engine VM to
   accommodate the DMP file you are importing.
2. [Install the gcloud CLI](/sdk/docs/install) to provide
   command-line access to the DMP file in the Cloud Storage bucket.
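As a rough sketch of what this setup involves on the VM itself, assuming a Debian-based image (the package name is an assumption that holds for Debian/Ubuntu; other images differ, and the gcloud CLI is typically preinstalled on Compute Engine public images):

```
# Assumption: Debian/Ubuntu image. The postgresql-client package
# provides pg_restore; adjust for other distributions.
sudo apt-get update
sudo apt-get install -y postgresql-client

# Confirm both tools are available before continuing.
pg_restore --version
gcloud --version
```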
Import the DMP file
-------------------
To import the DMP file, you get the IP address of the AlloyDB
primary instance where your database is located and then use the `pg_restore`
tool to import the file into the database.
1. Get the IP address of the AlloyDB primary instance where your
   database is located by [viewing its details](/alloydb/docs/instance-view).
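If you prefer the command line, a sketch like the following may print the address directly; the cluster, region, and instance names here are hypothetical placeholders, and the `ipAddress` output field is an assumption worth verifying against your gcloud CLI version:

```
# Hypothetical resource names; replace with your own.
gcloud alloydb instances describe my-primary \
    --cluster=my-cluster \
    --region=us-central1 \
    --format="value(ipAddress)"
```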
2. SSH into the Compute Engine VM.
   ### Console

   1. In the Google Cloud console, go to the **VM instances** page.

      [Go to VM instances](https://console.cloud.google.com/compute/instances)
   2. In the list of virtual machine instances, click **SSH** in the row of
      the instance you created.

   ### gcloud

   Use the [`gcloud compute ssh` command](/sdk/gcloud/reference/compute/ssh)
   to connect to the instance you created.

   ```
   gcloud compute ssh --project=PROJECT_ID --zone=ZONE VM_NAME
   ```

   Replace the following:

   - PROJECT_ID: The ID of the project that contains the instance.
   - ZONE: The name of the zone in which the instance is located.
   - VM_NAME: The name of the instance.

3. Copy the DMP file to the client host's local file system:

   ```bash
   gcloud storage cp gs://BUCKET_NAME/DMP_FILE_NAME .
   ```

4. Run the following command to create a TOC file that comments out all
   `EXTENSION` statements:

   ```bash
   pg_restore \
       -l DMP_FILE_NAME | sed -E 's/(.* EXTENSION )/; \1/g' > TOC_FILE_NAME
   ```

   - DMP_FILE_NAME: The DMP file on the local file system.
   - TOC_FILE_NAME: Provide a file name for the TOC file to create on the
     local file system.

5. Import the DMP file:

   ```
   pg_restore -h IP_ADDRESS -U postgres \
       -d DB_NAME \
       -L TOC_FILE_NAME \
       DMP_FILE_NAME
   ```

   - IP_ADDRESS: The IP address of the primary instance.
   - DB_NAME: The name of the database to import into.
   - TOC_FILE_NAME: The TOC file you created in the previous step.
   - DMP_FILE_NAME: The DMP file.

   The [`pg_restore` command](https://www.postgresql.org/docs/16/app-pgrestore.html)
   provides several additional options to control the data import operation.

Clean up resources
------------------

After successfully importing the DMP file, you can [delete the
Cloud Storage bucket](/storage/docs/deleting-buckets) and delete the
[Compute Engine VM](/compute/docs/instances/deleting-instance)
you used during the import procedure.

*Last updated 2025-08-25 UTC.*
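As a local sanity check of the TOC step above, the `sed` expression can be exercised on a fabricated listing before you run it against a real dump; the entries below are made-up examples in the style of `pg_restore -l` output, not output from an actual DMP file:

```shell
# Fabricated pg_restore -l style listing (illustrative only).
cat > sample_toc.txt <<'EOF'
2; 3079 16385 EXTENSION - pg_stat_statements
3; 0 0 COMMENT - EXTENSION pg_stat_statements
10; 1259 16400 TABLE public users postgres
EOF

# Same transformation as the TOC step: prefix every line mentioning
# " EXTENSION " with ';' so pg_restore -L skips it.
sed -E 's/(.* EXTENSION )/; \1/g' sample_toc.txt > sample_toc_filtered.txt
cat sample_toc_filtered.txt
```

The two `EXTENSION` entries now begin with `;`, which `pg_restore` treats as a comment marker in a TOC file, while the `TABLE` entry is left intact and will still be restored.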