This page describes how to export data from and import data into Cloud SQL instances using BAK files, and how to import data into Cloud SQL instances using transaction log files.
Before you begin
Exports use database resources, but exports don't interfere with normal database operations unless the instance is under-provisioned.
For best practices, see Best Practices for Importing and Exporting Data.
After completing an import operation, verify the results.
Export data from Cloud SQL for SQL Server
Cloud SQL supports the export of built-in BAK files.
If you plan to create a new instance from an exported file, then consider restoring from a backup to a different instance or cloning the instance.
Cloud SQL performs a full backup of the selected database during an export operation.
Required roles and permissions for exporting from Cloud SQL for SQL Server
To export data from Cloud SQL into Cloud Storage, the user initiating the export must have one of the following roles:
- The Cloud SQL Editor role
- A custom role that includes the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.export
Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin Identity and Access Management (IAM) role
- A custom role that includes the following permissions:
  - storage.objects.create
  - storage.objects.list (for striped export only)
  - storage.objects.delete (for striped export only)
  - storage.buckets.getIamPolicy (for exporting transaction logs only)
  - storage.buckets.setIamPolicy (for exporting transaction logs only)
For help with IAM roles, see Identity and Access Management.
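For example, a custom role that carries only the two export permissions could be created with the gcloud CLI as follows; the role ID and title are illustrative, not names defined by this page:

# Create a custom role containing only the two export permissions.
# The role ID "cloudSqlExporter" is a hypothetical name.
gcloud iam roles create cloudSqlExporter --project=PROJECT_ID \
  --title="Cloud SQL Exporter" \
  --permissions=cloudsql.instances.get,cloudsql.instances.export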
Export data to a BAK file from Cloud SQL for SQL Server
Console
- In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- Click Export.
- In the File format section, click BAK.
- In the Data to export section, use the drop-down menu to select the database you want to export from.
- In the Destination section, select Browse to search for a Cloud Storage bucket or folder for your export.
- Click Export to begin the export.
gcloud
- Create a Cloud Storage bucket.
- Find the service account for the Cloud SQL instance that you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
  gcloud sql instances describe INSTANCE_NAME
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For more information about setting IAM permissions, see Using IAM permissions.
- Export the database:
  gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/FILENAME \
    --database=DATABASE_NAME
  For information about using the gcloud sql export bak command, see the command reference page.
- If you don't need to retain the IAM role you set previously, then revoke it now.
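For example, the grant and revoke steps might look like the following sketch; the service account address and bucket name are illustrative values (yours come from the describe output):

# Grant the role to the instance's service account before exporting.
gcloud storage buckets add-iam-policy-binding gs://example-export-bucket \
  --member=serviceAccount:p123456789-abc123@gcp-sa-cloud-sql.iam.gserviceaccount.com \
  --role=roles/storage.objectAdmin

# Revoke the role once the export has finished.
gcloud storage buckets remove-iam-policy-binding gs://example-export-bucket \
  --member=serviceAccount:p123456789-abc123@gcp-sa-cloud-sql.iam.gserviceaccount.com \
  --role=roles/storage.objectAdmin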
REST v1
-
Create a Cloud Storage bucket for the export.
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_DUMP_FILE: the path to the export file
- DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
- DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
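{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE",
    "databases": ["DATABASE_NAME_1", "DATABASE_NAME_2"]
  }
}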
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM role you set previously, then remove it now.
REST v1beta4
-
Create a Cloud Storage bucket for the export.
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_DUMP_FILE: the path to the export file
- DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
- DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
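{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE",
    "databases": ["DATABASE_NAME_1", "DATABASE_NAME_2"]
  }
}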
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM role you set previously, then revoke it now.
Export differential database backups
Before exporting a differential database backup, you must export a differential base.
If other services or features, such as point-in-time recovery and read replicas, trigger a full backup between your full backup export and differential backup export, then you must trigger a full backup export again.
To understand this better, consider the following example:
- You raise a full backup request at 7:00 AM.
- You enable point-in-time recovery at 9:00 AM. This triggers a full backup on your instance.
- You try to take a differential backup at 5:00 PM. This export request fails with an error message because the last full backup was triggered by point-in-time recovery.
Cloud SQL doesn't support database export requests with --differential-base or --bak-type=DIFF on replica instances.
gcloud
- Create a Cloud Storage bucket.
- Find the service account for the Cloud SQL instance from which you're exporting. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
  gcloud sql instances describe INSTANCE_NAME
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For more information about setting IAM permissions, see Using IAM permissions.
- Export the database as the differential base.
  gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BASE_FILENAME \
    --database=DATABASE_NAME --differential-base
  For information about using the gcloud sql export bak command, see the command reference page.
- Export a differential backup.
  gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BACKUP_FILENAME \
    --database=DATABASE_NAME --bak-type=DIFF
  For information about using the gcloud sql export bak command, see the command reference page.
- If you don't need to retain the IAM role you set previously, then revoke it now.
REST v1
- Create a Cloud Storage bucket for the export.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Export your full database backup as the differential base.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_BAK_FILE: the path to the BAK file
- DATABASE_NAME: the name of the database inside of the Cloud SQL instance
- offload: to enable and use serverless export, set this value to true
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "databases": ["DATABASE_NAME"] "offload": TRUE | FALSE "bakExportOptions": { "differentialBase":true } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Export a differential backup.
Before using any of the request data, make the following replacements:
- project-id: the project ID
- instance-id: the instance ID
- bucket_name: the Cloud Storage bucket name
- path_to_dump_file: the path to the SQL dump file
- database_name_1: the name of a database inside the Cloud SQL instance
- database_name_2: the name of a database inside the Cloud SQL instance
- offload: enables serverless export. Set to true to use serverless export.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_dump_file", "databases": ["database_name"] "offload": true | false "bakExportOptions": { bakType:"DIFF" } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM role you set previously, then remove it now.
REST v1beta4
- Create a Cloud Storage bucket for the export.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Export your full database backup as the differential base.
Before using any of the request data, make the following replacements:
- project-id: the project ID
- instance-id: the instance ID
- bucket_name: the Cloud Storage bucket name
- path_to_dump_file: the path to the SQL dump file
- database_name_1: the name of a database inside the Cloud SQL instance
- database_name_2: the name of a database inside the Cloud SQL instance
- offload: to enable and use serverless export, set its value to true.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_dump_file", "databases": ["database_name"] "offload": true | false "bakExportOptions": { "differentialBase":true } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Export a differential backup:
Before using any of the request data, make the following replacements:
- project-id: the project ID
- instance-id: the instance ID
- bucket_name: the Cloud Storage bucket name
- path_to_dump_file: the path to the SQL dump file
- database_name_1: the name of a database inside the Cloud SQL instance
- database_name_2: the name of a database inside the Cloud SQL instance
- offload: to enable and use serverless exports, set this value to true.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_dump_file", "databases": ["database_name"] "offload": true | false "bakExportOptions": { bakType:"DIFF" } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM role you set previously, then revoke it now.
Export transaction logs
You can export the transaction logs for Cloud SQL Enterprise Plus edition instances that have point-in-time recovery (PITR) enabled and their logs stored in Cloud Storage.
gcloud
- Create a Cloud Storage bucket for the export.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Find the service account for the Cloud SQL instance from which you're exporting. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
  gcloud sql instances describe INSTANCE_NAME
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.admin IAM role to the service account. For more information about setting IAM permissions, see Set and manage IAM policies on buckets.
- Export the transaction logs.
  gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/FOLDER_PATH \
    --export-log-start-time=START_DATE_AND_TIME \
    --export-log-end-time=END_DATE_AND_TIME \
    --database=DATABASE_NAME --bak-type=TLOG
  For information about using the gcloud sql export bak command, see the command reference page.
- If you don't need to retain the IAM role you set previously, then revoke it now.
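For example, a sketch of exporting a one-day window of transaction logs; the instance, bucket, database, and timestamp values are illustrative, and the timestamp format shown follows the RFC 3339 style used elsewhere on this page:

# Export the transaction logs written during January 15, 2024 (UTC).
gcloud sql export bak my-instance gs://example-bucket/tlog-folder \
  --export-log-start-time=2024-01-15T00:00:00.000Z \
  --export-log-end-time=2024-01-16T00:00:00.000Z \
  --database=my-database --bak-type=TLOG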
REST
- Create a Cloud Storage bucket.
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Provide your instance with the storage.admin IAM role for your bucket. For more information about setting IAM permissions, see Set and manage IAM policies on buckets.
- Export the transaction logs.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID.
- INSTANCE_ID: the instance ID.
- BUCKET_NAME: the Cloud Storage bucket name.
- FOLDER_PATH: the path to the folder in the Cloud Storage bucket. Cloud SQL exports the transaction logs to this folder.
- DATABASE_NAME: the name of the database inside of the Cloud SQL instance.
- exportLogStartTime: the start date and time of the transaction logs to export.
- exportLogEndTime: the end date and time of the transaction logs to export.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/FOLDER_PATH", "databases": ["DATABASE_NAME"] "bakExportOptions": { bakType:"TLOG" exportLogStartTime: START_DATE_AND_TIME exportLogEndTime: END_DATE_AND_TIME } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"
PowerShell
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM role you set previously, then remove it now.
Use striped export
The advantages of striped export are the following:
- Reductions in the time needed for operations to complete
- Databases larger than 5 TB can be exported
A potential disadvantage of using striped export is that the backup, rather than consisting of one file, is split across a set of files. This set is called a "stripe set"; see Backup devices in a striped media set (a stripe set). In Cloud SQL, you export to an empty folder in Cloud Storage instead of generating a single file. For more information, see How to use striped export.
Planning your operations
Striped export can improve the performance of exports. However, if your use case requires a single output file, or if your database is smaller than 5 TB and faster performance isn't critical, you may want to use a non-striped export.
If you decide to use striped export, then consider the number of stripes. You can specify this value in your gcloud CLI command or REST API call. However, if you don't know how many stripes to use, or if you want an optimal number of stripes for performance, omit the value and an optimal number of stripes is set automatically.
The maximum number of stripes currently supported by Cloud SQL for SQL Server is 64.
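For example, a minimal sketch of a striped export that omits --stripe_count so that an optimal number of stripes is chosen automatically; the instance, bucket, and database names are illustrative:

# Striped export to an empty folder; the stripe count is chosen automatically.
gcloud beta sql export bak my-instance gs://example-bucket/striped-folder \
  --database=my-database --striped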
How to use striped export
gcloud
- Create a Cloud Storage bucket.
- Find the service account for the Cloud SQL instance that you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
  gcloud sql instances describe INSTANCE_NAME
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For more information about setting IAM permissions, see Using IAM permissions.
- To export the database, specify the --striped parameter, specify a value for --stripe_count, or both. Setting a value for --stripe_count implies that the --striped parameter is intended. An error occurs if you specify --no-striped but also specify a value for --stripe_count:
  gcloud beta sql export bak INSTANCE_NAME gs://BUCKET_NAME/STRIPED_EXPORT_FOLDER \
    --database=DATABASE_NAME --striped --stripe_count=NUMBER
  For information about using the gcloud beta sql export bak command, see the command reference page.
- If you don't need to retain the IAM role you set previously, then revoke it now.
REST v1
- Create a Cloud Storage bucket for the export.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- project-id: the project ID
- instance-id: the instance ID
- bucket_name: the Cloud Storage bucket name
- path_to_folder: the path to the folder (in the Cloud Storage bucket) to which to export the striped set
- database_name: the name of a database in your Cloud SQL instance
- true | false: set to true to use striped export. If you specify true without specifying a stripe count, an optimal number of stripes is set automatically.
- number_of_stripes: the number of stripes to use. If specified, striped is implied as true.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_folder", "databases": ["database_name"], "bakExportOptions": { "striped": true | false, "stripe_count": ["number_of_stripes"] } } }
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM role you set previously, then remove it now.
REST v1beta4
- Create a Cloud Storage bucket for the export.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- project-id: the project ID
- instance-id: the instance ID
- bucket_name: the Cloud Storage bucket name
- path_to_folder: the path to the folder (in the Cloud Storage bucket) to which to export the striped set
- database_name: the name of a database in your Cloud SQL instance
- true | false: set to true to use striped export. If you specify true without specifying a stripe count, an optimal number of stripes is set automatically.
- number_of_stripes: the number of stripes to use. If specified, striped is implied as true.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_folder", "databases": ["database_name"], "bakExportOptions": { "striped": true | false, "stripe_count": ["number_of_stripes"] } } }
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM role you set previously, then remove it now.
Import to Cloud SQL for SQL Server
Required roles and permissions for importing to Cloud SQL for SQL Server
To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:
- The Cloud SQL Admin role
- A custom role that includes the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.import
Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin IAM role
- A custom role that includes the following permissions:
  - storage.objects.get
  - storage.objects.list (for striped import only)
For help with IAM roles, see Identity and Access Management.
Import data from a BAK file to Cloud SQL for SQL Server
To use striped import, see Use striped import.
Various import frameworks are available. For example, Cloud SQL for SQL Server supports change data capture (CDC) for the following database versions:
- SQL Server 2017 Standard
- SQL Server 2017 Enterprise
- SQL Server 2019 Standard
- SQL Server 2019 Enterprise
When importing a CDC-enabled database, the KEEP_CDC flag is retained.
If your instance version is a Microsoft SQL Server Enterprise Edition, then you can import encrypted BAK files.
Microsoft SQL Server Standard Edition instances also import encrypted BAK files, but only through gcloud CLI.
The only supported BAK file extensions are .bak and .bak.gz. GPG-encrypted backups aren't currently supported.
For the instructions below, prepare to specify a new database; don't create a database before starting the import of your BAK file.
To import data to a Cloud SQL instance using a BAK file:
Console
-
In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- Click Import.
- In the Choose the file you'd like to import data from field, enter the path to the bucket and BAK file to use for the import. You can import a compressed (.gz) or an uncompressed file.
- In the Format section, select BAK.
- Specify the Database in your Cloud SQL instance where you want to import the BAK file.
- Click Import to start the import.
gcloud
- Create a Cloud Storage bucket for the import.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Make sure you have configured the required roles and permissions.
- Upload the data from the BAK file to the bucket.
- Describe the instance that you are importing to:
gcloud sql instances describe INSTANCE_NAME
- Copy the serviceAccountEmailAddress field.
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import the data from the file:
  gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
    --database=DATABASE_NAME
  For encrypted imports, use the following command:
  gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
    --database=DATABASE_NAME \
    --cert-path=gs://BUCKET_NAME/CERTIFICATE_NAME \
    --pvk-path=gs://BUCKET_NAME/KEY_NAME \
    --prompt-for-pvk-password
- If you don't need to retain the IAM permissions you set previously, then remove them using gcloud storage buckets remove-iam-policy-binding.
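The upload step can be performed with the gcloud storage cp command; for example, the following sketch assumes an illustrative local file name and bucket:

# Copy the local BAK file into the import bucket.
gcloud storage cp my-database.bak gs://example-import-bucket/my-database.bak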
REST v1
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import the data from the file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_bak_file: The path to the BAK file
- database_name: The name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_bak_file", "database": "database_name" } }
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for the request, see the instances:import page.
- If you don't need to retain the IAM permissions you set previously, then remove the permissions.
REST v1beta4
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import the data from the file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_bak_file: The path to the BAK file
- database_name: The name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_bak_file", "database": "database_name" } }
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for the request, see the instances:import page.
- If you don't need to retain the IAM permissions you set previously, then remove the permissions.
If you get an error such as ERROR_RDBMS, then ensure that the BAK file exists in the bucket and that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
Import differential database backups
Before you import a differential database backup, you need a full backup import, and your database must be in the RESTORING state after the full backup import.
Cloud SQL doesn't support importing differential database backups on instances that have point-in-time recovery enabled. This is because importing a database backup with --no-recovery is a prerequisite for importing differential database backups. Additionally, you can't enable point-in-time recovery on an instance if the database is in the RESTORING state.
In the case of import failure, do one of the following to enable point-in-time recovery:
- Bring the database that's in the RESTORING state online by using the --recovery-only flag.
- Remove the database.
To import data to a Cloud SQL instance using a differential database backup, perform the following steps:
gcloud
- Create a Cloud Storage bucket for the import.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't mandatory, but we strongly recommend that you perform it so that you don't open up access to any other data.
- Make sure that you've configured the required roles and permissions.
- Upload the data from the BAK file to the bucket.
- Describe the instance that you are importing to:
gcloud sql instances describe INSTANCE_NAME
- Copy the serviceAccountEmailAddress field.
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import a full backup with --no-recovery.
  gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BASE_FILENAME \
    --database=DATABASE_NAME --bak-type=FULL --no-recovery
- Import a differential database backup.
  gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BACKUP_FILENAME \
    --database=DATABASE_NAME --bak-type=DIFF --no-recovery
- Use the --recovery-only flag to bring the imported database online. This step is optional, and you need to perform it only if your database is in the RESTORING state.
  gcloud sql import bak INSTANCE_NAME \
    --database=DATABASE_NAME --recovery-only
- If you don't need to retain the IAM permissions you set previously, then remove them using gcloud storage buckets remove-iam-policy-binding.
REST v1
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import a full backup with noRecovery.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID.
- INSTANCE_ID: the instance ID.
- BUCKET_NAME: the Cloud Storage bucket name.
- PATH_TO_BAK_FILE: the path to the BAK file.
- DATABASE_NAME: the name of a database inside the Cloud SQL instance.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "noRecovery": true, "bakType": "FULL", } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Import a differential database backup.
Before using any of the request data, make the following replacements:
- project-id: the project ID
- instance-id: the instance ID
- bucket_name: the Cloud Storage bucket name
- path_to_bak_file: the path to the BAK file
- database_name: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_bak_file", "database": "database_name" "bakImportOptions": { "bakType": "DIFF", "noRecovery": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Use recoveryOnly to bring the imported database online. This step is optional, and you need to perform it only if your database is in the RESTORING state.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_BAK_FILE: the path to the BAK file
- DATABASE_NAME: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "recoveryOnly": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM permissions you set previously, then remove the permissions.
REST v1beta4
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import a full backup with noRecovery.
Before using any of the request data, make the following replacements:
- PROJECT-ID: the project ID.
- INSTANCE_ID: the instance ID.
- BUCKET_NAME: the Cloud Storage bucket name.
- PATH_TO_BAK_FILE: the path to the BAK file.
- DATABASE_NAME: the name of a database inside the Cloud SQL instance.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "noRecovery": true, "bakType": "FULL", } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Import a differential database backup.
Before using any of the request data, make the following replacements:
- project-id: the project ID
- instance-id: the instance ID
- bucket_name: the Cloud Storage bucket name
- path_to_bak_file: the path to the BAK file
- database_name: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_bak_file", "database": "database_name" "bakImportOptions": { "bakType": "DIFF", "noRecovery": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Use recoveryOnly to bring the imported database online. This step is optional, and you need to perform it only if your database is in the RESTORING state.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_BAK_FILE: the path to the BAK file
- DATABASE_NAME: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "recoveryOnly": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM permissions you set previously, remove the permissions.
If you get an error such as ERROR_RDBMS, then ensure that the BAK file exists in the bucket and that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
Import transaction log backups
A transaction log is a record of your database's transactions and the modifications made by each transaction. You can use it to re-establish database consistency in the event of a system failure.
To import data to a Cloud SQL instance using a transaction log backup, perform the following steps:
gcloud
- Optional: Create a Cloud Storage bucket for the import.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
- Upload the backup files to the bucket.
- Describe the instance that you are importing to:
gcloud sql instances describe INSTANCE_NAME
- Copy the serviceAccountEmailAddress field.
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import a full backup using the --no-recovery parameter. Ensure that your database is in the RESTORING state after the full backup import.
  gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/BACKUP_FILENAME \
    --database=DATABASE_NAME --bak-type=FULL --no-recovery
- Optional: Import a differential backup.
- Import a transaction log backup.
  gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/BACKUP_FILENAME \
    --database=DATABASE_NAME --bak-type=TLOG \
    --stop-at=STOP_AT_TIMESTAMP --stop-at-mark=STOP_AT_MARK_NAME \
    --no-recovery
  Replace the following:
  - INSTANCE_NAME: the instance name.
  - BUCKET_NAME: the Cloud Storage bucket name.
  - BACKUP_FILENAME: the name of your backup file.
  - DATABASE_NAME: the name of a database inside the Cloud SQL instance.
  - STOP_AT_TIMESTAMP: the timestamp at which the transaction log import must stop. This field is optional, and the value must use the RFC 3339 format.
  - STOP_AT_MARK_NAME: the marked transaction at which the transaction log import must stop. This field is optional and can take any string as its value. If the value is in the format lsn:log-sequence-number, then the transaction log import stops at the given log sequence number.
  Repeat this step until all transaction log backups are imported.
- Optional: Use the --recovery-only flag to bring the imported database online. Perform this step only if your database is in the RESTORING state.
  gcloud sql import bak INSTANCE_NAME \
    --database=DATABASE_NAME --recovery-only
- If you don't need to retain the IAM permissions you set previously, then remove them using gcloud storage buckets remove-iam-policy-binding.
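For example, a sketch of the complete sequence with illustrative instance, bucket, database, and file names, stopping the log replay at a chosen point in time before bringing the database online:

# 1. Restore the full backup; the database stays in the RESTORING state.
gcloud sql import bak my-instance gs://example-bucket/full.bak \
  --database=my-database --bak-type=FULL --no-recovery

# 2. Replay a transaction log backup up to a chosen timestamp.
gcloud sql import bak my-instance gs://example-bucket/log1.trn \
  --database=my-database --bak-type=TLOG \
  --stop-at=2024-01-15T13:00:00.000Z --no-recovery

# 3. Bring the database online after all logs are applied.
gcloud sql import bak my-instance \
  --database=my-database --recovery-only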
REST v1
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
For the complete list of parameters for the request, see the instances:import page.
- Import a full backup with noRecovery. Ensure that your database is in the RESTORING state after the full backup import.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID.
- INSTANCE_ID: the instance ID.
- BUCKET_NAME: the Cloud Storage bucket name.
- PATH_TO_BAK_FILE: the path to the BAK file.
- DATABASE_NAME: the name of a database inside the Cloud SQL instance.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "noRecovery": true, "bakType": "FULL", } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Optional: Import a differential backup.
- Import a transaction log backup.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID.
- INSTANCE_ID: the instance ID.
- BUCKET_NAME: the Cloud Storage bucket name.
- PATH_TO_TLOG_FILE: the path to the transaction log file.
- DATABASE_NAME: the name of a database inside the Cloud SQL instance.
- STOP_AT_TIMESTAMP: the timestamp at which the transaction log import must stop. This is an optional field and the value must use the RFC 3339 format.
- STOP_AT_MARK_NAME: the marked transaction at which the transaction log import must stop. This field is optional and can take any string as its value. If the value is in the format lsn:log-sequence-number, then the transaction log import stops at the given log sequence number.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_TLOG_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "bakType": "TLOG", "stopAt": STOP_AT_TIMESTAMP, "stopAtMark": STOP_AT_MARK_NAME, "noRecovery": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
Repeat this step until all transaction log backups are imported.
- Optional: Use recoveryOnly to bring the imported database online. Perform this step only if your database is in the RESTORING state.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_BAK_FILE: the path to the BAK file
- DATABASE_NAME: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "recoveryOnly": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM permissions you set previously, then remove the permissions.
REST v1beta4
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import a full backup with noRecovery. Ensure that your database is in the RESTORING state after the full backup import.
Before using any of the request data, make the following replacements:
- PROJECT-ID: the project ID.
- INSTANCE_ID: the instance ID.
- BUCKET_NAME: the Cloud Storage bucket name.
- PATH_TO_BAK_FILE: the path to the BAK file.
- DATABASE_NAME: the name of a database inside the Cloud SQL instance.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "noRecovery": true, "bakType": "FULL", } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- Optional: Import a differential backup.
- Import a transaction log backup. Here, stopAt and stopAtMark are optional fields.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID.
- INSTANCE_ID: the instance ID.
- BUCKET_NAME: the Cloud Storage bucket name.
- PATH_TO_BAK_FILE: the path to the BAK file.
- DATABASE_NAME: the name of a database inside the Cloud SQL instance.
- STOP_AT_TIMESTAMP: the timestamp at which the transaction log import must stop. This is an optional field and the value must use the RFC 3339 format.
- STOP_AT_MARK_NAME: the marked transaction at which the transaction log import must stop. This field is optional and can take any string as its value. If the value is in the format lsn:log-sequence-number, then the transaction log import stops at the given log sequence number.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "bakType": "TLOG", "stopAt": STOP_AT_TIMESTAMP, "stopAtMark":STOP_AT_MARK_NAME, "noRecovery": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
Repeat this step until all transaction log backups are imported.
- Optional: Use recoveryOnly to bring the imported database online. Perform this step only if your database is in the RESTORING state.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_BAK_FILE: the path to the BAK file
- DATABASE_NAME: the name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "recoveryOnly": true, } } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content

You should receive a JSON response that describes the operation.
- If you don't need to retain the IAM permissions you set previously, then remove the permissions.
Use striped import
The advantages of striped import are the following:
- Reductions in the time needed for operations to complete
- Databases larger than 5 TB can be imported
A potential disadvantage of using striped import is that all of the files in the striped set (rather than a single file) must be uploaded to the same folder in your Cloud Storage bucket, before you perform the import.
Planning your operations
In most use cases, striped import enables better performance with no disadvantages. However, if you can't back up to a striped set from a given instance, or if your database is smaller than 5 TB and faster performance isn't critical, you may want to use a non-striped import.
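For example, the same import with and without striping might look like the following sketch (instance, bucket, database, and path names are illustrative); the striped variant points at the folder that holds the stripe set rather than at a single file:

# Non-striped import of a single BAK file.
gcloud sql import bak my-instance gs://example-bucket/my-database.bak \
  --database=my-database

# Striped import reading the whole stripe set from a folder.
gcloud beta sql import bak my-instance gs://example-bucket/stripe-folder \
  --database=my-database --striped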
How to use striped import
gcloud
- Create a Cloud Storage bucket for the import.
  gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
  This step isn't required, but strongly recommended, so you don't open up access to any other data.
- Make sure you have configured the required IAM roles and permissions.
- Create a new folder in your bucket.
- To import the database, upload the files of the striped set (of the database) to the new folder. Ensure that all the files are uploaded to the folder, and that the folder contains no extra files.
- Describe the instance that you are importing to:
gcloud sql instances describe INSTANCE_NAME
- Copy the serviceAccountEmailAddress field.
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import the data from the folder. The difference from a non-striped import is the following: the URI links to the name of the folder to which the stripe set was uploaded, rather than to a single file, and you specify the --striped parameter:
  gcloud beta sql import bak INSTANCE_NAME gs://BUCKET_NAME/FOLDER_NAME \
    --database=DATABASE_NAME --striped
- If you don't need to retain the IAM permissions you set previously, then remove them using gcloud storage buckets remove-iam-policy-binding.
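The stripe-set upload in the earlier step can be done with gcloud storage cp; for example, assuming an illustrative local directory that contains only the stripe files:

# Copy every stripe file into the folder that the import reads.
gcloud storage cp stripe-set/* gs://example-import-bucket/stripe-folder/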
REST v1
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import the data from the file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_folder: The path to the folder (in the Cloud Storage bucket) where the stripe set is located
- database_name: The name of a database to create in your Cloud SQL instance
- true | false: set to true to use striped import
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_folder", "database": "database_name", "bakImportOptions": { "striped": true | false } } }
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for the request, see the instances:import page.
- If you don't need to retain the IAM permissions you set previously, then remove the permissions.
REST v1beta4
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
- Import the data from the file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_folder: The path to the folder (in the Cloud Storage bucket) where the stripe set is located
- database_name: The name of a database to create in your Cloud SQL instance
- true | false: set to true to use striped import
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_folder", "database": "database_name", "bakImportOptions": { "striped": true | false } } }
To send your request, use a tool such as curl or PowerShell.
You should receive a JSON response that describes the operation.
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for the request, see the instances:import page.
- If you don't need to retain the IAM permissions you set previously, then remove the permissions.
If you get an error such as ERROR_RDBMS, then ensure that the table exists. If the table exists, then confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
What's next
- Learn how to check the status of import and export operations.
- Learn more about best practices for importing and exporting data.
- Known issues for imports and exports.