Configure access to a source: Microsoft Azure Storage

You can configure access to source data in Microsoft Azure Storage using shared access signatures (SAS).

Supported regions

Storage Transfer Service can transfer data from the following Microsoft Azure Storage regions:
  • Americas: East US, East US 2, West US, West US 2, West US 3, Central US, North Central US, South Central US, West Central US, Canada Central, Canada East, Brazil South
  • Asia-Pacific: Australia Central, Australia East, Australia Southeast, Central India, South India, West India, Southeast Asia, East Asia, Japan East, Japan West, Korea South, Korea Central
  • Europe, Middle East, Africa (EMEA): France Central, Germany West Central, Norway East, Sweden Central, Switzerland North, North Europe, West Europe, UK South, UK West, Qatar Central, UAE North, South Africa North

Configure access

Follow these steps to configure access to a Microsoft Azure Storage container:

  1. Create a Microsoft Azure Storage user, or use an existing one, with access to the storage account for your Microsoft Azure Storage Blob container.

  2. Create a SAS token at the container level. For instructions, see Grant limited access to Azure Storage resources using shared access signatures.

    1. The Allowed services must include Blob.

    2. For Allowed resource types select both Container and Object.

    3. The Allowed permissions must include Read and List. If the transfer is configured to delete objects from the source, you must also include the Delete permission.

    4. The default expiration time for SAS tokens is 8 hours. Set an expiration time that gives your transfer enough time to complete.

    5. Do not specify any IP addresses in the Allowed IP addresses field. Storage Transfer Service uses various IP addresses and doesn't support IP address restriction.

    6. The Allowed protocols should be HTTPS only.

  3. Once the token is created, note the SAS token value that is returned. You need this value when configuring your transfer with Storage Transfer Service.
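As a sanity check, you can verify the requirements above programmatically before storing the token. The helper below is a hypothetical sketch, not part of Storage Transfer Service; it inspects the standard Azure SAS query parameters (`sp` for permissions, `spr` for allowed protocols, `sip` for allowed IPs):

```python
# Hypothetical helper: sanity-check a container-level SAS token before
# using it with Storage Transfer Service. The field names (sp, spr, sip)
# are the standard Azure SAS query parameters.
from urllib.parse import parse_qs

def check_sas_token(sas_token: str, delete_objects: bool = False) -> list:
    """Return a list of problems found in the SAS token (empty if none)."""
    params = parse_qs(sas_token.lstrip("?"))
    problems = []

    # 'sp' holds the permission characters: r = Read, l = List, d = Delete.
    permissions = params.get("sp", [""])[0]
    for needed in "rl" + ("d" if delete_objects else ""):
        if needed not in permissions:
            problems.append(f"missing permission: {needed!r}")

    # 'spr' holds the allowed protocols; it should be HTTPS only.
    if params.get("spr", [""])[0] != "https":
        problems.append("allowed protocols should be 'https' only")

    # 'sip' restricts source IPs; Storage Transfer Service doesn't support that.
    if "sip" in params:
        problems.append("remove the allowed-IP restriction (sip)")

    return problems

print(check_sas_token("sv=2022-11-02&sp=rl&spr=https&sig=abc"))
# → []
```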

Save your Microsoft credentials in Secret Manager

Secret Manager is a secure service that stores and manages sensitive data such as passwords. It uses strong encryption, role-based access control, and audit logging to protect your secrets.

Storage Transfer Service can use Secret Manager to protect your Azure credentials. Storage Transfer Service supports both shared access signature (SAS) tokens and Azure Shared Keys in Secret Manager.

When you specify a Shared Key, Storage Transfer Service uses that key to generate a service SAS that is restricted in scope to the Azure container specified in the transfer job.
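For illustration only, the core of that derivation looks like the following. This is a simplified sketch: a real service SAS string-to-sign has many more newline-delimited fields, as defined in Azure's documentation. The point is only that the signature is an HMAC-SHA256 over a canonical string, keyed with the base64-decoded Shared Key:

```python
# Simplified sketch of how a service SAS signature is derived from an Azure
# Shared Key. The real string-to-sign has many more fields (see Azure's
# service SAS reference); this only illustrates the signing mechanism.
import base64
import hashlib
import hmac

def sign_service_sas(account_key_b64: str, string_to_sign: str) -> str:
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Toy example with a made-up key and a minimal string-to-sign.
demo_key = base64.b64encode(b"not-a-real-account-key").decode()
sig = sign_service_sas(demo_key, "rl\n2024-12-31T00:00:00Z\n/blob/myaccount/mycontainer")
```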

Enable the API

Enable the Secret Manager API.


Configure additional permissions

User permissions

The user creating the secret requires the following role:

  • Secret Manager Admin (roles/secretmanager.admin)

Learn how to grant a role.

Service agent permissions

The Storage Transfer Service service agent requires the following IAM role:

  • Secret Manager Secret Accessor (roles/secretmanager.secretAccessor)

To grant the role to your service agent:

Cloud console

  1. Follow the instructions to retrieve your service agent email.

  2. Go to the IAM page in the Google Cloud console.

    Go to IAM

  3. Click Grant access.

  4. In the New principals text box, enter the service agent email.

  5. In the Select a role drop-down, search for and select Secret Manager Secret Accessor.

  6. Click Save.


gcloud

Use the gcloud projects add-iam-policy-binding command to add the IAM role to your service agent.

  1. Follow the instructions to retrieve your service agent email.

  2. From the command line, enter the following command:

    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member='serviceAccount:SERVICE_AGENT_EMAIL' \
      --role='roles/secretmanager.secretAccessor'
Create a secret

Create a secret with Secret Manager:

Cloud console

  1. Go to the Secret Manager page in the Google Cloud console.

    Go to Secret Manager

  2. Click Create secret.

  3. Enter a name.

  4. In the Secret value text box, enter your credentials in one of the following formats.

    For a SAS token:

      {
        "sas_token" : "SAS_TOKEN_VALUE"
      }

    For a shared key:

      {
        "access_key" : "ACCESS_KEY"
      }

  5. Click Create secret.

  6. Once the secret has been created, note the secret's full resource name:

    1. Select the Overview tab.

    2. Copy the value of Resource ID. It uses the following format:

      projects/PROJECT_NUMBER/secrets/SECRET_NAME

gcloud

To create a new secret using the gcloud command-line tool, pass the JSON-formatted credentials to the gcloud secrets create command:

For a SAS token:

printf '{
  "sas_token" : "SAS_TOKEN_VALUE"
}' | gcloud secrets create SECRET_NAME --data-file=-

For a shared key:

printf '{
  "access_key" : "ACCESS_KEY"
}' | gcloud secrets create SECRET_NAME --data-file=-
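If you prefer to build the payload programmatically before piping it to gcloud secrets create, a minimal sketch (the helper name is illustrative) that produces the same JSON:

```python
# Build the Secret Manager payload for either credential type. The key names
# ("sas_token", "access_key") match the formats shown above; the function
# name is a hypothetical helper, not part of any Google Cloud library.
import json

def make_azure_secret_payload(sas_token=None, access_key=None) -> str:
    if (sas_token is None) == (access_key is None):
        raise ValueError("provide exactly one of sas_token or access_key")
    if sas_token is not None:
        return json.dumps({"sas_token": sas_token})
    return json.dumps({"access_key": access_key})

print(make_azure_secret_payload(sas_token="sv=...&sig=..."))
# → {"sas_token": "sv=...&sig=..."}
```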

Retrieve the secret's full resource name:

gcloud secrets describe SECRET_NAME

Note the value of name in the response. It uses the following format:

projects/PROJECT_NUMBER/secrets/SECRET_NAME
For more details about creating and managing secrets, refer to the Secret Manager documentation.

Pass your secret to the job creation command

Using Secret Manager with Storage Transfer Service requires using the REST API to create a transfer job.

Pass the Secret Manager resource name as the value of the transferSpec.azureBlobStorageDataSource.credentialsSecret field:


{
  "description": "Transfer with Secret Manager",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "transferSpec": {
    "azureBlobStorageDataSource": {
      "storageAccount": "AZURE_SOURCE_NAME",
      "container": "AZURE_CONTAINER",
      "credentialsSecret": "SECRET_RESOURCE_ID"
    },
    "gcsDataSink": {
      "bucketName": "CLOUD_STORAGE_BUCKET_NAME"
    }
  }
}
IP restrictions

If you restrict access to your Azure resources using an Azure Storage firewall, you must add the IP ranges used by Storage Transfer Service workers to your list of allowed IPs.

Because these IP ranges can change, we publish the current values as a JSON file at a permanent address:

When a new range is added to the file, we'll wait at least 7 days before using that range for requests from Storage Transfer Service.

We recommend that you pull data from this document at least weekly to keep your security configuration up to date. For a sample Python script that fetches IP ranges from a JSON file, see this article from the Virtual Private Cloud documentation.
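As an illustration of such a script, the sketch below extracts the CIDR ranges from the file. It assumes the file follows the same shape as Google's other published IP-range files, a "prefixes" list of objects keyed by "ipv4Prefix" or "ipv6Prefix"; that shape is an assumption here, so verify it against the actual file:

```python
# Sketch: pull the allowed CIDR ranges out of the published JSON file.
# Assumed shape (matches Google's other ip-range files):
#   {"creationTime": ..., "prefixes": [{"ipv4Prefix": "..."}, {"ipv6Prefix": "..."}]}
import json

def extract_prefixes(ipranges_json: str) -> list:
    doc = json.loads(ipranges_json)
    out = []
    for entry in doc.get("prefixes", []):
        for key in ("ipv4Prefix", "ipv6Prefix"):
            if key in entry:
                out.append(entry[key])
    return out

# Sample document with placeholder ranges, standing in for the real file.
sample = '{"prefixes": [{"ipv4Prefix": "34.64.0.0/10"}, {"ipv6Prefix": "2600:1900::/28"}]}'
print(extract_prefixes(sample))
# → ['34.64.0.0/10', '2600:1900::/28']
```

In a weekly job you would fetch the published file over HTTPS instead of using the sample string, then feed the resulting list to your Azure Storage firewall configuration.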

To add these ranges as allowed IPs, follow the instructions in the Microsoft Azure article, Configure Azure Storage firewalls and virtual networks.