Last updated (UTC): 2025-08-18.

# Interoperability with other storage providers

Cloud Storage is compatible with some other object storage platforms, so you can seamlessly integrate data from different sources. This page describes the Cloud Storage tools you can use to manage your cross-platform object data.

XML API
-------

The Cloud Storage [XML API](/storage/docs/xml-api/overview) is interoperable with some tools and libraries that work with services such as Amazon Simple Storage Service (Amazon S3). To use these tools and libraries with Cloud Storage, change the [request endpoint](/storage/docs/request-endpoints) that the tool or library uses to the Cloud Storage URI `https://storage.googleapis.com`, and then configure the tool or library to use your Cloud Storage [HMAC keys](/storage/docs/authentication/hmackeys). See [Simple migration from Amazon Simple Storage Service (Amazon S3)](/storage/docs/aws-simple-migration) for detailed instructions on getting started.

### Authenticate with the V4 signing process

The V4 signing process lets you make signed header requests to the Cloud Storage XML API. After creating a signature using the V4 signing process, you include it in the `Authorization` header of a subsequent request, which authenticates that request. You can create a signature using an RSA key or using your existing Amazon S3 workflow and HMAC credentials.
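As a minimal sketch of the HMAC branch, the signing key in the Amazon S3-style V4 workflow is derived by chaining HMAC-SHA256 operations over the date, region, and service, and the final signature is a hex digest over the string to sign. The function names and all values below are illustrative placeholders, not a definitive implementation:

```python
# A minimal sketch of the HMAC-based V4 signing-key derivation, assuming
# the Amazon S3-style workflow. All credential values are placeholders.
import hashlib
import hmac


def _hmac_sha256(key: bytes, msg: str) -> bytes:
    """One step of the key-derivation chain."""
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()


def v4_signing_key(hmac_secret: str, date: str, region: str,
                   service: str = "s3") -> bytes:
    """Derive the V4 signing key (date is YYYYMMDD, e.g. "20250818")."""
    k_date = _hmac_sha256(("AWS4" + hmac_secret).encode("utf-8"), date)
    k_region = _hmac_sha256(k_date, region)
    k_service = _hmac_sha256(k_region, service)
    return _hmac_sha256(k_service, "aws4_request")


def v4_signature(signing_key: bytes, string_to_sign: str) -> str:
    """Hex signature that is placed in the Authorization header."""
    return hmac.new(signing_key, string_to_sign.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```

The string to sign is itself built from a canonical request; see the Signatures documentation linked below for how to construct it and how to assemble the full `Authorization` header.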
For more details about authenticating requests, see [Signatures](/storage/docs/authentication/signatures).

Google Cloud CLI
----------------

The gcloud CLI is the preferred command-line tool for accessing Cloud Storage. It also lets you access and work with other cloud storage services that use HMAC authentication, such as Amazon S3. After you add your Amazon S3 credentials to `~/.aws/credentials`, you can start using `gcloud storage` commands to manage objects in your Amazon S3 buckets. For example:

- The following command lists the objects in the Amazon S3 bucket `my-aws-bucket`:

  ```bash
  gcloud storage ls s3://my-aws-bucket
  ```

- The following command synchronizes data between an Amazon S3 bucket and a Cloud Storage bucket:

  ```bash
  gcloud storage rsync s3://my-aws-bucket gs://example-bucket --delete-unmatched-destination-objects --recursive
  ```

For more information, including details on how to optimize this synchronization, see the [`gcloud storage rsync` documentation](/sdk/gcloud/reference/storage/rsync).

#### Invalid certificate errors from Amazon S3 bucket names containing dots

If you attempt to use the gcloud CLI to access an Amazon S3 bucket whose name contains a dot, you might receive an `invalid certificate` error. This occurs because Amazon S3 does not support virtual-hosted-style URLs for bucket names that contain dots. When working with Amazon S3 resources, you can configure the gcloud CLI to use path-style bucket URLs instead by setting the `storage/s3_endpoint_url` property to the following:

```bash
storage/s3_endpoint_url https://s3.REGION_CODE.amazonaws.com
```

Where <var translate="no">REGION_CODE</var> is the region containing the bucket you are requesting.
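As a sketch of applying this setting globally with the `gcloud config set` command (the region code here is a placeholder; substitute your bucket's actual region):

```bash
# Point the gcloud CLI at a path-style Amazon S3 endpoint.
# Replace us-east-2 with the region code of your bucket.
gcloud config set storage/s3_endpoint_url https://s3.us-east-2.amazonaws.com
```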
For example, `us-east-2`.

You can modify the `storage/s3_endpoint_url` property in one of the following ways:

- Using the [`gcloud config set` command](/sdk/gcloud/reference/config/set), which applies the property to all gcloud CLI commands.

- Creating a [named configuration](/sdk/gcloud/reference/topic/configurations) and applying it on a per-command basis using the [`--configuration` flag](/sdk/gcloud/reference#--configuration).

Importing data with Storage Transfer Service
--------------------------------------------

[Storage Transfer Service](/storage-transfer-service) lets you import large amounts of online data into Cloud Storage from Amazon S3 buckets, Microsoft Azure Blob Storage containers, and general HTTP/HTTPS locations. You can use Storage Transfer Service to schedule recurring transfers, delete source objects, and select which objects are transferred.

Additionally, if you use Amazon S3 Event Notifications, you can set up Storage Transfer Service [event-driven transfers](/storage-transfer/docs/event-driven-transfers) to listen for those notifications and automatically keep a Cloud Storage bucket in sync with an Amazon S3 source.

What's next
-----------

- Quickly complete a [simple migration from Amazon S3 to Cloud Storage](/storage/docs/aws-simple-migration).
- [Create a signature](/storage/docs/authentication/signatures) for authenticating requests.

*Amazon Simple Storage Service™ and Amazon S3™ are trademarks of Amazon.com, Inc. or its affiliates in the United States and/or other countries.*