The following commands require the FULLY_QUALIFIED_BUCKET_NAME. Use the GET or DESCRIBE command from the View bucket configuration section to get the fully qualified bucket name.
The following command uploads all text files from the local directory to a bucket:

    gdcloud storage cp *.txt s3://FULLY_QUALIFIED_BUCKET_NAME
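You can also copy specific files, or an entire directory tree with the --recursive option. In these examples, abc1.txt, abc2.txt, and dir are placeholder names:

    gdcloud storage cp abc1.txt abc2.txt s3://FULLY_QUALIFIED_BUCKET_NAME
    gdcloud storage cp dir s3://FULLY_QUALIFIED_BUCKET_NAME --recursive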
Multipart uploads handle large objects and are used automatically when a file to upload is larger than 15 MB. The file is split into parts of 15 MB each, with the final part holding the remainder. Each part is uploaded separately, and the object is reassembled at the destination when the transfer completes.

If the upload of one part fails, you can restart the upload without affecting the parts that have already been uploaded.
There are two options related to multipart uploads:
--disable-multipart: disables multipart uploads for all files.
--multipart-chunk-size-mb=SIZE: sets the size of each chunk of a multipart upload.

Files larger than SIZE are uploaded as multithreaded multipart uploads; smaller files are uploaded using the traditional method. SIZE is in megabytes. The default chunk size is 15 MB, the minimum allowed is 5 MB, and the maximum is 5 GB.
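For example, the following sketch uploads a hypothetical large file backup.tar with 100 MB parts, and then uploads it again with multipart handling disabled; backup.tar is only an example name:

    # Upload with a custom 100 MB chunk size
    gdcloud storage cp backup.tar s3://FULLY_QUALIFIED_BUCKET_NAME --multipart-chunk-size-mb=100

    # Upload as a single object, bypassing multipart uploads
    gdcloud storage cp backup.tar s3://FULLY_QUALIFIED_BUCKET_NAME --disable-multipart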
Download objects from storage buckets
Console
In the navigation menu, click Object Storage.
Click the name of the bucket containing the objects.
Select the checkbox next to the name of the object to download.
Click Download.
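You can also download objects with the CLI, using the same cp command with the source and destination reversed. A minimal sketch, where OBJECT and LOCAL_FILE_TO_SAVE are placeholders:

    gdcloud storage cp s3://FULLY_QUALIFIED_BUCKET_NAME/OBJECT LOCAL_FILE_TO_SAVE

    # List all stored versions of an object before picking one to download
    gdcloud storage ls s3://FULLY_QUALIFIED_BUCKET_NAME/OBJECT --all-versions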
Use custom AEADKey

For greater customization, you can create your own AEADKey and use it directly when encrypting objects in your bucket. This gives you full control over the encryption key, bypassing the default. Follow Create a key to create a new AEADKey, and make sure it is in the same Namespace as the bucket you intend to use. Then, on every request, set the headers x-amz-server-side-encryption: SSE-KMS and x-amz-server-side-encryption-aws-kms-key-id: NAMESPACE_NAME/AEADKey_NAME.
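As an illustration only, the following curl sketch shows a write request carrying both headers; OBJECT_STORAGE_ENDPOINT, my-object, NAMESPACE_NAME, and AEADKey_NAME are placeholders, and the request signing or credentials that your S3-compatible endpoint requires are omitted:

    # Sketch only: authentication/signing depends on your S3-compatible tooling
    curl -X PUT "https://OBJECT_STORAGE_ENDPOINT/FULLY_QUALIFIED_BUCKET_NAME/my-object" \
      --data-binary @my-object \
      -H "x-amz-server-side-encryption: SSE-KMS" \
      -H "x-amz-server-side-encryption-aws-kms-key-id: NAMESPACE_NAME/AEADKey_NAME"

The essential point is that both headers accompany the write request, whichever client you use.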
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-29 UTC."],[[["\u003cp\u003eThis guide outlines the process of uploading and downloading objects within Google Distributed Cloud (GDC) air-gapped storage buckets, using both the console and command-line interface (CLI).\u003c/p\u003e\n"],["\u003cp\u003eBefore interacting with storage buckets, users must have a project namespace and the appropriate bucket permissions, ensuring access to perform upload and download operations.\u003c/p\u003e\n"],["\u003cp\u003eObject naming should adhere to UTF-8 characters and exclude personally identifiable information (PII) to maintain data integrity and privacy.\u003c/p\u003e\n"],["\u003cp\u003eThe CLI allows for advanced operations like uploading multiple files, entire directories, and managing large objects through multipart uploads, which can be customized with chunk sizes and disabling options.\u003c/p\u003e\n"],["\u003cp\u003eUsers can download objects via the console or CLI, with the added ability to retrieve specific versions of files using the CLI's version listing capabilities, as well as using custom AEADKeys for encryption.\u003c/p\u003e\n"]]],[],null,["# Upload and download storage objects\n\nThis page shows you how to upload and download objects to and from Google Distributed Cloud (GDC) air-gapped storage buckets.\n\nBefore you begin\n----------------\n\nA project namespace manages bucket resources in the Management API server. You\nmust have a [project](/distributed-cloud/hosted/docs/latest/gdch/platform/pa-user/project-management) to work with buckets and objects.\n\nYou must also have the appropriate bucket permissions to perform the following\noperation. See [Grant bucket access](/distributed-cloud/hosted/docs/latest/gdch/platform/pa-user/grant-obtain-storage-access#grant_bucket_access).\n\nObject naming guidelines\n------------------------\n\nUse the following guidelines to name objects:\n\n- Use UTF-8 characters when naming objects.\n- Refrain from including any personally identifiable information (PII).\n\nUpload objects to storage buckets\n---------------------------------\n\n### Console\n\n1. In the navigation menu, click **Object Storage**.\n2. Click the name of the bucket you want to upload the object to.\n3. Optional: If you want to create a folder to store your object, click **Create folder** \\\u003e enter a folder name \\\u003e click **Create**.\n4. Click **Upload file** directly, or navigate into the folder you just created and then click **Upload file**.\n5. Select the desired file and click **Open**.\n6. 
Wait for the confirmation message that the upload was successful.\n\n### CLI\n\nTo upload an object, run the following commands: \n\n gdcloud storage cp \u003cvar translate=\"no\"\u003eLOCAL_PATH\u003c/var\u003e s3://\u003cvar translate=\"no\"\u003eREMOTE_PATH\u003c/var\u003e\n gdcloud storage cp s3://\u003cvar translate=\"no\"\u003eREMOTE_SOURCE_PATH\u003c/var\u003e s3://\u003cvar translate=\"no\"\u003eREMOTE_MOVE_DESTINATION_PATH\u003c/var\u003e\n gdcloud storage mv s3://\u003cvar translate=\"no\"\u003eREMOTE_SOURCE_PATH\u003c/var\u003e s3://\u003cvar translate=\"no\"\u003eREMOTE_MOVE_DESTINATION_PATH\u003c/var\u003e\n\nThe following commands require the \u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e. Use the `GET` or `DESCRIBE` command from the [View bucket configuration](/distributed-cloud/hosted/docs/latest/gdch/platform/pa-user/list-view-storage-buckets#view_bucket_configurations) section to get the fully qualified bucket name.\n\nThe following command uploads all text files from the local directory to a bucket: \n\n gdcloud storage cp *.txt s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e\n\nThe following command uploads multiple files from the local directory to a bucket: \n\n gdcloud storage cp abc1.txt abc2.txt s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e\n\nTo upload a folder to a bucket, use the --recursive option to copy an entire directory tree. The following command uploads the directory tree dir: \n\n gdcloud storage cp dir s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e --recursive\n\nPerform multipart uploads for large objects, or use multipart\nuploads automatically when you have a file to upload that is larger than 15 MB.\nIn that case, the file splits into multiple parts, with each part being 15 MB in size.\nThe last part is smaller. Each part uploads separately and reconstructs at\nthe destination when the transfer completes.\n\nIf an upload of one part fails, you can restart the upload without affecting any\nof the other parts already uploaded.\n\nThere are two options related to multipart uploads:\n\n- `--disable-multipart`: disables multipart uploads for all files.\n- `--multipart-chunk-size-mb=`\u003cvar translate=\"no\"\u003eSIZE\u003c/var\u003e: sets the size of each chunk of a multipart upload.\n\nFiles bigger than \u003cvar translate=\"no\"\u003eSIZE\u003c/var\u003e automatically upload as\nmultithreaded-multipart. Smaller files upload using the traditional\nmethod. \u003cvar translate=\"no\"\u003eSIZE\u003c/var\u003e is in megabytes. The default chunk size is\n15 MB. The minimum allowed chunk size is 5 MB, and the maximum is 5 GB.\n\nDownload objects from storage buckets\n-------------------------------------\n\n### Console\n\n1. In the navigation menu, click **Object Storage**.\n2. Click the name of the bucket containing the objects.\n3. Select the checkbox next to the name of the object to download.\n4. 
Click **Download**.\n\n### CLI\n\nTo get objects from the bucket: \n\n gdcloud storage cp s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e/\u003cvar translate=\"no\"\u003eOBJECT\u003c/var\u003e \u003cvar translate=\"no\"\u003eLOCAL_FILE_TO_SAVE\u003c/var\u003e\n\nTo download all text files from a bucket to your current directory: \n\n gdcloud storage cp s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e/*.txt .\n\nTo download the text file `abc.txt` from a bucket to your current directory: \n\n gdcloud storage cp s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e/abc.txt .\n\nTo download an older version of the file, list all versions of the file first: \n\n gdcloud storage ls s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e/abc.txt --all-versions\n\nExample output: \n\n s3://my-bucket/abc.txt#OEQxNTk4MUEtMzEzRS0xMUVFLTk2N0UtQkM4MjAwQkJENjND\n s3://my-bucket/abc.txt#ODgzNEYzQ0MtMzEzRS0xMUVFLTk2NEItMjI1MTAwQkJENjND\n s3://my-bucket/abc.txt#ODNCNDEzNzgtMzEzRS0xMUVFLTlDOUMtQzRDOTAwQjg3RTg3\n\nThen, download a specific version of the text file `abc.txt` from the bucket to your current directory: \n\n gdcloud storage cp s3://\u003cvar translate=\"no\"\u003eFULLY_QUALIFIED_BUCKET_NAME\u003c/var\u003e/abc.txt#OEQxNTk4MUEtMzEzRS0xMUVFLTk2N0UtQkM4MjAwQkJENjND .\n\nUse custom AEADKey\n------------------\n\nFor greater customization, you can create your own AEADKey and use it directly when encrypting objects in your bucket. This gives you full control over the encryption key, bypassing the default. Follow [Create a key](/distributed-cloud/hosted/docs/latest/gdch/application/ao-user/kms/create-delete-keys#create) to create a new AEADKey and make sure it's in the same `Namespace` as the bucket you intend to use. Then, whenever sending the request, make sure the `HEADER` is configured with `x-amz-server-side-encryption: SSE-KMS` and `x-amz-server-side-encryption-aws-kms-key-id: `\u003cvar translate=\"no\"\u003eNAMESPACE_NAME\u003c/var\u003e`/`\u003cvar translate=\"no\"\u003eAEADKey_NAME\u003c/var\u003e"]]