The browser-based Google Cloud console lets you manage your Cloud Storage
resources through a graphical interface. Use the Google Cloud console if you
prefer to manage your data through a user interface in the browser.
This page describes how to access the Google Cloud console and lists tasks
in Cloud Storage that can be performed using the Google Cloud console.
As an alternative to the Google Cloud console, you can also use the
Google Cloud CLI, REST API, or
Cloud Storage client libraries.
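If you prefer to work programmatically, the client libraries cover the same operations as the console. The following is a minimal sketch using the Cloud Storage Python client library; the bucket and file names are placeholders, and it assumes Application Default Credentials are already configured.

```python
# Minimal sketch: list buckets and upload a file with the Cloud Storage
# Python client library (pip install google-cloud-storage).
# Assumes Application Default Credentials are configured.
from google.cloud import storage

client = storage.Client()

# List the buckets in the project associated with your credentials.
for bucket in client.list_buckets():
    print(bucket.name)

# Upload a local file to a bucket (placeholder names).
bucket = client.bucket("example-bucket")
blob = bucket.blob("notes/example.txt")
blob.upload_from_filename("example.txt")
```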
Console features
The Google Cloud console provides the following features:

- Access to all your Google Cloud projects
- Access to Cloud Shell
- A customizable project dashboard, with an overview of Google Cloud resources, billing, and a filterable activity listing
- Access to all Google Cloud APIs, with a dashboard specific to each API, and access to manage your resources
- Links to Google Cloud starting points, news, and documentation

The Google Cloud console can be used to perform a variety of tasks in Cloud Storage, such as the following:

- Creating buckets
- Uploading objects to buckets and downloading objects from buckets
- Filtering and sorting lists of buckets and objects
- Creating and managing folders
- Controlling access to your data and resources by using Identity and Access Management (IAM) and ACLs
- Monitoring buckets and bandwidth usage
- Using Cloud Storage features like Anywhere Cache, soft delete, and Object Versioning
Try it for yourself

If you're new to Google Cloud, create an account to evaluate how
Cloud Storage performs in real-world scenarios. New customers also get $300
in free credits to run, test, and deploy workloads.
Try Cloud Storage free: https://console.cloud.google.com/freetrial
Access to the Google Cloud console

The Google Cloud console requires no setup or installation, and you can
access it directly in a browser. Depending on your use case, you access the
Google Cloud console using different URLs:

A user granted access to a project

Use: https://console.cloud.google.com/

A current project owner can give you access to the entire project, which
applies equally to all buckets and objects defined in the project.

A user granted access to a bucket

Use: https://console.cloud.google.com/storage/browser/BUCKET_NAME
In this use case, a project owner gives you access to an individual bucket
within a larger project. The owner then sends you the bucket name, which you
substitute into the URL. You can work only with objects in the specified
bucket. This is useful for users who don't have access to the full project
but who need to access a bucket. When you access the URL, you authenticate
if you are not already signed in.
A variation of this use case is when a project owner grants All Users
permission to read objects in a bucket. This creates a bucket whose contents
are publicly readable. For more information, see Make data public.
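If you need to grant that access outside the console, the following is a hedged sketch using the Cloud Storage Python client library's IAM methods; the bucket name is a placeholder, and the binding mirrors what the console's public-access flow grants.

```python
# Hedged sketch: grant allUsers read access to the objects in a bucket,
# making its contents publicly readable. The bucket name is a placeholder.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("example-bucket")

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectViewer", "members": ["allUsers"]}
)
bucket.set_iam_policy(policy)
```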
A user granted access to an object

Use: https://console.cloud.google.com/storage/browser/_details/BUCKET_NAME/OBJECT_NAME

In this use case, a project owner gives you access to single objects within
a bucket and sends you the URLs for those objects. When you access the URLs,
you are prompted to authenticate with a user account if you are not already
signed in.
Note that the form of this URL is different from the URL for objects that
are shared publicly. When you share a link publicly, the URL is of the
form: https://storage.googleapis.com/BUCKET_NAME/OBJECT_NAME.
This public URL does not require a recipient to authenticate with a valid
account and can be used for non-authenticated access to an object.
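To illustrate the difference, here is a minimal sketch that downloads a publicly shared object over the public URL form shown above; the bucket and object names are placeholders, and no credentials are needed because the object is readable by all users.

```python
# Minimal sketch: fetch a publicly readable object over its public URL.
# Bucket and object names are placeholders; no authentication is required.
import urllib.request

url = "https://storage.googleapis.com/example-bucket/path/to/object.txt"
with urllib.request.urlopen(url) as response:
    data = response.read()
print(f"Downloaded {len(data)} bytes")
```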
Filtering and sorting lists of buckets and objects
In the Google Cloud console list of buckets for a project, you can filter and
sort the buckets you see by using the Filter field.
To filter buckets, select the Sort and filter option and then
specify the column and value you want to filter by.
To filter buckets by bucket name prefix, select the
Filter by name prefix only option and then specify the bucket name prefix
you want to filter by.
To sort buckets, click Sort (arrow_upward) next to the name of the column you want to sort by.

Note: Projects with more than 1,000 buckets might experience degraded filtering and sorting performance.
In the Google Cloud console list of objects for a bucket, you can filter the
objects you see by using the Filter field.
To filter objects, select the Sort and filter option and then
specify the column and value you want to filter by.
To filter objects by object name prefix, select the
Filter by name prefix only option and then specify the object name prefix
you want to filter by.
To sort objects, click Sort (arrow_upward) next to the name of the column you want to sort by.
Filtering and sorting apply only to objects and folders in the current path
being displayed. For example, if you're viewing the top level of a bucket,
filtering and sorting don't return objects contained in folders.

Note: Buckets with large numbers of objects and folders in the current path might experience degraded performance when sorted or filtered with criteria other than the object name prefix.
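If you need the same prefix-based filtering outside the console, the Cloud Storage Python client library exposes equivalent list operations. The following is a minimal sketch; the bucket name and prefixes are placeholders.

```python
# Minimal sketch: prefix-based listing, the programmatic counterpart of the
# console's "Filter by name prefix only" option. Names are placeholders.
from google.cloud import storage

client = storage.Client()

# Buckets whose names start with a given prefix.
for bucket in client.list_buckets(prefix="logs-"):
    print(bucket.name)

# Objects in a bucket whose names start with a given prefix.
for blob in client.list_blobs("example-bucket", prefix="reports/2024-"):
    print(blob.name)
```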
Showing and hiding columns
To show or hide columns for a list of buckets or objects, click
View column (view_column), then select
the columns you want to see or hide.
Cross-product integrations
The following integrations with other Google Cloud products are available
in the Objects tab of a bucket:
Large scale data transfers to and from the bucket using Storage Transfer Service
Storage Transfer Service is a service that lets you transfer large volumes of
data between your bucket and other storage options, such as
your on-premises file system, other buckets, or other cloud providers.
You can initiate a transfer by clicking the Transfer data drop-down in
the Objects tab, selecting either Transfer data in or
Transfer data out, and following the instructions.
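If you prefer to create a similar transfer programmatically, the following is a hedged sketch that assumes the google-cloud-storage-transfer Python client library; the project ID and bucket names are placeholders, and the job shown here copies between two buckets and runs once.

```python
# Hedged sketch: create a one-time bucket-to-bucket transfer job, assuming the
# google-cloud-storage-transfer client library is installed.
# Project ID and bucket names are placeholders.
from datetime import datetime

from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()
today = datetime.utcnow()
run_date = {"day": today.day, "month": today.month, "year": today.year}

job = client.create_transfer_job(
    storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": "example-project",
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": run_date,
                    "schedule_end_date": run_date,
                },
                "transfer_spec": {
                    "gcs_data_source": {"bucket_name": "example-source-bucket"},
                    "gcs_data_sink": {"bucket_name": "example-destination-bucket"},
                },
            }
        }
    )
)
print("Created transfer job:", job.name)
```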
Scanning the bucket for sensitive data using Sensitive Data Protection
Sensitive Data Protection is a service that lets you identify and
protect sensitive data in your buckets, such as credit card numbers, IP
addresses, and other forms of personally identifiable information (PII).
For a list of the types of data Sensitive Data Protection detects, see the
Infotype detector reference.
You can initiate a Sensitive Data Protection scan for a bucket by
clicking the Other services drop-down in the Objects tab, selecting
Inspect for sensitive data, and following the instructions. For a guide
to performing a Sensitive Data Protection scan on a bucket, see
Inspecting a Cloud Storage location.
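As a programmatic counterpart to that console flow, the following is a hedged sketch that starts an inspection job over a bucket, assuming the google-cloud-dlp Python client library; the project ID, bucket name, and infoTypes are placeholders.

```python
# Hedged sketch: start a Sensitive Data Protection (DLP) inspection job over a
# bucket, assuming the google-cloud-dlp client library is installed.
# Project ID, bucket name, and infoTypes are placeholders.
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/example-project"

inspect_job = {
    "storage_config": {
        "cloud_storage_options": {"file_set": {"url": "gs://example-bucket/**"}}
    },
    "inspect_config": {
        "info_types": [{"name": "CREDIT_CARD_NUMBER"}, {"name": "IP_ADDRESS"}]
    },
}

job = dlp.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print("Started inspection job:", job.name)
```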
Exporting data from the bucket to Pub/Sub
Pub/Sub is a messaging service that lets you notify
subscribers about events that occur for your Google Cloud resources.
Pub/Sub supports receiving event records that are stored as text
files in your bucket and publishing them to a Pub/Sub topic.
You can create an export job for a bucket by clicking the Other services
drop-down in the Objects tab, selecting Export data to Pub/Sub, and
following the instructions. For more information, see
Cloud Storage text to Pub/Sub (Batch) template.
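The console flow launches the Dataflow template linked above. As a simplified illustration of what that pipeline does (not the template itself), the following hedged sketch reads one text object from a bucket and publishes each line to a Pub/Sub topic; all names are placeholders.

```python
# Hedged, simplified stand-in for the "Cloud Storage text to Pub/Sub" flow:
# read one text object and publish each line as a Pub/Sub message.
# Project, bucket, object, and topic names are placeholders.
from google.cloud import pubsub_v1, storage

storage_client = storage.Client()
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "example-topic")

blob = storage_client.bucket("example-bucket").blob("events/records.txt")
for line in blob.download_as_text().splitlines():
    # publish() returns a future; result() blocks until the message is sent.
    publisher.publish(topic_path, data=line.encode("utf-8")).result()
```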
Processing data in the bucket using Cloud Run functions
Cloud Run functions is a service that lets you specify code that should
run when certain events occur within the bucket. For example, you could
create a Java function that runs every time an object in the bucket is
deleted.
You can define a function for a bucket by clicking the Other services
drop-down in the Objects tab, selecting Process data, and following
the instructions.
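As a sketch of what such a function might look like, here is a hedged Python example of an event handler built with the Functions Framework; the event type (for example, object deletion) and the bucket are configured when you deploy the function, and the names used here are placeholders.

```python
# Hedged sketch: a Cloud Run functions event handler for Cloud Storage events,
# written with the Functions Framework (pip install functions-framework).
# The trigger (event type and bucket) is configured at deployment time.
import functions_framework


@functions_framework.cloud_event
def on_object_deleted(cloud_event):
    data = cloud_event.data  # Event payload with object metadata.
    print(f"Object {data['name']} was deleted from bucket {data['bucket']}.")
```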
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[],[],null,["# Google Cloud console\n\nThe browser-based Google Cloud console tool lets you manage your Cloud Storage\nresources through a graphical interface. Use the Google Cloud console to\nmanage your data if you prefer using a user interface through the browser.\n\nThis page describes how to access the Google Cloud console and lists tasks\nin Cloud Storage that can be performed using the Google Cloud console.\nAs an alternative to the Google Cloud console, you can also use the\n[Google Cloud CLI](/sdk/gcloud), [REST API](/storage/docs/json_api), or\n[Cloud Storage client libraries](/storage/docs/reference/libraries).\n\nConsole features\n----------------\n\nThe Google Cloud console provides the following features:\n\n- Access to all your Google Cloud projects\n- Access to the [Cloud Shell](/shell/docs)\n- A customizable project dashboard, with an overview of Google Cloud resources, billing, and a filterable activity listing\n- Access to all Google Cloud APIs, with a dashboard specific to each API, and access to manage your resources\n- Links to Google Cloud starting points, news, and documentation\n\nThe Google Cloud console is used to perform a variety of tasks in\nCloud Storage, such as the following:\n\n- [Creating buckets](/storage/docs/creating-buckets#storage-create-bucket-console)\n- [Uploading objects to buckets](/storage/docs/uploading-objects) and [downloading objects from buckets](/storage/docs/downloading-objects)\n- [Filtering and sorting lists of buckets and objects](#sort-filter)\n- Creating and managing [folders](/storage/docs/folder-types)\n- [Controlling access to your data and resources](/storage/docs/access-control) by using Identity and Access Management (IAM) and ACLs\n- [Monitoring buckets and bandwidth usage](/storage/docs/monitoring)\n- Using Cloud Storage features like [Anywhere Cache](/storage/docs/anywhere-cache), [soft delete](/storage/docs/soft-delete), and [Object Versioning](/storage/docs/object-versioning)\n\nTry it for yourself\n-------------------\n\n\nIf you're new to Google Cloud, create an account to evaluate how\nCloud Storage performs in real-world\nscenarios. New customers also get $300 in free credits to run, test, and\ndeploy workloads.\n[Try Cloud Storage free](https://console.cloud.google.com/freetrial)\n\nAccess to the Google Cloud console\n----------------------------------\n\nThe Google Cloud console requires no setup or installation, and you can\naccess it directly in a browser. Depending on your use case, you access the\nGoogle Cloud console using different URLs. 
If you are:\n\nA user granted access to a project\n\n: Use: `https://console.cloud.google.com/`\n\n: [Go to the Google Cloud console](https://console.cloud.google.com/)\n\n: A current project owner can [give you access to the entire project](/iam/docs/granting-changing-revoking-access#grant-single-role), which\n applies equally to all buckets and objects defined in the project.\n\nA user granted access to a bucket\n\n: Use: `https://console.cloud.google.com/storage/browser/`\u003cvar translate=\"no\"\u003eBUCKET_NAME\u003c/var\u003e\n\n: In this use case, a project owner gives you access to an individual bucket\n within a larger project. The owner then sends you the bucket name which you\n substitute into the URL. You are able to only work with objects in the\n specified bucket. This is useful for users who don't have access to the full\n project, but who need to access a bucket. When you access the URL, you\n authenticate if you are not already signed in.\n\n: A variation of this use case is when a project owner grants **All Users**\n permission to read objects in a bucket. This creates a bucket whose contents\n are publicly readable. For more information, see [Make data public](/storage/docs/access-control/making-data-public).\n\nA user granted access to an object\n\n: Use: `https://console.cloud.google.com/storage/browser/_details/`\u003cvar translate=\"no\"\u003eBUCKET_NAME\u003c/var\u003e`/`\u003cvar translate=\"no\"\u003eOBJECT_NAME\u003c/var\u003e\n\n: In this use case, a project owner gives you access to single objects within\n a bucket and sends you the URL to access the objects. When you access the URLs,\n you are prompted to authenticate with a user account if you are not already\n signed in.\n\n: Note that the form of this URL is different from the URL for objects that\n are [shared publicly](/storage/docs/access-control/making-data-public). 
When you share a link publicly, the URL is of the\n form: `https://storage.googleapis.com/`\u003cvar translate=\"no\"\u003eBUCKET_NAME\u003c/var\u003e`/`\u003cvar translate=\"no\"\u003eOBJECT_NAME\u003c/var\u003e.\n This public URL does not require a recipient to authenticate with a valid\n account and can be used for non-authenticated access to an object.\n\nFiltering and sorting lists of buckets and objects\n--------------------------------------------------\n\nIn the Google Cloud console list of buckets for a project, you can filter and\nsort the buckets you see by using the **Filter** field.\n\n- To filter buckets, select the **Sort and filter** option and then\n specify the column and value you want to filter by.\n\n- To filter buckets by bucket name prefix, select the\n **Filter by name prefix only** option and then specify the bucket name prefix\n you want to filter by.\n\n- To sort buckets, click arrow_upward\n **Sort** next to the name of the column you want to sort by.\n\n| **Note:** Projects with more than 1,000 buckets might experience degraded filtering and sorting performance.\n\nIn the Google Cloud console list of objects for a bucket, you can filter the\nobjects you see by using the **Filter** field.\n\n- To filter objects, select the **Sort and filter** option and then\n specify the column and value you want to filter by.\n\n- To filter objects by object name prefix, select the\n **Filter by name prefix only** option and then specify the object name prefix\n you want to filter by.\n\n- To sort objects, click arrow_upward\n **Sort** next to the name of the column you want to sort by.\n\nFiltering and sorting only applies to objects and folders in the current\npath being displayed. For example, if you're viewing the top-level of a bucket,\nfiltering and sorting don't return objects contained in folders.\n| **Note:** Buckets with large numbers of objects and folders in the current path might experience degraded performance when sorted or filtered with criteria other than the object name prefix.\n\nShowing and hiding columns\n--------------------------\n\nTo show or hide columns for a list of buckets or objects, click\n**View column** (view_column), then select\nthe columns you want to see or hide.\n\nCross-product integrations\n--------------------------\n\nThe following integrations with other Google Cloud products are available\nin the **Objects** tab of a bucket:\n\n- **Large scale data transfers to and from the bucket using Storage Transfer Service**\n\n [Storage Transfer Service](/storage-transfer-service) is a service that lets you transfer large volumes of\n data between your bucket and other storage options, such as\n your on-premises file system, other buckets, or other cloud providers.\n\n You can initiate a transfer by clicking the **Transfer data** drop-down in\n the **Objects** tab, selecting either **Transfer data in** or\n **Transfer data out**, and following the instructions.\n- **Scanning the bucket for sensitive data using Sensitive Data Protection**\n\n [Sensitive Data Protection](/security/products/sensitive-data-protection) is a service that lets you identify and\n protect sensitive data in your buckets, such as credit card numbers, IP\n IP addresses, and other forms of personally identifiable information (PII).\n\n For a list of the types of data Sensitive Data Protection detects,\n see the [Infotype detector reference](/sensitive-data-protection/docs/infotypes-reference).\n\n You can initiate a Sensitive Data Protection scan for a bucket by\n clicking 
the **Other services** drop-down in the **Objects** tab, selecting\n **Inspect for sensitive data** , and following the instructions. For a guide\n to performing a Sensitive Data Protection scan on a bucket, see\n [Inspecting a Cloud Storage location](/sensitive-data-protection/docs/inspecting-storage#inspecting-gcs).\n- **Exporting data from the bucket to Pub/Sub**\n\n [Pub/Sub](/pubsub) is a messaging service that lets you notify\n subscribers about events that occur for your Google Cloud resources.\n Pub/Sub supports receiving event records that are stored as text\n files in your bucket and publishing them to a Pub/Sub topic.\n\n You can create an export job for a bucket by clicking the **Other services**\n drop-down in the **Objects** tab, selecting **Export data to Pub/Sub** , and\n following the instructions. For more information, see\n [Cloud Storage text to Pub/Sub (Batch) template](/dataflow/docs/guides/templates/provided/cloud-storage-to-pubsub).\n- **Processing data in the bucket using Cloud Run functions**\n\n [Cloud Run functions](/functions) is a service that lets you specify code that should\n run when certain events occur within the bucket. For example, you could\n create a Java function that runs every time an object in the bucket is\n deleted.\n\n You can define a function for a bucket by clicking the **Other services**\n drop-down in the **Objects** tab, selecting **Process data**, and following\n the instructions."]]