Manage data assets in a lake
This page explains how to add, upgrade, and remove Cloud Storage buckets and
BigQuery datasets as assets in existing Dataplex Universal Catalog zones.
Overview
An asset maps to data stored in either Cloud Storage or BigQuery. You
can map data stored in separate Google Cloud projects as assets into a single
zone within a lake. You can attach existing Cloud Storage buckets or
BigQuery datasets to be managed from within the lake.
Before you begin
If you haven't already, create a lake and a zone in that lake.
Most gcloud lakes commands require a location. You can specify the location by
using the --location flag.
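For example, a minimal sketch of a lakes command with an explicit location (the lake name and region below are placeholder values, not from this page):

```shell
# Hypothetical sketch: describe a lake, passing its location explicitly.
# "my-lake" and "us-central1" are placeholder values.
gcloud dataplex lakes describe my-lake \
    --location=us-central1
```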
Required roles
To remove assets, grant IAM roles that contain the
dataplex.lakes.delete, dataplex.zones.delete, or
dataplex.assets.delete permissions. The Dataplex Universal Catalog-specific
roles/dataplex.admin and roles/dataplex.editor roles
grant these permissions.
To add assets, grant IAM roles that contain the
dataplex.lakes.create, dataplex.zones.create, or dataplex.assets.create
permissions. The roles/dataplex.admin and roles/dataplex.editor roles contain
these permissions.
You can also grant permissions to users or groups by using the roles/owner
and roles/editor legacy roles.
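Granting one of these roles can be sketched as follows, assuming project-level bindings (the project ID and principal are placeholders):

```shell
# Hypothetical sketch: grant the Dataplex editor role, which contains the
# create and delete permissions listed above. PROJECT_ID and the user
# email are placeholder values.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:user@example.com" \
    --role="roles/dataplex.editor"
```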
You must authorize the Dataplex Universal Catalog service on resources being
attached to the Dataplex Universal Catalog lake. The authorization is automatically and
implicitly granted for resources in the project in which the lake is created.
For other projects, authorize the Dataplex Universal Catalog service
on resources explicitly.
Grant roles for Cloud Storage buckets
To attach a Cloud Storage bucket from another project to your lake, you
must grant the Dataplex Universal Catalog service account
(service-PROJECT_NUMBER@gcp-sa-dataplex.iam.gserviceaccount.com,
retrieved from the lake details page in the console) the Dataplex Universal Catalog
service account role (roles/dataplex.serviceAgent) in the project that
contains the bucket. This role provides the Dataplex Universal Catalog service
with the prerequisite administrator-level role on the bucket so that
permissions can be set on the bucket itself.
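A minimal sketch of this grant, assuming a project-level binding in the bucket's project (PROJECT_NUMBER and BUCKET_PROJECT_ID are placeholders):

```shell
# Hypothetical sketch: in the project that contains the bucket, grant the
# Dataplex service account the service agent role. PROJECT_NUMBER and
# BUCKET_PROJECT_ID are placeholder values.
gcloud projects add-iam-policy-binding BUCKET_PROJECT_ID \
    --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-dataplex.iam.gserviceaccount.com" \
    --role="roles/dataplex.serviceAgent"
```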
Grant roles for BigQuery datasets
To attach a BigQuery dataset from another project to your lake,
you must grant the Dataplex Universal Catalog service account the
BigQuery Administrator role on the dataset.
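One way to set dataset-level access is through the bq CLI's export-and-update flow; a hedged sketch (all identifiers are placeholders, and the access entry must be added to the exported JSON by hand):

```shell
# Hypothetical sketch using the bq CLI: export the dataset's access policy,
# add an entry for the Dataplex service account, then write it back.
# DATASET_PROJECT_ID and my_dataset are placeholder values.
bq show --format=prettyjson DATASET_PROJECT_ID:my_dataset > dataset.json
# After adding the service account to the "access" list in dataset.json:
bq update --source dataset.json DATASET_PROJECT_ID:my_dataset
```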
VPC Service Controls considerations
Dataplex Universal Catalog doesn't violate VPC Service Controls perimeters. Before
adding an asset to the lake, make sure that the underlying bucket or dataset is
in the same VPC Service Controls network as the lake.
If there is no overlap between the region of the Dataplex Universal Catalog
lake and the region of a Cloud Storage bucket, you can't add
the bucket to a zone in your lake.
To learn more about the region location of a
Cloud Storage asset and how Dataplex Universal Catalog handles the
location of a bucket when creating the publishing dataset, see
Regional resources.
Add an asset
To add an asset, follow these steps:
Console
In the Google Cloud console, go to the Dataplex Universal Catalog Lakes page.
Click the lake to which
you want to add a Cloud Storage bucket or BigQuery
dataset. The lake page opens.
On the Zones tab, click the name of the data zone to which
you want to add the asset. The Data zone page for that data zone
opens.
On the Assets tab, click + Add Assets. The Add assets page
opens.
Click Add an Asset.
In the Type field, select either
BigQuery dataset or Cloud Storage bucket.
In the Display name field, enter a name for the new asset.
In the ID field, enter a unique ID for the asset.
Optional: Enter a Description.
In the Dataset or Bucket field (based on the type of your asset),
click Browse to find and select your Cloud Storage bucket or
BigQuery dataset.
Optional: If your asset type is Cloud Storage bucket and if you
want Dataplex Universal Catalog to manage the asset, then select the
Upgrade to Managed checkbox. If you choose this option, you don't
have to upgrade the asset separately. This option isn't available
for BigQuery datasets.
Click Continue.
Choose the rest of the parameter values. For more information about
security settings, see Lake security.
Click Submit.
Verify that you have returned to the data zone page, and that your new
asset appears in the assets list.
When the addition succeeds, the data zone automatically enters the active
state. If the addition fails, the data zone is rolled back to its previous
healthy state.
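The same operation is available through the REST API with the lakes.zones.assets.create method. A hedged curl sketch for a bucket asset (every identifier below is a placeholder, and the resourceSpec fields follow the v1 API):

```shell
# Hypothetical REST sketch for lakes.zones.assets.create. The project,
# region, lake, zone, asset, and bucket names are all placeholder values.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{
          "resourceSpec": {
            "type": "STORAGE_BUCKET",
            "name": "projects/BUCKET_PROJECT_ID/buckets/my-bucket"
          }
        }' \
    "https://dataplex.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/lakes/my-lake/zones/my-zone/assets?assetId=my-asset"
```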
Upgrade a Cloud Storage bucket asset
When you add an asset of type Cloud Storage bucket,
Dataplex Universal Catalog automatically publishes BigQuery
external tables for the tables hosted in the
asset.
When you upgrade a Cloud Storage bucket asset,
Dataplex Universal Catalog removes the attached external tables and creates
BigLake tables.
BigLake tables support fine-grained security,
including row-level security, column-level security, and dynamic data masking.
To upgrade a Cloud Storage bucket asset, follow these steps:
Console
In the Google Cloud console, go to the Dataplex Universal Catalog Lakes page.
Click the name of the lake. The lake page opens.
On the Zones tab, click the name of the data zone. The data zone page
opens.
On the Assets tab, click the name of the asset that you want to
upgrade.
Click Upgrade to Managed.
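The upgrade can also be performed through the REST API with the lakes.zones.assets.patch method. A hedged curl sketch, assuming the upgrade corresponds to setting resourceSpec.readAccessMode to MANAGED (the documented downgrade path for this method sets it to DIRECT; all identifiers are placeholders):

```shell
# Hypothetical REST sketch for lakes.zones.assets.patch. Assumes the
# upgrade sets resourceSpec.readAccessMode to MANAGED; project, region,
# lake, zone, and asset names are placeholder values.
curl -X PATCH \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"resourceSpec": {"readAccessMode": "MANAGED"}}' \
    "https://dataplex.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/lakes/my-lake/zones/my-zone/assets/my-asset?updateMask=resourceSpec.readAccessMode"
```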
REST
To upgrade a bucket asset, use the lakes.zones.assets.patch method.
Downgrade a Cloud Storage bucket asset
When you downgrade a Cloud Storage bucket asset,
Dataplex Universal Catalog removes the attached BigLake tables and creates
external tables.
Console
In the Google Cloud console, go to the Dataplex Universal Catalog Lakes page.
Click the name of the lake. The lake page opens.
On the Zones tab, click the name of the data zone. The data zone page
opens.
On the Assets tab, click the name of the asset that you want to
downgrade.
Click Downgrade from Managed.
REST
To downgrade a bucket asset, use the lakes.zones.assets.patch method. Make
sure that you set the readAccessMode field to DIRECT in ResourceSpec.
Remove an asset
Remove the asset from the data zone or lake before attaching it to a
different one. Your Cloud Storage bucket isn't deleted when you remove it
from your data zone or lake. You must explicitly delete it, if required.
To remove an asset, follow these steps:
Console
In the Google Cloud console, go to the Dataplex Universal Catalog Lakes page.
Click the lake from which you want to remove a Cloud Storage bucket or
BigQuery dataset. The lake page for that lake opens.
On the Zones tab, click the name of the data zone from which you want to
remove the Cloud Storage bucket or BigQuery dataset. The Data zone page
for that data zone opens.
On the Assets tab, select the asset by checking the box to the left of the
asset name.
Click Delete Asset.
On the confirmation dialog, click Delete.
REST
To remove a bucket, use the lakes.zones.assets.delete method.
What's next
Learn more about discovering data.
Learn how to create a lake.
Learn more about Cloud Audit Logs.