Last updated (UTC): 2025-08-19.

# Recommended strategies for mitigating data risk

This page provides recommended strategies for identifying and remediating data
risk in your organization.

Protecting your data starts with understanding what data you are handling, where
sensitive data is located, and how this data is secured and used. When you have
a comprehensive view of your data and its security posture, you can take the
appropriate measures to protect it and continuously monitor for compliance and
risk.

This page assumes that you're familiar with the
[discovery and inspection services and their
differences](/sensitive-data-protection/docs/learn-about-your-data).

Enable sensitive data discovery
-------------------------------

To determine where sensitive data exists in your business, configure
[discovery](/sensitive-data-protection/docs/data-profiles) at the organization, folder, or
project level. This service generates data profiles that contain [metrics and
insights](/sensitive-data-protection/docs/metrics-reference) about your data, including
sensitivity levels and data risk levels.

As a service, discovery acts as a source of truth about your data
assets and can automatically report metrics for audit reports. Additionally,
discovery can connect to other Google Cloud services, such as
Security Command Center, Google Security Operations, and
Dataplex Universal Catalog, to enrich security operations and data management.

The discovery service runs continuously and detects new data as your
organization operates and grows.
For example, if someone in your organization
creates a new project and uploads a large amount of new data, the
discovery service can discover, classify, and report on the new data
automatically.

Sensitive Data Protection provides a [premade multi-page Looker
report](/sensitive-data-protection/docs/analyze-data-profiles) that gives you a high-level
view of your data, including breakdowns by risk, by infoType, and by
location. For example, the report can show that low-sensitivity and
high-sensitivity data are present in multiple countries around the world.

Take action based on discovery results
--------------------------------------

After you gain a broad view of your data security posture, you can remediate any
issues found. In general, discovery findings fall into one of the following
scenarios:

- Scenario 1: Sensitive data was found in a workload where it's expected, and it's properly protected.
- Scenario 2: Sensitive data was found in a workload where it wasn't expected or where it doesn't have proper controls in place.
- Scenario 3: Sensitive data was found but needs more investigation.

### Scenario 1: Sensitive data was found and is properly protected

Although this scenario doesn't require a specific action, you should include the
data profiles in your audit reports and security analysis workflows and continue
to monitor for changes that can put your data at risk.

We recommend doing the following:

- Publish the data profiles to tools for monitoring your security posture and
  investigating cyber threats. Data profiles can help you determine the severity
  of a security threat or vulnerability that might put your sensitive data at
  risk.
  You can automatically export data profiles to the following:

  - [Security Command Center](/sensitive-data-protection/docs/send-profiles-to-scc)
  - [Google SecOps](/chronicle/docs/ingestion/cloud/ingest-gcp-logs#export-dlp)
- Publish the data profiles to Dataplex Universal Catalog or an inventory system to track
  the data profile metrics along with any other appropriate business metadata.
  For information about automatically exporting data profiles to
  Dataplex Universal Catalog, see
  [Add Dataplex Universal Catalog aspects based on insights from data profiles](/sensitive-data-protection/docs/add-aspects).

### Scenario 2: Sensitive data was found and isn't properly protected

If discovery finds sensitive data in a resource that isn't properly
secured by access controls, consider the recommendations described in this
section.

After you establish the correct controls and data security posture for
your data, monitor for any changes that can put your data at risk. See the
[recommendations in scenario 1](#sensitive-data-protected).

#### General recommendations

Consider doing the following:

- Make a [de-identified copy of your
  data](/sensitive-data-protection/docs/inspect-sensitive-text-de-identify) to mask or
  tokenize the sensitive columns so that your data analysts and engineers can
  still work with your data without revealing raw, sensitive identifiers such as
  personally identifiable information (PII).

  For Cloud Storage data, you can use a built-in feature of
  Sensitive Data Protection to [make de-identified
  copies](/sensitive-data-protection/docs/concepts-deidentify-storage).
- If you don't need the data, consider deleting it.

#### Recommendations for protecting BigQuery data

- [Adjust table-level permissions using
  IAM](/bigquery/docs/control-access-to-resources-iam#revoke_access_to_a_table_or_view).
- [Set fine-grained column-level access controls by using BigQuery
  policy tags](/bigquery/docs/column-level-security-intro) to restrict access to
  the sensitive and high-risk columns. This feature lets you protect those
  columns while allowing access to the rest of the table.

  You can also use policy tags to enable automatic [data
  masking](/bigquery/docs/column-data-masking-intro), which can give users
  partially obfuscated data.
- Use the [row-level security](/bigquery/docs/row-level-security-intro) feature
  of BigQuery to hide or display certain rows of data, depending
  on whether a user or group is on an allowed list.
- [De-identify BigQuery data at query time with remote functions
  (UDFs)](/sensitive-data-protection/docs/deidentify-bq-tutorial).

#### Recommendations for protecting Cloud Storage data

- [Apply access controls using
  IAM](/storage/docs/access-control/iam).
- [Apply access controls using access control
  lists](/storage/docs/access-control/create-manage-lists).
- [Make de-identified copies of your Cloud Storage
  data](/sensitive-data-protection/docs/concepts-deidentify-storage).

### Scenario 3: Sensitive data was found but needs more investigation

In some cases, you might get results that require more investigation. For
example, a data profile might specify that a column has a high free-text score
with evidence of sensitive data. A high free-text score indicates that the data
doesn't have a predictable structure and might contain intermittent instances of
sensitive data. This might be a column of notes where certain rows contain PII,
such as names, contact details, or government-issued identifiers. In this case,
we recommend that you set additional access controls on the table and perform
the other remediations described in [scenario 2](#sensitive-data-not-protected).
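As a concrete illustration of the de-identification recommended in scenario 2, the following Python sketch builds the body of a `content.deidentify` request that replaces each finding with its infoType name (for example, a detected name becomes `[PERSON_NAME]`). The field names follow the Cloud DLP v2 REST API, but the helper function and its default infoType list are illustrative assumptions, not part of any client library.

```python
# Hypothetical helper that assembles a Cloud DLP v2 content.deidentify
# request body. The field names follow the v2 REST API; the default
# infoType list is only an example and should match the identifiers
# that discovery reported for your data.
def build_deidentify_request(text, info_types=("PERSON_NAME", "EMAIL_ADDRESS")):
    return {
        "item": {"value": text},
        "inspectConfig": {"infoTypes": [{"name": t} for t in info_types]},
        "deidentifyConfig": {
            "infoTypeTransformations": {
                "transformations": [
                    # Replace each finding with its infoType name, so a
                    # detected name in free text becomes [PERSON_NAME].
                    {"primitiveTransformation": {"replaceWithInfoTypeConfig": {}}}
                ]
            }
        },
    }

request = build_deidentify_request("Contact alice@example.com for access")
```

You would send a body like this to the `projects.locations.content.deidentify` endpoint (or pass the equivalent objects to a DLP client library) and store the de-identified response instead of the raw text.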
In addition, we recommend running a deeper, targeted inspection to identify the
extent of the risk.

The inspection service lets you run a thorough scan of a single
resource, such as an individual BigQuery table or a Cloud Storage
bucket. For data sources that the inspection service doesn't directly support,
you can export the data to a Cloud Storage bucket or a BigQuery table and run
an inspection job on that resource. For example, if you need to inspect data in
a Cloud SQL database, you can export that data to a CSV or Avro file in
Cloud Storage and then run an inspection job.

An inspection job locates individual instances of sensitive data, such as a
credit card number in the middle of a sentence inside a table cell. This level
of detail can help you understand what kind of data is present in unstructured
columns or in data objects, including text files, PDFs, images, and other rich
document formats. You can then remediate your findings through any of the
recommendations described in [scenario 2](#sensitive-data-not-protected).

In addition to the steps recommended in scenario 2, consider taking steps to
prevent sensitive information from entering your backend data storage.
The [`content` methods](/sensitive-data-protection/docs/reference/rest/v2/projects.locations.content)
of the Cloud Data Loss Prevention API can accept data from any workload or application
for in-motion data inspection and masking. For example, your application can do
the following:

1. Accept a user-provided comment.
2. Run [`content.deidentify`](/sensitive-data-protection/docs/reference/rest/v2/projects.locations.content/deidentify) to de-identify any sensitive data in that string.
3. Save the de-identified string to your backend storage instead of the original string.

Summary of best practices
-------------------------

The following table summarizes the best practices recommended in this document: