Last updated (UTC): 2025-08-19

# Enable inspection or risk analysis actions

This document describes the actions that Sensitive Data Protection can perform after running an inspection job or risk analysis.

An *action* is a task that Sensitive Data Protection performs after completing an inspection job or risk analysis. For example, you can save findings to a BigQuery table, publish a notification to a Pub/Sub topic, or send an email when an operation either finishes successfully or stops on error.

Sensitive data discovery operations have a different set of actions. For more information about discovery actions, see [Enable discovery actions](/sensitive-data-protection/docs/enable-discovery-actions).

Available actions
-----------------

When you run a Sensitive Data Protection job, a summary of its findings is saved by default within Sensitive Data Protection. You can view this summary using [Sensitive Data Protection in the Google Cloud console](https://console.cloud.google.com/security/sensitive-data-protection). You can also retrieve summary information in the DLP API by using the [`projects.dlpJobs.get`](/sensitive-data-protection/docs/reference/rest/v2/projects.dlpJobs/get) method.

The following sections describe the actions that are available to inspection and risk analysis jobs.

### Save findings to BigQuery

Save the Sensitive Data Protection job results to a [BigQuery](/bigquery/docs) table.
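In a DLP API job request, this action is configured through the `Action` object's `saveFindings` field. The following is a minimal sketch of one such `Action` entry; the project, dataset, and table IDs are placeholders:

```json
{
  "saveFindings": {
    "outputConfig": {
      "table": {
        "projectId": "my-project",
        "datasetId": "dlp_results",
        "tableId": "findings"
      }
    }
  }
}
```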
Before you view or analyze the results, verify that the job has completed.

Each time a scan runs, Sensitive Data Protection saves scan findings to the BigQuery table that you specify. The exported findings contain details about each finding's location and match likelihood.

If you want each finding to include the string that matched the infoType detector, enable the **Include quote** option. Quotes are potentially sensitive, so Sensitive Data Protection doesn't include them in findings by default.

If you don't specify a table ID, BigQuery assigns a default name to a new table the first time the scan runs. The name is similar to `dlp_googleapis_DATE_1234567890`, where `DATE` represents the date that the scan ran. If you specify an existing table, Sensitive Data Protection appends scan findings to it.

When [data is written to a BigQuery table](/bigquery/docs/reference/rest/v2/tabledata/insertAll), the billing and quota usage are applied to the project that contains the destination table.

> **Note:** If you don't save findings to BigQuery or Cloud Storage, the scan results contain only statistics about the number and infoTypes of the findings.

### Save findings to Cloud Storage

Save the Sensitive Data Protection job results to an existing [Cloud Storage](/storage/docs) bucket or folder. Before you view or analyze the results, verify that the job has completed.

If you're inspecting a Cloud Storage bucket, the bucket that you designate for exported findings must not be the bucket that you're inspecting.

Each time a scan runs, Sensitive Data Protection saves scan findings to the Cloud Storage location that you specify.
The exported findings contain details about each finding's location and match likelihood.

If you want each finding to include the string that matched the infoType detector, enable the **Include quote** option. Quotes are potentially sensitive, so Sensitive Data Protection doesn't include them in findings by default.

The findings are exported in Protobuf text format as a [`SaveToGcsFindingsOutput`](/sensitive-data-protection/docs/reference/rest/v2/SaveToGcsFindingsOutput) object. For information about how to parse findings in this format, see [Parse findings stored as Protobuf text](/sensitive-data-protection/docs/parse-inspection-job-findings).

> **Note:** If you don't save findings to BigQuery or Cloud Storage, the scan results contain only statistics about the number and infoTypes of the findings.

### Publish to Pub/Sub

Publish a notification that contains the name of the Sensitive Data Protection job as an attribute to a [Pub/Sub](https://console.cloud.google.com/cloudpubsub) topic. You can specify one or more topics to send the notification message to. Make sure that the Sensitive Data Protection service account that runs the scan job has publishing access to each topic.

If there are configuration or permission issues with the Pub/Sub topic, Sensitive Data Protection retries sending the Pub/Sub notification for up to two weeks. After two weeks, the notification is discarded.

### Publish to Security Command Center

Publish a summary of the job results to Security Command Center. For more information, see [Send Sensitive Data Protection scan results to Security Command Center](/sensitive-data-protection/docs/sending-results-to-scc).

To use this action, your project must belong to an organization, and Security Command Center must be activated at the organization level. Otherwise, Sensitive Data Protection findings won't appear in Security Command Center.
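When those prerequisites are met, no further configuration is needed for this action; as a sketch, the corresponding `Action` entry in a DLP API job request is an empty message:

```json
{
  "publishSummaryToCscc": {}
}
```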
For more information, see [Check the activation level of Security Command Center](/security-command-center/docs/activate-scc-overview#check-scc-activation-level).

### Publish to Data Catalog

Send job results to Data Catalog. This feature is [deprecated](/sensitive-data-protection/docs/deprecations).

### Notify by email

Send an email when the job completes. The email goes to IAM project owners and technical [Essential Contacts](/resource-manager/docs/managing-notification-contacts).

### Publish to Cloud Monitoring

Send inspection results to Cloud Monitoring in [Google Cloud Observability](/products/operations).

### Make a de-identified copy

De-identify any findings in the inspected data, and write the de-identified content to a new file. You can then use the de-identified copy in your business processes, in place of data that contains sensitive information. For more information, see [Create a de-identified copy of Cloud Storage data using Sensitive Data Protection in the Google Cloud console](/sensitive-data-protection/docs/deidentify-storage-console).

Supported operations
--------------------

The following table shows the Sensitive Data Protection operations and where each action is available.

Specify actions
---------------

You can specify one or more actions when you configure a job:

- When you create an inspection or risk analysis job using Sensitive Data Protection in the Google Cloud console, specify actions in the **Add actions** section of the job creation workflow.
- When you configure a job request to send to the DLP API, specify actions in the [`Action`](/sensitive-data-protection/docs/reference/rest/v2/Action) object.

For more information and sample code in several languages, see:

- [Creating and scheduling inspection jobs](/sensitive-data-protection/docs/creating-job-triggers)
- [Computing k-anonymity for a dataset](/sensitive-data-protection/docs/compute-k-anonymity)
- [Computing l-diversity for a dataset](/sensitive-data-protection/docs/compute-l-diversity)

Example action scenario
-----------------------

You can use Sensitive Data Protection actions to automate processes based on Sensitive Data Protection scan results. Suppose you have a BigQuery table that is shared with an external partner. You want to ensure that this table doesn't contain sensitive identifiers such as US Social Security numbers (the [infoType `US_SOCIAL_SECURITY_NUMBER`](/sensitive-data-protection/docs/infotypes-reference#united_states)) and that, if any are found, the partner's access is revoked. Here is a rough outline of a workflow that uses actions:

1. [Create a Sensitive Data Protection job trigger](/sensitive-data-protection/docs/creating-job-triggers) to run an inspection scan of the BigQuery table every 24 hours.
2. Set the [action](/sensitive-data-protection/docs/reference/rest/v2/InspectJobConfig#Action) of these jobs to publish a Pub/Sub notification to the topic `projects/foo/scan_notifications`.
3. Create a [Cloud Function](/functions) that listens for incoming messages on `projects/foo/scan_notifications`. The Cloud Function receives the name of the Sensitive Data Protection job every 24 hours, calls Sensitive Data Protection to get summary results from the job, and, if it finds any Social Security numbers, changes settings in BigQuery or [Identity and Access Management (IAM)](/iam) to restrict access to the table.

What's next
-----------

- Learn about [the actions available with inspection jobs](/sensitive-data-protection/docs/creating-job-triggers#job-add-actions).
- Learn about [the actions available with risk analysis jobs](/sensitive-data-protection/docs/compute-k-anonymity#compute-k).
- Learn about [the actions available with sensitive data discovery operations](/sensitive-data-protection/docs/enable-discovery-actions).
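To illustrate step 3 of the example scenario above, the following is a minimal, hypothetical sketch of the Cloud Function's decision logic. The `DlpJobName` attribute key and the shape of the per-infoType counts are assumptions for this sketch; the call that fetches job results from the DLP API is not shown and is replaced with a plain dictionary.

```python
def extract_dlp_job_name(event: dict) -> str:
    # The Pub/Sub notification carries the job name as a message
    # attribute; `DlpJobName` is the attribute key assumed here.
    return event["attributes"]["DlpJobName"]


def should_restrict_access(info_type_counts: dict) -> bool:
    # Revoke the partner's access if the inspection job reported any
    # US_SOCIAL_SECURITY_NUMBER findings.
    return info_type_counts.get("US_SOCIAL_SECURITY_NUMBER", 0) > 0
```

In a real function, you would pass the extracted job name to `projects.dlpJobs.get`, tally the per-infoType result counts from the job summary, and call the BigQuery or IAM APIs to restrict access when `should_restrict_access` returns `True`.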