# Using Sensitive Data Protection with Cloud Storage

Cloud Data Loss Prevention (Cloud DLP) is now a part of Sensitive Data Protection. The API name remains the same: Cloud Data Loss Prevention API (DLP API). For information about the services that make up Sensitive Data Protection, see Sensitive Data Protection overview.

This page contains references to pages that provide information on how to use Sensitive Data Protection with [Cloud Storage](/storage).

Quickstart guides
-----------------

[Quickstart: Scheduling an inspection scan](/sensitive-data-protection/docs/schedule-inspection-scan)
: Schedule periodic inspection of a Cloud Storage bucket, a BigQuery table, or a Datastore kind. For detailed instructions, see [Creating and scheduling inspection jobs](/sensitive-data-protection/docs/creating-job-triggers).
How-to guides
-------------

This section provides a categorized list of task-based guides that demonstrate how to use Sensitive Data Protection with Cloud Storage.

### Inspection

[Inspecting storage and databases for sensitive data](/sensitive-data-protection/docs/inspecting-storage)
: Create a one-time job that searches for sensitive data in a Cloud Storage bucket, a BigQuery table, or a Datastore kind.
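For orientation, here is a minimal sketch of what such a one-time inspection job over a bucket can look like with the Python client library (`google-cloud-dlp`). The project ID, bucket URL, and infoTypes below are placeholders, not values taken from the guide.

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

# Placeholder project and bucket; the "**" suffix recurses into subdirectories.
parent = "projects/my-project/locations/global"
inspect_job = {
    "storage_config": {
        "cloud_storage_options": {"file_set": {"url": "gs://my-bucket/**"}}
    },
    "inspect_config": {
        "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "CREDIT_CARD_NUMBER"}],
        "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
    },
}

# Starts an asynchronous DlpJob; poll or list jobs to see its state and findings.
job = dlp.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print(f"Started job: {job.name}")
```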
[Creating and scheduling inspection jobs](/sensitive-data-protection/docs/creating-job-triggers)
: Create and schedule a job trigger that searches for sensitive data in a Cloud Storage bucket, a BigQuery table, or a Datastore kind. A job trigger automates the creation of Sensitive Data Protection jobs on a periodic basis.
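A job trigger wraps the same inspect-job definition in a schedule. The sketch below, again with placeholder names, creates a trigger that re-runs the scan daily; the DLP API requires the recurrence period to be at least one day.

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder project

job_trigger = {
    "display_name": "daily-bucket-scan",  # hypothetical name
    "inspect_job": {
        "storage_config": {
            "cloud_storage_options": {"file_set": {"url": "gs://my-bucket/**"}}
        },
        "inspect_config": {"info_types": [{"name": "PHONE_NUMBER"}]},
    },
    # Re-run every 24 hours; the period is expressed as a Duration in seconds.
    "triggers": [
        {"schedule": {"recurrence_period_duration": {"seconds": 60 * 60 * 24}}}
    ],
    "status": dlp_v2.JobTrigger.Status.HEALTHY,
}

trigger = dlp.create_job_trigger(request={"parent": parent, "job_trigger": job_trigger})
print(f"Created trigger: {trigger.name}")
```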
### Working with scan results

[Sending Sensitive Data Protection scan results to Security Command Center](/sensitive-data-protection/docs/sending-results-to-scc)
: Scan a Cloud Storage bucket, a BigQuery table, or a Datastore kind, and then send the findings to Security Command Center.

[Analyzing and reporting on Sensitive Data Protection findings](/sensitive-data-protection/docs/analyzing-and-reporting)
: Use Cloud Storage to run analytics on Sensitive Data Protection findings.
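Both destinations are configured as actions on the inspect job itself. As a hedged illustration (the project, dataset, and table names are placeholders), a single job can publish a findings summary to Security Command Center and save detailed findings to BigQuery:

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # placeholder project

inspect_job = {
    "storage_config": {
        "cloud_storage_options": {"file_set": {"url": "gs://my-bucket/**"}}
    },
    "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
    "actions": [
        # Publish a summary of findings to Security Command Center.
        {"publish_summary_to_cscc": {}},
        # Save per-finding detail to a BigQuery table for later analysis.
        {
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": "my-project",   # placeholder
                        "dataset_id": "dlp_results",  # placeholder
                        "table_id": "findings",       # placeholder
                    }
                }
            }
        },
    ],
}

job = dlp.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
```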
Tutorials
---------

[De-identification and re-identification of PII in large-scale datasets using Sensitive Data Protection](/architecture/de-identification-re-identification-pii-using-cloud-dlp)
: Create an automated data transformation pipeline to de-identify sensitive data like personally identifiable information (PII).

[Automating the classification of data uploaded to Cloud Storage](/sensitive-data-protection/docs/automating-classification-of-data-uploaded-to-cloud-storage)
: Implement an automated data quarantine and classification system using Sensitive Data Protection, Cloud Storage, and Cloud Run functions.
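The quarantine pattern boils down to a storage-triggered function that inspects each new object and routes it by result. The following is a rough sketch only, assuming hypothetical bucket names and a CloudEvents-style Cloud Run function; see the tutorial for the production-grade implementation.

```python
import functions_framework
from google.cloud import dlp_v2, storage

dlp = dlp_v2.DlpServiceClient()
gcs = storage.Client()

@functions_framework.cloud_event
def classify_upload(cloud_event):
    """Runs when an object is finalized in the quarantine bucket."""
    bucket_name = cloud_event.data["bucket"]
    blob_name = cloud_event.data["name"]

    # Inline inspection is only suitable for small text objects; large
    # files should be scanned with a storage inspection job instead.
    text = gcs.bucket(bucket_name).blob(blob_name).download_as_bytes().decode(
        "utf-8", errors="ignore"
    )
    result = dlp.inspect_content(
        request={
            "parent": "projects/my-project/locations/global",  # placeholder
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
            "item": {"value": text},
        }
    )

    # Route the object based on whether anything was found, then remove it
    # from the quarantine bucket. Both target buckets are placeholders.
    target = "my-sensitive-bucket" if result.result.findings else "my-clean-bucket"
    src = gcs.bucket(bucket_name)
    src.copy_blob(src.blob(blob_name), gcs.bucket(target), blob_name)
    src.blob(blob_name).delete()
```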
Community contributions
-----------------------

The following are owned and managed by community members, and not by the Sensitive Data Protection team. For questions on these items, contact their respective owners.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-28 UTC."],[],[],null,["# Using Sensitive Data Protection with Cloud Storage\n\n\u003cbr /\u003e\n\nThis page contains references to pages that provide information on how\nto use Sensitive Data Protection with [Cloud Storage](/storage).\n\nQuickstart guides\n-----------------\n\n[Quickstart: Scheduling an inspection scan](/sensitive-data-protection/docs/schedule-inspection-scan)\n: Schedule periodic inspection of a\n Cloud Storage bucket, a BigQuery table, or a\n Datastore kind. For detailed instructions, see\n [Creating and scheduling inspection jobs](/sensitive-data-protection/docs/creating-job-triggers).\n\nHow-to guides\n-------------\n\nThis section provides a categorized list of task-based guides that demonstrate\nhow to use Sensitive Data Protection with Cloud Storage.\n\n### Inspection\n\n[Inspecting storage and databases for sensitive data](/sensitive-data-protection/docs/inspecting-storage)\n: Create a one-time job that searches for sensitive data in a Cloud Storage\n bucket, a BigQuery table, or a Datastore kind.\n\n[Creating and scheduling inspection jobs](/sensitive-data-protection/docs/creating-job-triggers)\n: Create and schedule a job trigger that searches for sensitive data in a\n Cloud Storage bucket, a BigQuery table, or a\n Datastore kind. A job trigger automates the creation of\n Sensitive Data Protection jobs on a periodic basis.\n\n### Working with scan results\n\n[Sending Sensitive Data Protection scan results to Security Command Center](/sensitive-data-protection/docs/sending-results-to-scc)\n: Scan a Cloud Storage bucket, a BigQuery table, or a\n Datastore kind, and then send the findings to Security Command Center.\n\n[Analyzing and reporting on Sensitive Data Protection findings](/sensitive-data-protection/docs/analyzing-and-reporting)\n: Use Cloud Storage to run analytics on Sensitive Data Protection\n findings.\n\nTutorials\n---------\n\n[De-identification and re-identification of PII in large-scale datasets using Sensitive Data Protection](/architecture/de-identification-re-identification-pii-using-cloud-dlp)\n: Create an automated data transformation pipeline to de-identify sensitive data\n like personally identifiable information (PII).\n\n[Automating the classification of data uploaded to Cloud Storage](/sensitive-data-protection/docs/automating-classification-of-data-uploaded-to-cloud-storage)\n: Implement an automated data quarantine and classification system using Sensitive Data Protection, Cloud Storage, and Cloud Run functions.\n\nCommunity contributions\n-----------------------\n\nThe following are owned and managed by community members, and not by the\nSensitive Data Protection team. 
For questions on these items, contact their\nrespective owners.\n\n[GitHub: Speech Redaction Framework](https://github.com/GoogleCloudPlatform/dataflow-speech-redaction)\n: Redact sensitive information from audio files in Cloud Storage.\n\n[GitHub: Speech Analysis Framework](https://github.com/GoogleCloudPlatform/dataflow-contact-center-speech-analysis)\n: Transcribe audio, create a data pipeline for analytics of transcribed audio\n files, and redact sensitive information from audio transcripts.\n\n[GitHub: Real-time anomaly detection using Google Cloud stream analytics and AI services](https://github.com/GoogleCloudPlatform/df-ml-anomaly-detection)\n: Walk through a real-time artificial intelligence (AI) pattern for detecting\n anomalies in log files.\n\nPricing\n-------\n\nWhen you inspect a Cloud Storage bucket, you incur\nSensitive Data Protection costs, according to the [storage inspection job pricing](/sensitive-data-protection/pricing#storage-pricing)."]]