Cloud Data Loss Prevention (Cloud DLP) is now part of Sensitive Data Protection. The API name remains the same: Cloud Data Loss Prevention (DLP) API. For information about the services that make up Sensitive Data Protection, see the Sensitive Data Protection overview.
Last updated 2025-08-19 UTC.

# Using Sensitive Data Protection with Cloud Storage

This page contains references to pages that provide information on how
to use Sensitive Data Protection with [Cloud Storage](/storage).

Quickstart guides
-----------------

[Quickstart: Scheduling an inspection scan](/sensitive-data-protection/docs/schedule-inspection-scan)
: Schedule periodic inspection of a
  Cloud Storage bucket, a BigQuery table, or a
  Datastore kind. For detailed instructions, see
  [Creating and scheduling inspection jobs](/sensitive-data-protection/docs/creating-job-triggers).

How-to guides
-------------

This section provides a categorized list of task-based guides that demonstrate
how to use Sensitive Data Protection with Cloud Storage.

### Inspection

[Inspecting storage and databases for sensitive data](/sensitive-data-protection/docs/inspecting-storage)
: Create a one-time job that searches for sensitive data in a Cloud Storage
  bucket, a BigQuery table, or a Datastore kind.

[Creating and scheduling inspection jobs](/sensitive-data-protection/docs/creating-job-triggers)
: Create and schedule a job trigger that searches for sensitive data in a
  Cloud Storage bucket, a BigQuery table, or a
  Datastore kind.
  A job trigger automates the creation of
  Sensitive Data Protection jobs on a periodic basis.

### Working with scan results

[Sending Sensitive Data Protection scan results to Security Command Center](/sensitive-data-protection/docs/sending-results-to-scc)
: Scan a Cloud Storage bucket, a BigQuery table, or a
  Datastore kind, and then send the findings to Security Command Center.

[Analyzing and reporting on Sensitive Data Protection findings](/sensitive-data-protection/docs/analyzing-and-reporting)
: Use Cloud Storage to run analytics on Sensitive Data Protection
  findings.

Tutorials
---------

[De-identification and re-identification of PII in large-scale datasets using Sensitive Data Protection](/architecture/de-identification-re-identification-pii-using-cloud-dlp)
: Create an automated data transformation pipeline to de-identify sensitive data
  like personally identifiable information (PII).

[Automating the classification of data uploaded to Cloud Storage](/sensitive-data-protection/docs/automating-classification-of-data-uploaded-to-cloud-storage)
: Implement an automated data quarantine and classification system using
  Sensitive Data Protection, Cloud Storage, and Cloud Run functions.

Community contributions
-----------------------

The following are owned and managed by community members, and not by the
Sensitive Data Protection team.
For questions on these items, contact their
respective owners.

[GitHub: Speech Redaction Framework](https://github.com/GoogleCloudPlatform/dataflow-speech-redaction)
: Redact sensitive information from audio files in Cloud Storage.

[GitHub: Speech Analysis Framework](https://github.com/GoogleCloudPlatform/dataflow-contact-center-speech-analysis)
: Transcribe audio, create a data pipeline for analytics of transcribed audio
  files, and redact sensitive information from audio transcripts.

[GitHub: Real-time anomaly detection using Google Cloud stream analytics and AI services](https://github.com/GoogleCloudPlatform/df-ml-anomaly-detection)
: Walk through a real-time artificial intelligence (AI) pattern for detecting
  anomalies in log files.

Pricing
-------

When you inspect a Cloud Storage bucket, you incur
Sensitive Data Protection costs, according to the
[storage inspection job pricing](/sensitive-data-protection/pricing#storage-pricing).
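The one-time inspection jobs referenced in the how-to guides above center on the DLP API's `InspectJobConfig` message. A minimal sketch in Python, assuming a hypothetical bucket `gs://example-bucket` and project `my-project`; the request is built as a plain dictionary so its shape is visible without the client library installed:

```python
def build_inspect_job_config(bucket_url: str) -> dict:
    """Return an InspectJobConfig-shaped dict that scans a Cloud Storage bucket."""
    return {
        # Point the job at the Cloud Storage files to inspect.
        "storage_config": {
            "cloud_storage_options": {
                "file_set": {"url": bucket_url},
            }
        },
        # What to look for: built-in infoType detectors (adjust as needed).
        "inspect_config": {
            "info_types": [
                {"name": "EMAIL_ADDRESS"},
                {"name": "PHONE_NUMBER"},
            ],
            "min_likelihood": "POSSIBLE",
        },
    }


config = build_inspect_job_config("gs://example-bucket/**")
print(config["storage_config"]["cloud_storage_options"]["file_set"]["url"])

# With the google-cloud-dlp client library installed, and valid credentials
# for a real project, the one-time job could then be created like this:
#
# from google.cloud import dlp_v2
#
# client = dlp_v2.DlpServiceClient()
# client.create_dlp_job(
#     request={
#         "parent": "projects/my-project/locations/global",
#         "inspect_job": config,
#     }
# )
```

The `gs://example-bucket/**` pattern scans the whole bucket recursively; a narrower path pattern limits the scan (and its cost) to a subset of objects.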
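The job trigger and Security Command Center workflows described above can be sketched the same way. A hedged sketch, again assuming the hypothetical `my-project`: the trigger re-runs the scan once per day (the DLP API requires a recurrence period between 1 and 60 days) and publishes a findings summary to Security Command Center via the `publish_summary_to_cscc` action:

```python
def build_job_trigger(bucket_url: str, period_seconds: int) -> dict:
    """Return a JobTrigger-shaped dict that periodically scans a bucket."""
    return {
        "display_name": "daily-gcs-scan",  # hypothetical trigger name
        "inspect_job": {
            "storage_config": {
                "cloud_storage_options": {"file_set": {"url": bucket_url}},
            },
            "inspect_config": {
                "info_types": [{"name": "CREDIT_CARD_NUMBER"}],
            },
            # Post-scan action: send a summary of findings to
            # Security Command Center.
            "actions": [
                {"publish_summary_to_cscc": {}},
            ],
        },
        # How often the trigger creates a new inspection job.
        "triggers": [
            {
                "schedule": {
                    "recurrence_period_duration": {"seconds": period_seconds}
                }
            }
        ],
        "status": "HEALTHY",
    }


trigger = build_job_trigger("gs://example-bucket/**", 86400)  # 1 day
print(trigger["display_name"])

# Creation with the client library would look like:
#
# from google.cloud import dlp_v2
#
# client = dlp_v2.DlpServiceClient()
# client.create_job_trigger(
#     request={
#         "parent": "projects/my-project/locations/global",
#         "job_trigger": trigger,
#     }
# )
```

Swapping the `publish_summary_to_cscc` action for a `save_findings` action with a BigQuery `output_config` instead routes detailed findings to a table for the analytics workflow described above.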