[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-09-04。"],[],[],null,["# Using Sensitive Data Protection with BigQuery\n\n\u003cbr /\u003e\n\nThis page contains references to pages that provide information on how\nto use Sensitive Data Protection with [BigQuery](/bigquery).\n\nQuickstart guides\n-----------------\n\n[Quickstart: Scheduling a Sensitive Data Protection inspection scan](/sensitive-data-protection/docs/schedule-inspection-scan)\n: Schedule periodic inspection of a\n Cloud Storage bucket, a BigQuery table, or a\n Datastore kind. For detailed instructions, see\n [Creating and scheduling Sensitive Data Protection inspection jobs](/sensitive-data-protection/docs/creating-job-triggers).\n\nHow-to guides\n-------------\n\nThis section provides a categorized list of task-based guides that demonstrate\nhow to use Sensitive Data Protection with BigQuery.\n\n### Inspection\n\n[Inspecting storage and databases for sensitive data](/sensitive-data-protection/docs/inspecting-storage)\n: Create a one-time job that searches for sensitive data in a Cloud Storage\n bucket, a BigQuery table, or a Datastore kind.\n\n[Creating and scheduling Sensitive Data Protection inspection jobs](/sensitive-data-protection/docs/creating-job-triggers)\n: Create and schedule a job trigger that searches for sensitive data in a\n Cloud Storage bucket, a BigQuery table, or a\n Datastore kind. A job trigger automates the creation of\n Sensitive Data Protection jobs on a periodic basis.\n\n### Working with scan results\n\n[Sending Sensitive Data Protection scan results to Data Catalog](/sensitive-data-protection/docs/sending-results-to-dc)\n: Scan a BigQuery table, and then send the findings to\n Data Catalog to automatically create tags based on\n Sensitive Data Protection findings.\n\n[Sending Sensitive Data Protection scan results to Security Command Center](/sensitive-data-protection/docs/sending-results-to-scc)\n: Scan a Cloud Storage bucket, a BigQuery table, or a\n Datastore kind, and then send the findings to Security Command Center.\n\n[Analyzing and reporting on Sensitive Data Protection findings](/sensitive-data-protection/docs/analyzing-and-reporting)\n: Use BigQuery to run analytics on Sensitive Data Protection\n findings.\n\n[Querying Sensitive Data Protection findings in BigQuery](/sensitive-data-protection/docs/querying-findings)\n: Look through sample queries that you can use in BigQuery to\n analyze findings that Sensitive Data Protection identified.\n\n### Re-identification risk analysis\n\n[Measuring re-identification and disclosure risk](/sensitive-data-protection/docs/compute-risk-analysis)\n\n: Analyze structured data stored in a BigQuery table and compute\n the following re-identification risk metrics:\n\n - [*k*-anonymity](/sensitive-data-protection/docs/compute-k-anonymity)\n - [*l*-diversity](/sensitive-data-protection/docs/compute-l-diversity)\n - [*k*-map](/sensitive-data-protection/docs/compute-k-map)\n - [*δ*-presence](/sensitive-data-protection/docs/compute-d-presence)\n\n[Computing numerical and categorical statistics](/sensitive-data-protection/docs/compute-stats)\n\n: Determine minimum, maximum, and quantile values for an individual\n BigQuery 
### Working with scan results

[Sending Sensitive Data Protection scan results to Data Catalog](/sensitive-data-protection/docs/sending-results-to-dc)
: Scan a BigQuery table, and then send the findings to
  Data Catalog to automatically create tags based on
  Sensitive Data Protection findings.

[Sending Sensitive Data Protection scan results to Security Command Center](/sensitive-data-protection/docs/sending-results-to-scc)
: Scan a Cloud Storage bucket, a BigQuery table, or a
  Datastore kind, and then send the findings to Security Command Center.

[Analyzing and reporting on Sensitive Data Protection findings](/sensitive-data-protection/docs/analyzing-and-reporting)
: Use BigQuery to run analytics on Sensitive Data Protection findings.

[Querying Sensitive Data Protection findings in BigQuery](/sensitive-data-protection/docs/querying-findings)
: Look through sample queries that you can use in BigQuery to
  analyze findings that Sensitive Data Protection identified. A minimal
  query sketch also appears at the end of this page.

### Re-identification risk analysis

[Measuring re-identification and disclosure risk](/sensitive-data-protection/docs/compute-risk-analysis)
: Analyze structured data stored in a BigQuery table and compute
  the following re-identification risk metrics:

  - [*k*-anonymity](/sensitive-data-protection/docs/compute-k-anonymity)
  - [*l*-diversity](/sensitive-data-protection/docs/compute-l-diversity)
  - [*k*-map](/sensitive-data-protection/docs/compute-k-map)
  - [*δ*-presence](/sensitive-data-protection/docs/compute-d-presence)

[Computing numerical and categorical statistics](/sensitive-data-protection/docs/compute-stats)
: Determine minimum, maximum, and quantile values for an individual
  BigQuery column.

[Visualizing re-identification risk using Looker Studio](/sensitive-data-protection/docs/visualizing_re-id_risk)
: Measure the *k*-anonymity of a dataset, and then visualize it in
  Looker Studio.

Tutorials
---------

[De-identify BigQuery data at query time](/sensitive-data-protection/docs/deidentify-bq-tutorial)
: Follow a step-by-step tutorial that uses BigQuery remote
  functions to de-identify and re-identify data in real-time query results.

[De-identification and re-identification of PII in large-scale datasets using Sensitive Data Protection](/architecture/de-identification-re-identification-pii-using-cloud-dlp)
: Review a reference architecture for creating an automated data
  transformation pipeline that de-identifies sensitive data, such as
  personally identifiable information (PII).

Best practices
--------------

[Secure a BigQuery data warehouse that stores confidential data](/architecture/confidential-data-warehouse-blueprint)
: Review an architectural overview and best practices for data governance
  when creating, deploying, and operating a data warehouse in Google Cloud,
  including data de-identification, differential handling of confidential
  data, and column-level access controls.

Community contributions
-----------------------

The following items are owned and managed by community members, not by the
Sensitive Data Protection team. For questions about these items, contact
their respective owners.

[Create Data Catalog tags by inspecting BigQuery data with Sensitive Data Protection](/community/tutorials/dlp-to-datacatalog-tags)
: Inspect BigQuery data using the Cloud Data Loss Prevention API, and then
  use the Data Catalog API to create column-level tags according to the
  sensitive elements that Sensitive Data Protection found.

[Event-driven serverless scheduling architecture with Sensitive Data Protection](/community/tutorials/event-driven-serverless-scheduling-framework-dlp)
: Set up an event-driven, serverless scheduling application that uses the
  Cloud Data Loss Prevention API to inspect BigQuery data.

[Real-time anomaly detection using Google Cloud stream analytics and AI services](https://github.com/GoogleCloudPlatform/df-ml-anomaly-detection)
: Walk through a real-time artificial intelligence (AI) pattern for detecting
  anomalies in log files. This proof of concept uses Pub/Sub,
  Dataflow, BigQuery ML, and Sensitive Data Protection.

[Relational database import to BigQuery with Dataflow and Sensitive Data Protection](https://github.com/GoogleCloudPlatform/dlp-rdb-bq-import)
: Use Dataflow and Sensitive Data Protection to securely tokenize and import
  data from a relational database to BigQuery. This example
  describes how to tokenize PII data before it's made persistent.

Pricing
-------

When you inspect a BigQuery table, you incur Sensitive Data Protection costs
according to the [storage inspection job pricing](/sensitive-data-protection/pricing#storage-pricing).

In addition, when you save inspection findings to a BigQuery table,
[BigQuery charges](/bigquery/pricing#data_ingestion_pricing) apply.
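The following is the minimal query sketch referenced in the
*Querying Sensitive Data Protection findings in BigQuery* entry earlier on
this page. It is a sketch only, assuming findings were exported to the
hypothetical `my_dataset.dlp_findings` table from the inspection example
above, using the standard findings output schema in which each row includes
an `info_type` record and a `likelihood` value.

```python
# Minimal sketch: summarize exported Sensitive Data Protection findings
# with the BigQuery client library. Project and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

query = """
    SELECT
      info_type.name AS info_type_name,
      likelihood,
      COUNT(*) AS findings
    FROM `my-project.my_dataset.dlp_findings`
    GROUP BY info_type_name, likelihood
    ORDER BY findings DESC
"""

# Count findings per infoType and likelihood, most frequent first.
for row in client.query(query).result():
    print(f"{row.info_type_name} ({row.likelihood}): {row.findings} findings")
```

Keep in mind that running queries like this one is itself billable
BigQuery usage, separate from the Sensitive Data Protection inspection costs
described above.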