# Control Dataflow log ingestion

Last updated (UTC): 2025-08-18.

[Exclusion filters](/logging/docs/routing/overview#exclusions) let you
control the volume of Dataflow logs ingested by Cloud Logging while
still making verbose logging available for debugging.
You can use exclusion
filters to exclude matching log entries from being ingested
by Cloud Logging or from being routed to the destination of the
[sink](/logging/docs/routing/overview#sinks).
Create exclusion filters by using the
[Logging query language](/logging/docs/view/logging-query-language),
which lets you specify a subset of all log entries in
your selected Google Cloud resource, such as a project or a folder.

By using exclusion filters, you can reduce the Cloud Logging
costs incurred by Dataflow log ingestion. For more information about
log ingestion pricing for Cloud Logging, see the
[Cloud Logging pricing summary](/stackdriver/pricing).
For more details about how exclusion filters work and their limitations, see
[Exclusion filters](/logging/docs/routing/overview#exclusions) in the
Cloud Logging documentation.

Dataflow jobs emit multiple [log types](/logging#log-types).
This page demonstrates how to filter Dataflow job logs and worker logs.

Create log exclusion filters
----------------------------

This example creates an exclusion filter on the
[`_Default` Cloud Logging sink](/logging/docs/routing/overview#sinks). The
filter excludes all `DEFAULT`, `DEBUG`, `INFO`, and `NOTICE` severity
Dataflow logs from being ingested into Cloud Logging. `WARNING`,
`ERROR`, `CRITICAL`, `ALERT`, and `EMERGENCY` severity logs are still
captured. For more information about supported log levels, see
[LogSeverity](/logging/docs/reference/v2/rest/v2/LogEntry#logseverity).

### Before you begin

- Sign in to your Google Cloud account. If you're new to Google Cloud,
  [create an account](https://console.cloud.google.com/freetrial) to evaluate
  how our products perform in real-world scenarios.
New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page,
  select or create a Google Cloud project.

  | **Note**: If you don't plan to keep the resources that you create in
  this procedure, create a project instead of selecting an existing project.
  After you finish these steps, you can delete the project, removing all
  resources associated with the project.

  [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)
- [Verify that billing is enabled for your Google Cloud project](/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).

### Permissions

As you get started, ensure the following:

- You have a Google Cloud project with logs that you can see in the
  [Logs Explorer](/logging/docs/view/logs-explorer-summary).

- You have one of the following IAM roles for the source
  Google Cloud project from which you're routing logs.

  - **Owner** (`roles/owner`)
  - **Logging Admin** (`roles/logging.admin`)
  - **Logs Configuration Writer** (`roles/logging.configWriter`)

  The permissions contained in these roles let you create, delete, or
  modify sinks.
  For information on setting IAM roles, see the Logging
  [Access control guide](/logging/docs/access-control).
- You have a resource in a [supported destination](#supported-destinations) or
  can create one.

  You need to create the routing destination before the sink, through
  either the Google Cloud CLI, the Google Cloud console, or the
  Google Cloud APIs. You can create the destination in any Google Cloud
  project in any organization. Before you create the destination, make sure
  the service account from the sink has
  [permissions to write to the destination](#dest-auth).

### Add an exclusion filter

The following steps demonstrate how to add a Cloud Logging exclusion filter
to your Dataflow logs. This exclusion filter selects all Dataflow
log entries with the severity `DEFAULT`, `DEBUG`, `INFO`, or `NOTICE` from
jobs whose Dataflow job name doesn't end in the string `debug`. The filter
excludes these logs from ingestion into the `_Default` Cloud Logging bucket.

1. In the Google Cloud console, go to the **Logs Router** page:

   [Go to Logs Router](https://console.cloud.google.com/logs/router)

2. Find the row with the `_Default` sink, expand the
   more_vert **Actions** option, and then click **Edit sink**.

3. In **Choose logs to filter out of sink**, for **Build an exclusion filter**,
   click add **Add exclusion**.

4. Enter a name for your exclusion filter.

5. In the **Build an exclusion filter** section, paste the following text into
   the box:

       resource.type="dataflow_step" AND
       labels."dataflow.googleapis.com/job_name"!~".*debug" AND
       severity=(DEFAULT OR DEBUG OR INFO OR NOTICE)

   - The first line selects all log entries generated by the
     Dataflow service.
   - The second line selects all log entries where the `job_name` field
     doesn't end with the string `debug`.
   - The third line selects all log entries with the severity `DEFAULT`,
     `DEBUG`, `INFO`, or `NOTICE`.

6. Click **Update sink**.

Test your exclusion filter
--------------------------

You can verify that the filter is working correctly by running a sample
Dataflow job and then viewing the logs.

After your job starts running, to view job logs, complete the following steps:

1. In the Google Cloud console, go to the Dataflow **Jobs** page.

   [Go to Jobs](https://console.cloud.google.com/dataflow/jobs)

   A list of Dataflow jobs appears along with their status.

2. Select a job.

3. On the **Job details** page, in the **Logs** panel, click
   *segment* **Show**.

4. Verify that no logs appear in the **Job logs** panel and that no `DEFAULT`,
   `DEBUG`, `INFO`, or `NOTICE` logs appear in the **Worker logs** panel.

Bypass the exclusion filter
---------------------------

The Dataflow job name (`job_name`) provides a bypass mechanism for
scenarios where the generated Dataflow logs need to be captured. You can
use this bypass to rerun a failed job and capture all of the log information.

The filter created in this scenario retains all log entries when the
`job_name` field ends with the string `debug`. When you want to bypass the
exclusion filter and display all logs for a Dataflow job, append `debug`
to the job name. For example, to bypass the exclusion filter, you could use
the job name `dataflow-job-debug`.

Compare log counts
------------------

If you want to compare the volume of logs ingested with and without the
exclusion filter, run one job with `debug` appended to the job name and one
without. Use the system-defined, logs-based metric
[Log bytes](/monitoring/api/metrics_gcp_i_o#logging/byte_count) to view and
compare the ingestion data.
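For example, the two comparison runs might be launched from the public
Word Count Dataflow template, as sketched below. The job names, the
region, and the `MY_BUCKET` output path are illustrative placeholders.

```shell
# Run once with a job name that the exclusion filter matches...
gcloud dataflow jobs run dataflow-job \
    --gcs-location=gs://dataflow-templates/latest/Word_Count \
    --region=us-central1 \
    --parameters=inputFile=gs://dataflow-samples/shakespeare/kinglear.txt,output=gs://MY_BUCKET/results/output

# ...and once with "debug" appended to the job name, which bypasses
# the exclusion filter so that all log entries are ingested.
gcloud dataflow jobs run dataflow-job-debug \
    --gcs-location=gs://dataflow-templates/latest/Word_Count \
    --region=us-central1 \
    --parameters=inputFile=gs://dataflow-samples/shakespeare/kinglear.txt,output=gs://MY_BUCKET/results/output
```

After both jobs finish, the difference in the Log bytes metric between the
two runs indicates how much ingestion the exclusion filter avoids.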
For more information about viewing ingestion data, see
[View ingestion data in Metrics Explorer](/stackdriver/estimating-bills#metric-exp-usage).

Create an external destination
------------------------------

Optionally, after you create the exclusion filter, you can create an
additional Cloud Logging sink. Use this sink to redirect the complete set of
Dataflow logs to a
[supported external destination](/logging/docs/routing/overview#destinations),
such as BigQuery, Pub/Sub, or Splunk.

In this scenario, the external logs aren't stored in Logs Explorer but are
available in the external destination. Using an external destination gives you
more control over the costs incurred by storing logs in Logs Explorer.

For steps detailing how to control how Cloud Logging routes logs, see
[Configure and manage sinks](/logging/docs/export/configure_export_v2).
To capture all Dataflow logs in an external destination,
in the **Choose logs to include in sink** panel, in the
**Build inclusion filter** field, enter the following filter expression:

    resource.type="dataflow_step"

To find log entries that you routed from Cloud Logging to supported
destinations, see
[View logs in sink destinations](/logging/docs/export/using_exported_logs).

Track Dataflow log messages by severity
---------------------------------------

Exclusion filters don't apply to
[user-defined logs-based metrics](/logging/docs/logs-based-metrics#user-metrics).
These metrics count the number of log entries that match a given filter or
record particular values within the matching log entries.
To track counts of Dataflow log messages based on severity, you can
create a logs-based metric for the Dataflow logs. The logs
are tracked even when the log messages are excluded from ingestion.

You're billed for user-defined logs-based metrics.
For pricing information, see
[Chargeable metrics](/stackdriver/pricing#metrics-chargeable).

To configure user-defined logs-based metrics, see
[Create a counter metric](/logging/docs/logs-based-metrics/counter-metrics#create_a_counter_metric).
To track the Dataflow logs, in the **Filter selection** section,
in the **Build filter** box, enter the following text:

    resource.type="dataflow_step"
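As a sketch, the same kind of counter metric can also be created with the
Google Cloud CLI; the metric name and description below are illustrative
placeholders.

```shell
# Create a user-defined counter metric that counts every Dataflow
# log entry, including entries that an exclusion filter keeps out
# of ingestion.
gcloud logging metrics create dataflow-log-entries \
    --description="Count of all Dataflow log entries" \
    --log-filter='resource.type="dataflow_step"'
```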