This document explains how to create and manage sinks, which route log entries that originate in a Google Cloud project to supported destinations.
When the destination of a sink isn't a log bucket in the Google Cloud project in which a log entry originates, a service account is required. Cloud Logging automatically creates and manages this service account; however, you might need to modify the permissions granted to the service account. You can create and manage a service account that is used by sinks in multiple projects. For more information, see Configure log sinks with user-managed service accounts.
Overview
Sinks determine how Cloud Logging routes log entries. By using sinks, you can route some or all of your log entries to the following destinations:
- Cloud Logging bucket: Provides storage in Cloud Logging. A log bucket can store log entries that are received by multiple Google Cloud projects. The log bucket can be in the same project in which log entries originate, or in a different project. For information about viewing log entries stored in log buckets, see Query and view logs overview and View logs routed to Cloud Logging buckets.

  You can combine your Cloud Logging data with other data by upgrading a log bucket to use Log Analytics, and then creating a linked dataset, which is a read-only dataset that can be queried by the BigQuery Studio and Looker Studio pages.
- BigQuery dataset: Provides storage of log entries in a writeable BigQuery dataset. The BigQuery dataset can be in the same project in which log entries originate, or in a different project. You can use big data analysis capabilities on the stored log entries. For information about viewing log entries routed to BigQuery, see View logs routed to BigQuery.
- Cloud Storage bucket: Provides storage of log entries in Cloud Storage. The Cloud Storage bucket can be in the same project in which log entries originate, or in a different project. Log entries are stored as JSON files. For information about viewing log entries routed to Cloud Storage, see View logs routed to Cloud Storage.
- Pub/Sub topic: Provides support for third-party integrations. Log entries are formatted into JSON and then routed to a Pub/Sub topic. The topic can be in the same project in which log entries originate, or in a different project. For information about viewing log entries routed to Pub/Sub, see View logs routed to Pub/Sub.
- Google Cloud project: Routes log entries to another Google Cloud project. In this configuration, the sinks in the destination project process the log entries.
Sinks belong to a given Google Cloud resource: a Google Cloud project, a billing account, a folder, or an organization. When the resource receives a log entry, every sink in the resource processes the log entry. When a log entry matches the filters of the sink, then the log entry is routed to the sink's destination.
Typically, sinks only route the log entries that originate in a resource. However, for folders and organizations you can create aggregated sinks, which route log entries from the folder or organization, and the resources it contains. This document doesn't discuss aggregated sinks. For more information, see Collate and route organization-level logs to supported destinations.
To create and manage sinks, you can use the Google Cloud console, the Cloud Logging API, and the Google Cloud CLI. We recommend that you use the Google Cloud console:
- The Logs Router page lists all sinks and provides options to manage your sinks.
- When creating a sink, you can preview which log entries are matched by the sink's filters.
- You can configure sink destinations when creating a sink.
- Some authorization steps are completed for you.
Before you begin
The instructions in this document describe creating and managing sinks at the Google Cloud project level. You can use the same procedure to create a sink that routes log entries that originate in an organization, folder, or billing account.
To get started, do the following:
- Enable the Cloud Logging API.
- Ensure that your Google Cloud project contains log entries that you can see in the Logs Explorer.
- To get the permissions that you need to create, modify, or delete a sink, ask your administrator to grant you the Logs Configuration Writer (roles/logging.configWriter) IAM role on your project. For more information about granting roles, see Manage access to projects, folders, and organizations. You might also be able to get the required permissions through custom roles or other predefined roles.

  For information about granting IAM roles, see the Logging Access control guide.
- Ensure that you have a resource in a supported destination or have the ability to create one.

  To route log entries to a destination, the destination must exist before you create the sink. You can create the destination in any Google Cloud project in any organization.
- Before you create a sink, review the limitations that apply for the sink destination. For more information, see the Destination limitations section in this document.
Select the tab for how you plan to use the samples on this page:
Console
When you use the Google Cloud console to access Google Cloud services and APIs, you don't need to set up authentication.
gcloud
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
REST
To use the REST API samples on this page in a local development environment, you use the credentials you provide to the gcloud CLI.
Install the Google Cloud CLI, then initialize it by running the following command:
gcloud init
For more information, see Authenticate for using REST in the Google Cloud authentication documentation.
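As a sketch of what this looks like in practice, the REST samples can reuse the gcloud CLI's credentials by generating a short-lived access token and attaching it as a bearer token; PROJECT_ID is a placeholder:

```shell
# Print a short-lived OAuth 2.0 access token for the active gcloud account,
# then attach it to a Logging API request as a bearer token.
TOKEN="$(gcloud auth print-access-token)"
curl -H "Authorization: Bearer ${TOKEN}" \
  "https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks"
```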
Create a sink
Following are the instructions for creating a sink in a Google Cloud project. You can use the same procedure to route log entries that originate in an organization, folder, or billing account:
- You can create up to 200 sinks per Google Cloud project.
- Don't put sensitive information in sink filters. Sink filters are treated as service data.
- New sinks to Cloud Storage buckets might take several hours to start routing log entries. Sinks to Cloud Storage are processed hourly while other destination types are processed in real time.
- Sinks can't route log entries to linked BigQuery datasets, which are read-only. If you want to route log entries to BigQuery, the destination dataset must be write-enabled.
- Sinks don't define the schema for BigQuery datasets. Instead, the first log entry received by BigQuery determines the schema for the destination table. For more information, see BigQuery schema for routed logs.
For information about how to view the log entries in a sink's destination, see View logs routed to Cloud Logging buckets.
To view the number and volume of log entries that are routed, view the logging.googleapis.com/exports/ metrics.

When a query contains multiple statements, you can either specify how those statements are joined or rely on Cloud Logging implicitly adding the conjunctive restriction, AND, between the statements. For example, suppose a query or filter dialog contains two statements, resource.type = "gce_instance" and severity >= "ERROR". The actual query is resource.type = "gce_instance" AND severity >= "ERROR". Cloud Logging supports both disjunctive restrictions, OR, and conjunctive restrictions, AND. When you use OR statements, we recommend that you group the clauses with parentheses. For more information, see the Logging query language.
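As an illustrative sketch (the sink, bucket, and project names are hypothetical), an inclusion filter that groups OR clauses with parentheses might be passed to a sink like this:

```shell
# Parentheses make the OR grouping explicit; without them, the implicit
# AND between statements can join clauses in ways that are easy to misread.
gcloud logging sinks create my-error-sink \
  logging.googleapis.com/projects/my-project/locations/global/buckets/my-bucket \
  --log-filter='(resource.type="gce_instance" OR resource.type="cloud_run_revision") AND severity>="ERROR"'
```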
To create a sink, do the following:
Console
- In the Google Cloud console, go to the Log Router page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.
- Select the Google Cloud project in which the log entries that you want to route originate.

  For example, if you want to route your Data Access log entries from the project named Project-A to a log bucket in the project named Project-B, then select Project-A.
- Select Create sink.
In the Sink details panel, enter the following details:
- Sink name: Provide an identifier for the sink. Note that after you create the sink, you can't rename the sink, but you can delete it and create a new sink.
- Sink description (optional): Describe the purpose or use case for the sink.
In the Sink destination panel, select the sink service and destination by using the Select sink service menu. Do one of the following:
To route log entries to a service that is in the same Google Cloud project, select one of the following options:
- Cloud Logging bucket: Select or create a Logging bucket.
- BigQuery dataset: Select or create the writeable dataset to receive the routed log entries. You also have the option to use partitioned tables.
- Cloud Storage bucket: Select or create the particular Cloud Storage bucket to receive the routed log entries.
- Pub/Sub topic: Select or create the particular topic to receive the routed log entries.
- Splunk: Select the Pub/Sub topic for your Splunk service.
To route log entries to a different Google Cloud project, select Google Cloud project, and then enter the fully-qualified name for the destination. For information about the syntax, see the Destination path formats.
To route log entries to a service that is in a different Google Cloud project, do the following:
- Select Other resource.
- Enter the fully-qualified name for the destination. For information about the syntax, see the Destination path formats.
Specify the log entries to include:
- Go to the Choose logs to include in sink panel.
- In the Build inclusion filter field, enter a filter expression that matches the log entries you want to include. To learn more about the syntax for writing filters, see Logging query language.

  If you don't set a filter, all log entries from your selected resource are routed to the destination.

  For example, to route all Data Access log entries to a Logging bucket, you can use the following filter:

  log_id("cloudaudit.googleapis.com/data_access") OR log_id("externalaudit.googleapis.com/data_access")

  The length of a filter can't exceed 20,000 characters.
- To verify you entered the correct filter, select Preview logs. The Logs Explorer opens in a new tab with the filter pre-populated.
(Optional) Configure an exclusion filter to eliminate some of the included log entries:
- Go to the Choose logs to filter out of sink panel.
- In the Exclusion filter name field, enter a name.
- In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.

You can create up to 50 exclusion filters per sink. Note that the length of a filter can't exceed 20,000 characters.
Select Create sink.
Grant the service account for the sink the permission to write log entries to your sink's destination. For more information, see Set destination permissions.
gcloud
To create a sink, do the following:
Run the following gcloud logging sinks create command:

gcloud logging sinks create SINK_NAME SINK_DESTINATION

Before running the command, make the following replacements:

- SINK_NAME: The name of the log sink. You can't change the name of a sink after you create it.
- SINK_DESTINATION: The service or project to where you want your log entries routed. Set SINK_DESTINATION with the appropriate path, as described in Destination path formats.

  For example, if your sink destination is a Pub/Sub topic, then SINK_DESTINATION looks like the following:

  pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID

You can also provide the following options:

- --log-filter: Use this option to set a filter that matches the log entries you want to include in your sink. If you don't provide a value for the inclusion filter, then this filter matches all log entries.
- --exclusion: Use this option to set an exclusion filter for log entries that you want your sink to exclude from routing. You can also use the sample function to select a portion of the log entries to exclude. This option can be repeated; you can create up to 50 exclusion filters per sink.
- --description: Use this option to describe the purpose or use case for the sink.

For example, to create a sink to a Logging bucket, your command might look like this:

gcloud logging sinks create my-sink \
  logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' \
  --description="My first sink"
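A hedged sketch of combining these options (the sink name, bucket, and filters are hypothetical, and the --exclusion key=value syntax is an assumption based on the gcloud reference):

```shell
# Create a sink that includes GCE instance logs but excludes DEBUG-and-below
# entries; --exclusion can be repeated, up to 50 times per sink.
gcloud logging sinks create my-filtered-sink \
  logging.googleapis.com/projects/my-project/locations/global/buckets/my-bucket \
  --log-filter='resource.type="gce_instance"' \
  --exclusion='name=drop-debug,filter=severity<=DEBUG' \
  --description="GCE logs without debug entries"
```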
For more information on creating sinks using the Google Cloud CLI, see the gcloud logging sinks reference.

If the command response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions. You don't need to set destination permissions when the response doesn't contain a JSON key labeled "writerIdentity".
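One way to check for a writer identity after the fact (a sketch; the sink name is hypothetical, and --format uses the gcloud projection syntax):

```shell
# Print only the writerIdentity field of the sink, if one is set.
gcloud logging sinks describe my-sink --format='value(writerIdentity)'
```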
REST
To create a logging sink in your Google Cloud project, use projects.sinks.create in the Logging API. In the LogSink object, provide the appropriate required values in the method request body:

- name: An identifier for the sink. Note that after you create the sink, you can't rename the sink, but you can delete it and create a new sink.
- destination: The service and destination to where you want your log entries routed. To route log entries to a different project, or to a destination that is in another project, set the destination field with the appropriate path, as described in Destination path formats.

  For example, if your sink destination is a Pub/Sub topic, then the destination looks like the following:

  pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID

In the LogSink object, provide the appropriate optional information:

- filter: Set the filter field to match the log entries you want to include in your sink. If you don't set a filter, all log entries from your Google Cloud project are routed to the destination. Note that the length of a filter can't exceed 20,000 characters.
- exclusions: Set this field to match the log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink.
- description: Set this field to describe the purpose or use case for the sink.

Call projects.sinks.create to create the sink.

If the API response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions. You don't need to set destination permissions when the API response doesn't contain a JSON key labeled "writerIdentity".
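As a sketch of the request shape (PROJECT_ID, the sink name, destination, and filter are placeholder values), a projects.sinks.create call with curl might look like this:

```shell
# Create a sink via the Logging API v2; the request body is a LogSink object.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks" \
  -d '{
    "name": "my-sink",
    "destination": "logging.googleapis.com/projects/PROJECT_ID/locations/global/buckets/my-bucket",
    "filter": "severity>=ERROR"
  }'
```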
For more information on creating sinks using the
Logging API, see the LogSink
reference.
If you receive error notifications, then see Troubleshoot routing and sinks.
Destination path formats
If you route log entries to a service that is in another project, then you must provide the sink with the fully-qualified name for the service. Similarly, if you route log entries to a different Google Cloud project, then you must provide the sink with the fully-qualified name of the destination project:
- Cloud Logging log bucket: logging.googleapis.com/projects/DESTINATION_PROJECT_ID/locations/LOCATION/buckets/BUCKET_NAME
- Another Google Cloud project: logging.googleapis.com/projects/DESTINATION_PROJECT_ID
- BigQuery dataset: bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
- Cloud Storage bucket: storage.googleapis.com/BUCKET_NAME
- Pub/Sub topic: pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
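The path formats above are plain strings, so they can be assembled from shell variables; the IDs here are hypothetical placeholders:

```shell
# Assemble a Pub/Sub destination path from its components.
PROJECT_ID="my-project"
TOPIC_ID="my-topic"
DESTINATION="pubsub.googleapis.com/projects/${PROJECT_ID}/topics/${TOPIC_ID}"
echo "${DESTINATION}"
```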
Manage sinks
After your sinks are created, you can perform the following actions on them. Any changes made to a sink might take a few minutes to apply:
- View details
- Update
- Disable
  - You can't disable the _Required sink.
  - You can disable the _Default sink to stop it from routing log entries to the _Default Logging bucket.
  - If you want to disable the _Default sink for any new Google Cloud projects or folders created in your organization, then consider configuring default resource settings.
- Delete
  - You can't delete the _Default or the _Required sinks.
  - When you delete a sink, it no longer routes log entries.
  - If the sink has a dedicated service account, then deleting that sink also deletes the service account. Sinks created before May 22, 2023 have dedicated service accounts. Sinks created on or after May 22, 2023 have a shared service account. Deleting the sink doesn't delete the shared service account.
- Troubleshoot failures
- View log volume and error rates
Following are the instructions for managing a sink in a Google Cloud project. Instead of a Google Cloud project, you can specify a billing account, folder, or organization:
Console
- In the Google Cloud console, go to the Log Router page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.
- In the toolbar, select the resource that contains your sink. The resource can be a project, folder, organization, or billing account.

The Log Router page displays the sinks in the selected resource. Each table row contains information about a sink's properties:

- Enabled: Indicates if the sink's state is enabled or disabled.
- Type: The sink's destination service; for example, Cloud Logging bucket.
- Name: The sink's identifier, as provided when the sink was created; for example, _Default.
- Description: The sink's description, as provided when the sink was created.
- Destination: Full name of the destination to which the routed log entries are sent.
- Created: The date and time that the sink was created.
- Last updated: The date and time that the sink was last edited.
For each table row, the more_vert More actions menu provides the following options:
- View sink details: Displays the sink's name, description, destination service, destination, and inclusion and exclusion filters. Selecting Edit opens the Edit Sink panel.
- Edit sink: Opens the Edit Sink panel where you can update the sink's parameters.
- Disable sink: Lets you disable the sink and stop routing log entries to the sink's destination. For more information on disabling sinks, see Stop storing logs in log buckets.
- Enable sink: Lets you enable a disabled sink and restart routing log entries to the sink's destination.
- Delete sink: Lets you delete the sink and stop routing log entries to the sink's destination.
- Troubleshoot sink: Opens the Logs Explorer where you can troubleshoot errors with the sink.
- View sink log volume and error rates: Opens the Metrics Explorer where you can view and analyze data from the sink.
To sort the table by a column, select the column name.
gcloud
To view your list of sinks for your Google Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

gcloud logging sinks list

To view your list of aggregated sinks, use the appropriate option to specify the resource that contains the sink. For example, if you created the sink at the organization level, use the --organization=ORGANIZATION_ID option to list the sinks for the organization.

To describe a sink, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

gcloud logging sinks describe SINK_NAME

To update a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update. You can update a sink to change the destination, filters, and description, or to disable or re-enable the sink:

gcloud logging sinks update SINK_NAME NEW_DESTINATION --log-filter=NEW_FILTER

Omit NEW_DESTINATION or --log-filter if those parts don't change.

For example, to update the destination of your sink named my-project-sink to a new Cloud Storage bucket destination named my-second-gcs-bucket, your command looks like this:

gcloud logging sinks update my-project-sink storage.googleapis.com/my-second-gcs-bucket

To disable a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update, and include the --disabled option:

gcloud logging sinks update SINK_NAME --disabled

To reenable the sink, use the gcloud logging sinks update command, remove the --disabled option, and include the --no-disabled option:

gcloud logging sinks update SINK_NAME --no-disabled

To delete a sink, use the gcloud logging sinks delete command, which corresponds to the API method projects.sinks.delete:

gcloud logging sinks delete SINK_NAME

For more information on managing sinks using the Google Cloud CLI, see the gcloud logging sinks reference.
REST
To view the sinks for your Google Cloud project, call projects.sinks.list.

To view a sink's details, call projects.sinks.get.

To update a sink, call projects.sinks.update. You can update a sink's destination, filters, and description. You can also disable or re-enable the sink.

To disable a sink, set the disabled field in the LogSink object to true, and then call projects.sinks.update.

To reenable the sink, set the disabled field in the LogSink object to false, and then call projects.sinks.update.

To delete a sink, call projects.sinks.delete.

For more information about managing sinks by using the Logging API, see the LogSink reference.
Stop storing log entries in log buckets
You can disable the _Default
sink and any user-defined sinks. When you
disable a sink, the sink stops routing log entries to its destination.
For example, if you disable the _Default
sink, then no log entries are
routed to the _Default
bucket. The
_Default
bucket becomes empty when all of the previously stored log entries
have fulfilled the bucket's
retention period.
The following instructions illustrate how to
disable your Google Cloud project sinks that route log entries to the
_Default
log buckets:
Console
- In the Google Cloud console, go to the Log Router page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.
- To find all the sinks that route log entries to the _Default log bucket, filter the sinks by destination, and then enter _Default.
- For each sink, select more_vert Menu and then select Disable sink.

The sinks are now disabled and your Google Cloud project sinks no longer route log entries to the _Default bucket.
To reenable a disabled sink and restart routing log entries to the sink's destination, do the following:
- In the Google Cloud console, go to the Log Router page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.
- To find all the sinks that route log entries to the _Default log bucket, filter the sinks by destination, and then enter _Default.
- For each sink, select more_vert Menu and then select Enable sink.
gcloud
To view your list of sinks for your Google Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

gcloud logging sinks list

Identify any sinks that are routing to the _Default log bucket. To describe a sink, including seeing the destination name, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

gcloud logging sinks describe SINK_NAME

Run the gcloud logging sinks update command and include the --disabled option. For example, to disable the _Default sink, use the following command:

gcloud logging sinks update _Default --disabled

The _Default sink is now disabled; it no longer routes log entries to the _Default log bucket.
To disable the other sinks in your Google Cloud project that are routing
to the _Default
bucket, repeat the previous steps.
To reenable a sink, use the
gcloud logging sinks update
command, remove the --disabled
option, and include the --no-disabled
option:
gcloud logging sinks update _Default --no-disabled
REST
To view the sinks for your Google Cloud project, call the Logging API method projects.sinks.list.

Identify any sinks that are routing to the _Default bucket.

For example, to disable the _Default sink, set the disabled field in the LogSink object to true, and then call projects.sinks.update.

The _Default sink is now disabled; it no longer routes log entries to the _Default bucket.
To disable the other sinks in your Google Cloud project that are routing
to the _Default
bucket, repeat the previous steps.
To reenable a sink, set the disabled field in the LogSink object to false, and then call projects.sinks.update.
Set destination permissions
This section describes how to grant Logging the Identity and Access Management permissions to write log entries to your sink's destination. For the full list of Logging roles and permissions, see Access control.
Cloud Logging creates a shared service account for a resource when a sink is created, unless the required service account already exists. The service account might exist because the same service account is used for all sinks in the underlying resource. Resources can be a Google Cloud project, an organization, a folder, or a billing account.
The writer identity of a sink is the identifier of the service
account associated with that sink. All sinks have a writer identity except for
sinks that write to a log bucket in the same Google Cloud project in which
the log entry originates. For the latter configuration, a service account
isn't required and therefore the sink's writer identity field
is listed as None
in the console. The
API and the Google Cloud CLI commands don't report a writer identity.
The following instructions apply to projects, folders, organizations, and billing accounts:
Console
Ensure that you have Owner access on the Google Cloud project that contains the destination. If you don't have Owner access to the destination of the sink, then ask a project owner to add the writer identity as a principal.
To get the sink's writer identity (an email address) from the new sink, do the following:

- In the Google Cloud console, go to the Log Router page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.
- In the toolbar, select the project that contains the sink.
- Select more_vert Menu and then select View sink details. The writer identity appears in the Sink details panel.
- If the value of the writerIdentity field contains an email address, then proceed to the next step. When the value is None, you don't need to configure destination permissions for the sink.
- Copy the sink's writer identity into your clipboard.
If the destination is a service in a different project, or if it is another project, then in the toolbar, select the destination project.
Add the service account as an IAM principal in the destination project:
- In the Google Cloud console, go to the IAM page:

  If you use the search bar to find this page, then select the result whose subheading is IAM & Admin.
- Select the destination project.
- Click Grant access.
- Grant the service account the required IAM role:
  - For Cloud Storage destinations, add the sink's writer identity as a principal by using IAM, and then grant it the Storage Object Creator role (roles/storage.objectCreator).
  - For BigQuery destinations, add the sink's writer identity as a principal by using IAM, and then grant it the BigQuery Data Editor role (roles/bigquery.dataEditor).
  - For Pub/Sub destinations, including Splunk, add the sink's writer identity as a principal by using IAM, and then grant it the Pub/Sub Publisher role (roles/pubsub.publisher).
  - For Logging bucket destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Bucket Writer role (roles/logging.bucketWriter).
  - For Google Cloud project destinations, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
gcloud
Ensure that you have Owner access on the Google Cloud project that contains the destination. If you don't have Owner access to the destination of the sink, then ask a project owner to add the writer identity as a principal.
Get the service account from the writerIdentity field in your sink:

gcloud logging sinks describe SINK_NAME

Locate the sink whose permissions you want to modify, and if the sink details contain a line with writerIdentity, then proceed to the next step. When the details don't include a writerIdentity field, you don't need to configure destination permissions for the sink.

The writer identity for the service account looks similar to the following:

serviceAccount:service-123456789012@gcp-sa-logging.iam.gserviceaccount.com
Add the service account as an IAM principal in the destination project:
Before using the following command, make the following replacements:
- PROJECT_ID: The identifier of the project.
- PRINCIPAL: An identifier for the principal that you want to grant the role to. Principal identifiers usually have the following form: PRINCIPAL-TYPE:ID. For example, user:my-user@example.com. For a full list of the formats that PRINCIPAL can have, see Principal identifiers.
- ROLE: An IAM role.
  - For Cloud Storage destinations, add the sink's writer identity as a principal by using IAM, and then grant it the Storage Object Creator role (roles/storage.objectCreator).
  - For BigQuery destinations, add the sink's writer identity as a principal by using IAM, and then grant it the BigQuery Data Editor role (roles/bigquery.dataEditor).
  - For Pub/Sub destinations, including Splunk, add the sink's writer identity as a principal by using IAM, and then grant it the Pub/Sub Publisher role (roles/pubsub.publisher).
  - For Logging bucket destinations in different Google Cloud projects, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Bucket Writer role (roles/logging.bucketWriter).
  - For Google Cloud project destinations, add the sink's writer identity as a principal by using IAM, and then grant it the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
Execute the gcloud projects add-iam-policy-binding command:

gcloud projects add-iam-policy-binding PROJECT_ID --member=PRINCIPAL --role=ROLE
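For example, a hedged sketch of granting the Pub/Sub Publisher role (the destination project ID and service account address are placeholders):

```shell
# Grant the sink's writer identity permission to publish to Pub/Sub topics
# in the destination project.
gcloud projects add-iam-policy-binding my-destination-project \
  --member='serviceAccount:service-123456789012@gcp-sa-logging.iam.gserviceaccount.com' \
  --role='roles/pubsub.publisher'
```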
REST
We recommend that you use the Google Cloud console or the Google Cloud CLI to grant a role to a service account.
Destination limitations
This section describes destination-specific limitations:
- If you route log entries to a log bucket in a different Google Cloud project, then Error Reporting doesn't analyze those log entries. For more information, see Error Reporting overview.
- If you route log entries to BigQuery, the BigQuery dataset must be write-enabled. You can't route log entries to linked datasets, which are read-only.
The following limitations apply when you route your log entries to different Google Cloud projects:

- There is a one-hop limit. For example, if you route log entries from project A to project B, then you can't route the log entries from project B to a different project.
- Audit logs aren't routed to the _Required log bucket in the destination project. For example, if you route log entries from project A to project B, then the _Required log bucket in project A contains the audit logs for project A. The audit logs for project A aren't routed to project B. To route these log entries, create a sink whose destination is a log bucket.
- When the destination project is in a different folder or organization, the aggregated sinks in that folder or organization don't route the log entry. For example, suppose project A is in folder X. When a log entry originates in project A, the log entry is processed by the aggregated sinks in folder X and the sinks in project A. Now suppose that project A contains a sink that routes its log entries to project B, which is in folder Y. The log entries from project A pass through the sinks in project B; however, they don't pass through the aggregated sinks in folder Y.
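A cross-project route like the project A to project B example above is configured entirely on the sink's destination path. The following sketch builds such a sink-creation command; the project, bucket, and sink names are hypothetical placeholders, and the script only prints the command rather than executing it:

```shell
# Sketch: create a sink in project A that routes to a log bucket in project B.
# All names are hypothetical placeholders; the script only prints the command.
SOURCE_PROJECT="project-a"
DEST_PROJECT="project-b"
BUCKET_ID="central-bucket"

# A log-bucket destination uses this path form:
DESTINATION="logging.googleapis.com/projects/$DEST_PROJECT/locations/global/buckets/$BUCKET_ID"

CMD="gcloud logging sinks create cross-project-sink $DESTINATION --project=$SOURCE_PROJECT"
echo "$CMD"
```

Because of the one-hop limit, a sink in project B can't forward these routed entries to a third project.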
To use the Logs Explorer to view the log entries routed to a project by using an aggregated sink, set the Refine scope field to storage scope, and then select a log view that provides access to those log entries.
Code samples
To use client library code to configure sinks in your chosen languages, see Logging client libraries: Log sinks.
Filter examples
Following are some filter examples that are particularly useful when creating sinks. For additional examples that might be useful as you build your inclusion filters and exclusion filters, see Sample queries.
Restore the _Default sink filter
If you edited the filter for the _Default
sink, then you might want to restore
this sink to its original configuration. When created, the _Default
sink is
configured with the following inclusion filter and an empty exclusion filter:
NOT log_id("cloudaudit.googleapis.com/activity") AND NOT \
log_id("externalaudit.googleapis.com/activity") AND NOT \
log_id("cloudaudit.googleapis.com/system_event") AND NOT \
log_id("externalaudit.googleapis.com/system_event") AND NOT \
log_id("cloudaudit.googleapis.com/access_transparency") AND NOT \
log_id("externalaudit.googleapis.com/access_transparency")
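To reapply this inclusion filter, one approach is to pass it to the gcloud logging sinks update command. The sketch below stores the filter in a variable and only prints the update command for review rather than executing it:

```shell
# Sketch: restore the _Default sink's original inclusion filter.
# The script only prints the update command rather than executing it.
FILTER='NOT log_id("cloudaudit.googleapis.com/activity") AND NOT log_id("externalaudit.googleapis.com/activity") AND NOT log_id("cloudaudit.googleapis.com/system_event") AND NOT log_id("externalaudit.googleapis.com/system_event") AND NOT log_id("cloudaudit.googleapis.com/access_transparency") AND NOT log_id("externalaudit.googleapis.com/access_transparency")'

CMD="gcloud logging sinks update _Default --log-filter='$FILTER'"
echo "$CMD"
```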
Exclude Google Kubernetes Engine container and pod logs
To exclude Google Kubernetes Engine container and pod log entries for GKE system namespaces, use the following filter:
resource.type = ("k8s_container" OR "k8s_pod")
resource.labels.namespace_name = (
"cnrm-system" OR
"config-management-system" OR
"gatekeeper-system" OR
"gke-connect" OR
"gke-system" OR
"istio-system" OR
"knative-serving" OR
"monitoring-system" OR
"kube-system")
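One way to apply such a filter is as an exclusion on a sink. The sketch below assumes the --add-exclusion flag of gcloud logging sinks update, uses a hypothetical exclusion name and an abbreviated form of the namespace list above, and only prints the command rather than executing it:

```shell
# Sketch: attach a GKE system-namespace filter as an exclusion on _Default.
# The exclusion name is hypothetical, and the namespace list is abbreviated;
# the script only prints the command rather than executing it.
FILTER='resource.type=("k8s_container" OR "k8s_pod") AND resource.labels.namespace_name=("kube-system" OR "gke-system" OR "istio-system")'

CMD="gcloud logging sinks update _Default --add-exclusion=name=gke-system-logs,filter='$FILTER'"
echo "$CMD"
```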
To exclude Google Kubernetes Engine node log entries for GKE system logNames, use the following filter:
resource.type = "k8s_node"
logName:( "logs/container-runtime" OR
"logs/docker" OR
"logs/kube-container-runtime-monitor" OR
"logs/kube-logrotate" OR
"logs/kube-node-configuration" OR
"logs/kube-node-installation" OR
"logs/kubelet" OR
"logs/kubelet-monitor" OR
"logs/node-journal" OR
"logs/node-problem-detector")
To view the volume of Google Kubernetes Engine node, pod, and container log entries stored in log buckets, use Metrics Explorer.
Exclude Dataflow logs not required for supportability
To exclude Dataflow log entries that aren't required for supportability, use the following filter:
resource.type="dataflow_step"
labels."dataflow.googleapis.com/log_type"!="system" AND labels."dataflow.googleapis.com/log_type"!="supportability"
To view the volume of Dataflow logs stored in log buckets, use Metrics Explorer.
Supportability
Although Cloud Logging lets you exclude log entries and prevent them from being stored in a log bucket, you might want to consider keeping log entries that help with supportability. Using these log entries can help you troubleshoot and identify issues with your applications.
For example, GKE system log entries are useful to troubleshoot your GKE applications and clusters because they are generated for events that happen in your cluster. These log entries can help you determine if your application code or the underlying GKE cluster is causing your application error. GKE system logs also include Kubernetes Audit Logging generated by the Kubernetes API Server component, which includes changes made using the kubectl command and Kubernetes events.
For Dataflow, we recommend that you, at a minimum, write your system logs (labels."dataflow.googleapis.com/log_type"="system") and supportability logs (labels."dataflow.googleapis.com/log_type"="supportability") to log buckets. These logs are essential for developers to observe and troubleshoot their Dataflow pipelines; without them, users might not be able to use the Dataflow Job details page to view job logs.
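As a sketch of that recommendation, the following inclusion filter keeps only the Dataflow system and supportability logs; use it with a sink whose destination is a log bucket. The script only echoes the filter so you can review it:

```shell
# Sketch: an inclusion filter that keeps only Dataflow system and
# supportability logs, for use with a sink that routes to a log bucket.
FILTER='resource.type="dataflow_step" AND (labels."dataflow.googleapis.com/log_type"="system" OR labels."dataflow.googleapis.com/log_type"="supportability")'
echo "$FILTER"
```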
What's next
If you encounter issues as you use sinks to route log entries, see Troubleshoot routing logs.
To learn how to view log entries in their destinations, as well as how the logs are formatted and organized, see View logs in sink destinations.
To learn more about querying and filtering with the Logging query language, see Logging query language.