[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-08-04。"],[],[],null,["# Cloud Logging overview\n\nThis document provides an overview of Cloud Logging, which is a real-time\nlog-management system with storage, search, analysis, and monitoring support.\nCloud Logging automatically collects log data from Google Cloud resources.\nYour applications, on-premise resources,\nand resources from other cloud providers can send log data to Cloud Logging.\nYou can also configure alerting policies so that Cloud Monitoring notifies\nyou when certain kinds of events are reported in your log data. For regulatory\nor security reasons, you can determine where your log data is stored.\n\nCollect log data from your applications and third-party software\n----------------------------------------------------------------\n\nYou can collect log data from applications that you write by using a\n[client library](/logging/docs/reference/libraries) to instrument your\napplication. However, it's not always\nnecessary to instrument your application. 
For example, for some
configurations you can use the [Ops Agent](/logging/docs/agent/ops-agent)
to send log data that was written to
`stdout` or `stderr` to your Google Cloud project.

You can also collect log data from third-party applications, like `nginx`,
by installing the Ops Agent and then configuring it to write log data
from that application to your Google Cloud project.

See [Which should you use: Logging agent or client library?](/logging/docs/agent-or-library)
for information that can help you decide which approach best suits your
requirements.

Query, view, and analyze log data
---------------------------------

To view and analyze your log data, use the Logs Explorer or Log Analytics
pages of the Google Cloud console:

- Logs Explorer: This interface lets you view individual
  log entries, and find and view related log entries. The interface also
  annotates log entries when they are part of an [error group](/error-reporting/docs/grouping-errors)
  or when trace data is available. We recommend this interface when you
  want to troubleshoot your services and applications.

- Log Analytics: This interface lets you query your log data with
  [SQL](/bigquery/docs/reference/standard-sql/query-syntax) and find trends and patterns in that data.
  For example, you can compute the average latency for HTTP requests issued
  to a specific URL over time, and monitor whether that latency fluctuates.

For more information, see [Query and view log data](/logging/docs/log-analytics).

Visualize and monitor your log data
-----------------------------------

You can configure Cloud Logging to notify you when certain kinds of events
occur in your log data.
These notifications might be sent when a particular
pattern appears in a log entry, or when a trend is detected in your log data.
If you want to view the error rates of your Google Cloud services,
you can use the preconfigured Cloud Logging dashboard.

To get notified when a particular message is part of a log entry,
like when a critical security-related event occurs,
create a log-based alerting policy. These policies are useful for important
but rare events, like the following:

- An event appears in an audit log entry. For example, a user accesses the security key of a service account.
- A deployment message is in a log entry.

To get notified when a trend occurs in your log data, create a log-based metric
and monitor that metric with an alerting policy.
A *log-based metric* either counts the number of log entries that match some
criterion, or extracts and organizes information, like response times,
into histograms. Log-based metrics are suitable when you want
to do any of the following:

- Monitor the count of occurrences of a message in your log data, like the number of log entries that record a status of error.
- Observe trends in your data, like latency values in your log data, and receive a notification if the values change in an unacceptable way.
- Create charts to display the numeric data extracted from your log data.

For more information, see [Monitor your log data](/logging/docs/alerting/monitoring-logs).

Log storage and retention
-------------------------

You don't have to configure the location where log data is stored.
By default, Google Cloud projects, billing accounts, folders, and organization
resources automatically store the log data that originates in the resource.
For example, if your Google Cloud project contains a
Compute Engine instance, then the log data that
Compute Engine generates is automatically stored.

You can configure a number of aspects of
your log storage, such as
which log data is stored,
which is discarded, and where the log data is stored.
For more information, see [Store log entries](/logging/docs/store-log-entries).

Log entries are stored for a specified length of time and are
then deleted. For more information, see
[Logs retention periods](/logging/quotas#logs_retention_periods).

Log routing
-----------

You can *route*, or forward, log entries to the following destinations:

- Google Cloud project

- Log bucket

- BigQuery dataset

- Cloud Storage bucket

- Pub/Sub topic, which provides support for third-party integrations like Splunk or Datadog.

When log data is routed, the destination can be in a different
resource from the one where the log data originates. For example, you can route
log data from one project to a log bucket stored in a different project.

For more information, see [Route log entries](/logging/docs/routing/overview).

Categories of log data
----------------------

Log categories are meant to help describe the logging information available
to you; the categories aren't mutually exclusive:

- *Platform* log entries are written by Google Cloud services. These
  log entries can help you debug and troubleshoot issues, and help you better
  understand the Google Cloud services you're using.

- *Component* log entries are generated by Google Cloud-provided software
  components that run on your systems. For example, GKE
  provides software components that users can run on their own virtual
  machine or in their own data center. These log entries are often used to
  provide user support.

- *Security* log entries help you answer "who did what, where, and when":

  - [Cloud Audit Logs](/logging/docs/audit) provide information about administrative activities and accesses within your Google Cloud resources.
Enabling audit logs helps your security, auditing, and compliance teams monitor Google Cloud data and systems for possible vulnerabilities or external data misuse. For a list of supported services, see [Google Cloud services with audit logs](/logging/docs/audit/services).

  - [Access Transparency logs](/assured-workloads/access-transparency/docs/overview) record actions taken by Google Cloud staff when accessing your Google Cloud content. Access Transparency logs can help your organization track compliance with its legal and regulatory requirements. For a list of supported services, see [Google Cloud services with Access Transparency logs](/assured-workloads/access-transparency/docs/supported-services).

- *User-written* log entries are written by custom applications and services. Typically, this data is written to Cloud Logging by using the [Ops Agent](/logging/docs/agent/ops-agent), the [Cloud Logging API](/logging/docs/reference/api-overview), or the [Cloud Logging client libraries](/logging/docs/reference/libraries).

- *Multicloud* and *hybrid-cloud* log entries refer to log data from other cloud providers, like Microsoft Azure, and [logs from on-premises infrastructure](/architecture/logging-on-premises-resources-with-bindplane).

Access control
--------------

Identity and Access Management (IAM) permissions and roles control access to log buckets.
You can grant predefined roles to principals, or you can create custom roles.
For more information about required permissions, see
[Access control](/logging/docs/access-control).

Pricing
-------

To learn about pricing for Cloud Logging, see the [Google Cloud Observability pricing](https://cloud.google.com/stackdriver/pricing) page.

What's next
-----------

- [Log entry data model](/logging/docs/log-entry-data-model).
- [Store log data](/logging/docs/store-log-entries).
- [Route log
data](/logging/docs/routing/overview).
- [Query and view logs overview](/logging/docs/log-analytics).
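As a closing sketch, the log-based alerting policies and routing sinks described above are all driven by filters written in the Logging query language. The helper below composes such a filter string; `resource.type`, `protoPayload.methodName`, and `logName` are standard LogEntry and audit-log fields, but the specific values in the example are illustrative assumptions, not verified method names:

```python
def alert_filter(resource_type: str, method_name: str) -> str:
    """Compose a Logging query-language filter for a log-based alert.

    resource.type, protoPayload.methodName, and logName are standard
    LogEntry fields; the values passed in are caller-supplied. The ":"
    operator on logName is the query language's substring match.
    """
    return (
        f'resource.type="{resource_type}"'
        f' AND protoPayload.methodName="{method_name}"'
        f' AND logName:"cloudaudit.googleapis.com"'
    )

# Hypothetical example: match audit-log entries that record access to a
# service account's key (the method name here is illustrative).
filter_expr = alert_filter(
    "service_account", "google.iam.admin.v1.GetServiceAccountKey"
)
print(filter_expr)
```

Building filters with a helper like this keeps the quoting consistent when the same expression is reused across an alerting policy and a routing sink.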