# Query and view log entries

This document describes how you query, view, and analyze log entries by using the Google Cloud console. Two interfaces are available to you: the Logs Explorer and Log Analytics. You can query, view, and analyze logs with both interfaces; however, they use different query languages and have different capabilities. For troubleshooting and exploring log data, we recommend using the Logs Explorer. To generate insights and trends, we recommend using Log Analytics. You can query your logs and save your queries by issuing [Logging API](/logging/docs/reference/v2/rest/v2/entries/list) commands. You can also query your logs by using the [Google Cloud CLI](/logging/docs/api/gcloud-logging#reading_log_entries).
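
As a minimal sketch of querying logs from the command line with the Google Cloud CLI, the following command reads recent error-level entries; the project ID, resource type, and time window are placeholder assumptions that you would replace with your own values:

```bash
# Read entries with severity ERROR or higher, written in the last hour.
# The filter string uses the Logging query language; PROJECT_ID is a placeholder.
gcloud logging read 'severity>=ERROR AND resource.type="gce_instance"' \
    --project=PROJECT_ID \
    --freshness=1h \
    --limit=20 \
    --format=json
```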

Logs Explorer
-------------

The Logs Explorer is designed to help you troubleshoot and analyze the performance of your services and applications. For example, a histogram displays the rate of errors. If you notice a spike in errors or another pattern of interest, you can locate and view the corresponding log entries. When a log entry is associated with an [error group](/error-reporting/docs/grouping-errors), the log entry is annotated with a menu of options that let you access more information about the error group.
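
For instance, to narrow the results to the entries behind such an error spike, you might enter a filter like the following in the Logs Explorer query pane. This is a minimal sketch: the resource type and timestamp are assumptions to adjust for your own services, and the lines are implicitly joined with `AND`:

```
resource.type="cloud_run_revision"
severity>=ERROR
timestamp>="2025-08-04T00:00:00Z"
```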
[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-08-04。"],[],[],null,["# Query and view log entries\n\nThis document describes how you query, view, and analyze log entries by using\nthe Google Cloud console. There are two interfaces available to you, the\nLogs Explorer and Log Analytics. You can query, view, and analyze\nlogs with both interfaces; however, they use different query languages and they\nhave different capabilities.\nFor troubleshooting and exploration of log data, we recommend using the\nLogs Explorer. To generate insights and trends, we recommend that you\nuse Log Analytics.\nYou can query your logs and save your queries by issuing\n[Logging API](/logging/docs/reference/v2/rest/v2/entries/list) commands.\nYou can also query your logs by using\n[Google Cloud CLI](/logging/docs/api/gcloud-logging#reading_log_entries).\n\nLogs Explorer\n-------------\n\nThe Logs Explorer is designed to help you troubleshoot and analyze the\nperformance of your services and applications. For example, a histogram\ndisplays the rate of errors. If you see a spike in errors or something that\nis interesting, you can locate and view the\ncorresponding log entries. When a log entry is associated with an\n[error group](/error-reporting/docs/grouping-errors), the log entry is\nannotated with a\nmenu of options that let you access more information about the error group.\n\nThe same [query language](/logging/docs/view/logging-query-language) is\nsupported by the Cloud Logging API, the Google Cloud CLI,\nand the Logs Explorer.\nTo simplify query construction when you are using the Logs Explorer, you can\n[build queries](/logging/docs/view/building-queries) by using menus, by\nentering text, and, in some cases, by using options included with the display\nof an individual log entry.\n\nThe Logs Explorer doesn't support aggregate operations,\nlike counting the number of log entries that contain a specific pattern.\nTo perform aggregate operations, enable analytics on the log bucket and then use\nLog Analytics.\n\nFor details about searching and viewing logs with the Logs Explorer, see\n[View logs by using the Logs Explorer](/logging/docs/view/logs-explorer-interface).\n\nLog Analytics\n-------------\n\nUsing Log Analytics, you can run queries that analyze your log data, and\nthen you can view or [chart the query results](/logging/docs/analyze/charts). Charts let\nyou identify patterns and trends in your logs over time. 

For example, suppose that you are troubleshooting a problem and you want to know the average latency for HTTP requests issued to a specific URL over time. When a log bucket is upgraded to use Log Analytics, you can write a [SQL](/bigquery/docs/reference/standard-sql/query-syntax) query or use the query builder to query logs stored in your log bucket.

These SQL queries can also include [pipe syntax](/bigquery/docs/pipe-syntax-guide). By grouping and aggregating your logs, you can gain insights into your log data that can help you reduce the time you spend troubleshooting.

Log Analytics lets you query [log views](/logging/docs/logs-views) or an [analytics view](/logging/docs/analyze/about-analytics-views). Log views have a fixed schema that corresponds to the [`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry) data structure. Because the creator of an analytics view determines the schema, one use case for analytics views is to transform log data from the `LogEntry` format into a format that is more suitable for you.

You can also use [BigQuery](/bigquery/docs/introduction) to query your data. For example, suppose that you want to use BigQuery to compare URLs in your logs with a public dataset of known malicious URLs. To make your log data visible to BigQuery, upgrade your bucket to use Log Analytics and then [create a linked dataset](/logging/docs/buckets#link-bq-dataset).

You can continue to troubleshoot issues and view individual log entries in upgraded log buckets by using the Logs Explorer.

### Restrictions

- To upgrade an existing log bucket to use Log Analytics, the following restrictions apply:

  - The log bucket was created at the Google Cloud project level.
  - The log bucket is [unlocked](/logging/docs/buckets#locking-logs-buckets), unless it is the `_Required` bucket.
  - There are no pending updates to the bucket.

- Log entries written before a bucket is upgraded aren't immediately available. However, when the backfill operation completes, you can analyze these log entries. The backfill process might take several hours.

- You can't use the **Log Analytics** page to query log views when the log bucket has [field-level access controls](/logging/docs/field-level-acl) configured. However, you can issue queries through the **Logs Explorer** page, and you can query a [linked BigQuery dataset](/logging/docs/buckets#link-bq-dataset). Because BigQuery doesn't honor field-level access controls, if you query a linked dataset, then you can query all fields in the log entries.

- If you query multiple log buckets that are configured with different Cloud KMS keys, then the query fails unless the following constraints are met:

  - The log buckets are in the same location.
  - A folder or organization that is a parent resource of the log buckets is [configured with a default key](/logging/docs/routing/managed-encryption).
  - The default key is in the same location as the log buckets.

  When these constraints are satisfied, the parent's Cloud KMS key encrypts any temporary data generated by a Log Analytics query.

- Duplicate log entries aren't removed before a query is run. This behavior differs from the Logs Explorer, which removes duplicate entries by comparing the log names, timestamps, and insert ID fields.
  For more information, see [Troubleshoot: There are duplicate log entries in my Log Analytics results](/logging/docs/analyze/troubleshoot#duplicate-analytics).

> **Note:** If your data is managed through an [Assured Workloads environment](/assured-workloads/docs/key-concepts), then this feature might be impacted or restricted. For information, see [Restrictions and limitations in Assured Workloads](/assured-workloads/docs/eu-sovereign-controls-restrictions-limitations#features_logging).

Pricing
-------

Cloud Logging doesn't charge to route logs to a supported destination; however, the destination might apply charges. With the exception of the `_Required` log bucket, Cloud Logging charges to stream logs into log buckets and for storage longer than the default retention period of the log bucket.

Cloud Logging doesn't charge for copying logs, for creating [log scopes](/logging/docs/log-scope/create-and-manage) or [analytics views](/logging/docs/analyze/about-analytics-views), or for queries issued through the **Logs Explorer** or **Log Analytics** pages.

For more information, see the following documents:

- The Cloud Logging sections of the [Google Cloud Observability pricing](https://cloud.google.com/stackdriver/pricing) page.
- Costs when routing log data to other Google Cloud services:

  - [Cloud Storage pricing](https://cloud.google.com/storage/pricing)
  - [BigQuery pricing](https://cloud.google.com/bigquery/pricing#data_ingestion_pricing)
  - [Pub/Sub pricing](https://cloud.google.com/pubsub/pricing)

- [VPC flow log generation charges](https://cloud.google.com/vpc/network-pricing#network-telemetry) apply when you send and then exclude your Virtual Private Cloud flow logs from Cloud Logging.

There are no BigQuery ingestion or storage costs when you upgrade a bucket to use Log Analytics and then create a [linked dataset](/bigquery/docs/analytics-hub-introduction#linked_datasets). When you create a linked dataset for a log bucket, you don't ingest your log data into BigQuery.
Instead, you get read access to the log data stored in your log bucket through the linked dataset.

BigQuery analysis charges apply when you run SQL queries on BigQuery linked datasets, which includes using the **BigQuery Studio** page, the BigQuery API, and the BigQuery command-line tool.

Blogs
-----

For more information about Log Analytics, see the following blog posts:

- For an overview of Log Analytics, see [Log Analytics in Cloud Logging is now GA](/blog/products/devops-sre/log-analytics-in-cloud-logging-is-now-ga).
- To learn about creating charts generated by Log Analytics queries and saving those charts to custom dashboards, see [Announcing Log Analytics charts and dashboards in Cloud Logging in public preview](/blog/products/management-tools/new-log-analytics-charts-and-dashboards-in-cloud-logging).
- To learn about analyzing audit logs by using Log Analytics, see [Gleaning security insights from audit logs with Log Analytics](/blog/products/identity-security/gleaning-security-insights-from-audit-logs-with-log-analytics).
- If you route logs to BigQuery and want to understand the difference between that solution and using Log Analytics, then see [Moving to Log Analytics for BigQuery export users](/blog/products/data-analytics/moving-to-log-analytics-for-bigquery-export-users).

What's next
-----------

- [Create a log bucket and upgrade it to use Log Analytics](/logging/docs/buckets#create_bucket)
- [Upgrade an existing bucket to use Log Analytics](/logging/docs/buckets#upgrade-bucket)
- Query and view logs:

  - [Log Analytics: Query and analyze logs](/logging/docs/analyze/query-and-view)
  - [Logs Explorer: Query and view logs](/logging/docs/view/logs-explorer-interface)

- Sample queries:

  - [Log Analytics: SQL examples](/logging/docs/analyze/examples)
  - [Logs Explorer: Logging query language examples](/logging/docs/view/query-library)