[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-09-02。"],[[["\u003cp\u003eYou can view high-level stream information, including its name, status, and the source and destination connection profiles it uses.\u003c/p\u003e\n"],["\u003cp\u003eDetailed stream information available includes the stream's region, creation/update times, labels, CDC method, and included/excluded tables and schemas.\u003c/p\u003e\n"],["\u003cp\u003eFor BigQuery destinations, details such as the dataset type, write mode, and staleness limit configurations can be viewed.\u003c/p\u003e\n"],["\u003cp\u003eFor Cloud Storage destinations, details on the destination location and the output file format (Avro or JSON) are available.\u003c/p\u003e\n"],["\u003cp\u003eYou can view data encryption information, event processing statistics, data freshness, and links to alerting policies for the stream.\u003c/p\u003e\n"]]],[],null,["# View a stream\n\nIn this page, you learn how to get information about your streams.\n\nTo view high-level and detailed information about a stream, do the following:\n\n1. Go to the **Streams** page in the Google Cloud Console.\n\n [Go to the Streams page](https://console.cloud.google.com/datastream/streams)\n2. Click the stream for which you want to see detailed information. This information appears on the **Stream details** page.\n\nView high-level information\n---------------------------\n\nHigh-level information about a stream includes:\n\n- The name and status of the stream.\n- The source and destination connection profiles that the stream uses to transfer data from a source database into a destination.\n- The profile types of the connection profiles.\n\nView detailed information\n-------------------------\n\nIn addition to viewing high-level information about a stream, you can click a stream to see additional information, including:\n\n- The region where the stream is stored. Streams, like all resources, are saved in a region. Streams can only use connection profiles and private connectivity configurations that are in the same region.\n- When the stream was created, last updated, or recovered.\n- Any labels added to the stream. 
View high-level information
---------------------------

High-level information about a stream includes:

- The name and status of the stream.
- The source and destination connection profiles that the stream uses to transfer data from a source database into a destination.
- The profile types of the connection profiles.

View detailed information
--------------------------

In addition to viewing high-level information about a stream, you can click a stream to see additional information, including:

- The region where the stream is stored. Streams, like all resources, are saved in a region. A stream can only use connection profiles and private connectivity configurations that are in the same region.
- When the stream was created, last updated, or recovered.
- Any labels added to the stream. These labels are used to organize the stream.
- The tables and schemas in the source database that Datastream should include when processing the stream.
- The tables and schemas in the source database that Datastream should exclude when processing the stream.
- For all sources apart from PostgreSQL, the CDC method that you selected for the stream.
- For Oracle sources: the log file access method.
- Whether [historical backfill](/datastream/docs/create-a-stream#backfillhistoricaldata) is enabled or disabled for the stream.
    - If it's enabled, how many schemas and tables in the source database are excluded from backfilling.
- For BigQuery destinations:

    - Whether the destination dataset is a dynamic or default dataset.
    - Whether the write mode for the stream is set to **Merge** or **Append-only**.
    - If you selected the **Merge** write mode, the staleness limit configuration applied to the new BigQuery tables created by Datastream.

  Additionally, for BigLake Iceberg table destinations:

    - The identifier of the BigQuery connection that you use for your stream.
    - The uniform resource identifier (URI) of the Cloud Storage bucket where you store your data.
- For Cloud Storage destinations:

    - The location of the destination into which the stream transfers schemas, tables, and data from a source database.
    - The format of files written to the destination. Datastream supports two output formats: Avro and JSON.
- Whether your data is encrypted with a key that's managed by Google (Google-managed) or by you (customer-managed).
- A link and the path to the customer-managed encryption key (if you're managing the encryption).
- A link to create an alerting policy for the stream.
- A summary of how many backfills are in progress, pending, or failed.
- The number of events that Datastream processed and loaded to the destination in the last 7 days.
- The number of events that Datastream couldn't process in the last 7 days.
- The **Data freshness** graph. This graph displays the time gap between the data in the source and the data that the stream has transferred into the destination. This is calculated as the time elapsed since Datastream last checked for new data in the source.

  For more information about this graph, see [Monitor a stream](/datastream/docs/monitor-a-stream).

What's next
-----------

- To learn more about streams, see [Stream lifecycle](/datastream/docs/stream-states-and-actions).
- To learn how to modify your streams, see [Modify a stream](/datastream/docs/modify-a-stream).
- To learn how to monitor a stream, see [Monitor a stream](/datastream/docs/monitor-a-stream).
- To learn how to delete an existing stream, see [Delete a stream](/datastream/docs/delete-a-stream).