# View a stream

In this page, you learn how to get information about your streams.

To view high-level and detailed information about a stream, do the following:

1. Go to the **Streams** page in the Google Cloud Console.

   [Go to the Streams page](https://console.cloud.google.com/datastream/streams)

2. Click the stream for which you want to see detailed information. This information appears on the **Stream details** page.
View high-level information
---------------------------

High-level information about a stream includes:

- The name and status of the stream.
- The source and destination connection profiles that the stream uses to transfer data from a source database into a destination.
- The profile types of the connection profiles.
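The same high-level fields also appear on the stream resource that the Datastream API returns. The following is a minimal sketch, assuming a response shaped like the REST v1 `Stream` message; the project, stream, and connection profile names are hypothetical:

```python
# Hypothetical sample of a Datastream stream resource, shaped like the
# REST v1 Stream message; all values are illustrative only.
stream = {
    "name": "projects/my-project/locations/us-central1/streams/my-stream",
    "state": "RUNNING",
    "sourceConfig": {
        "sourceConnectionProfile": (
            "projects/my-project/locations/us-central1/"
            "connectionProfiles/my-source-profile"
        ),
    },
    "destinationConfig": {
        "destinationConnectionProfile": (
            "projects/my-project/locations/us-central1/"
            "connectionProfiles/my-destination-profile"
        ),
    },
}


def high_level_summary(stream: dict) -> dict:
    """Pull out the name, status, and connection profiles shown on the page."""
    def short(resource_name: str) -> str:
        # Keep only the final segment of the full resource name.
        return resource_name.rsplit("/", 1)[-1]

    return {
        "stream": short(stream["name"]),
        "status": stream["state"],
        "source_profile": short(
            stream["sourceConfig"]["sourceConnectionProfile"]
        ),
        "destination_profile": short(
            stream["destinationConfig"]["destinationConnectionProfile"]
        ),
    }


summary = high_level_summary(stream)
print(summary)
```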
View detailed information
-------------------------

In addition to viewing high-level information about a stream, you can click a stream to see additional information, including:
- The region where the stream is stored. Streams, like all resources, are saved in a region. A stream can only use connection profiles and private connectivity configurations that are in the same region.
- When the stream was created, last updated, or recovered.
- Any labels added to the stream. These labels are used to organize the stream.
- Depending on the source database, the CDC method that you selected for the stream.
- The tables and schemas in the source database that Datastream should include when processing the stream.
- The tables and schemas in the source database that Datastream should exclude when processing the stream.
- For all sources apart from PostgreSQL, the CDC method set for the stream.
- For Oracle sources: the log file access method.
- Whether historical backfill is enabled or disabled for the stream.
  - If it's enabled, how many schemas and tables in the source database are excluded from backfilling.
- For BigQuery destinations:

  - Whether the destination dataset is a dynamic or default dataset.
  - Whether the write mode for the stream is set to **Merge** or **Append-only**.
  - If you selected **Merge** mode, the staleness limit configuration applied to the new BigQuery tables created by Datastream.

  Additionally, for BigLake Iceberg tables destinations:

  - The identifier of the BigQuery connection that you use for your stream.
  - The uniform resource identifier (URI) of the Cloud Storage bucket where you store your data.
- For Cloud Storage destinations:

  - The location of the destination into which the stream transfers schemas, tables, and data from a source database.
  - The format of files written to the destination. Datastream supports two output formats: Avro and JSON.

- Whether your data is encrypted with a key that's managed by Google (Google-managed) or by you (customer-managed).
- A link and the path to the customer-managed encryption key (if you're managing the encryption).
- A link to create an alerting policy for the stream.
- A status of how many backfills are in progress, pending, or failed.
- The number of events that Datastream processed and loaded to the destination in the last 7 days.
- The number of events that Datastream couldn't process in the last 7 days.
- The Data freshness graph. This graph displays the time gap between the data residing in the source and the data being transferred into the destination by the stream. This value is calculated as the time elapsed since Datastream last checked for new data in the source. For more information about this graph, see [Monitor a stream](/datastream/docs/monitor-a-stream).
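Several of the detailed fields listed above, such as the region, the creation and update times, and the labels, can also be read off the stream resource programmatically. The following is a minimal sketch, assuming REST v1 field names (`name`, `createTime`, `updateTime`, `labels`); all values are hypothetical:

```python
from datetime import datetime


def parse_ts(ts: str) -> datetime:
    """Parse the RFC 3339 'Z'-suffixed timestamps that the API returns."""
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))


# Hypothetical stream resource fields; the names follow the REST v1 Stream
# message, the values are illustrative only.
stream = {
    "name": "projects/my-project/locations/us-central1/streams/my-stream",
    "createTime": "2025-01-15T08:30:00Z",
    "updateTime": "2025-03-02T11:45:00Z",
    "labels": {"env": "prod"},
}

# The region is embedded in the resource name; a stream can only use
# connection profiles and private connectivity configurations that are
# in this same region.
region = stream["name"].split("/locations/")[1].split("/")[0]
created = parse_ts(stream["createTime"])
updated = parse_ts(stream["updateTime"])

print(f"region={region} labels={stream['labels']}")
print(f"created={created.date()} last_updated={updated.date()}")
```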
What's next
-----------

- To learn more about streams, see [Stream lifecycle](/datastream/docs/stream-states-and-actions).
- To learn how to modify your streams, see [Modify a stream](/datastream/docs/modify-a-stream).
- To learn how to monitor a stream, see [Monitor a stream](/datastream/docs/monitor-a-stream).
- To learn how to delete an existing stream, see [Delete a stream](/datastream/docs/delete-a-stream).

Last updated 2025-08-12 UTC.