# Create a change stream-enabled table and capture changes

*Last updated: 2025-08-27 (UTC)*

Learn how to set up a Bigtable table with a change stream enabled, run a change stream pipeline, make changes to your table, and then see the changes streamed.

## Before you begin
1. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

   [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)

2. [Verify that billing is enabled for your Google Cloud project](/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).

3. Enable the Dataflow, Cloud Bigtable, and Cloud Bigtable Admin APIs.

   [Enable the APIs](https://console.cloud.google.com/flows/enableapi?apiid=dataflow.googleapis.com,bigtable.googleapis.com,bigtableadmin.googleapis.com)

4. In the Google Cloud console, activate Cloud Shell.

   [Activate Cloud Shell](https://console.cloud.google.com/?cloudshell=true)

## Create a table with a change stream enabled

1. In the Google Cloud console, go to the Bigtable **Instances** page.

   [Go to Instances](https://console.cloud.google.com/bigtable/instances)

2. Click the ID of the instance that you are using for this quickstart.

   If you don't have an instance available, [create an instance](/bigtable/docs/creating-instance) with the default configurations in a region near you.

3. In the left navigation pane, click **Tables**.

4. Click **Create a table**.

5. Name the table `change-streams-quickstart`.

6. Add a column family named `cf`.

7. Select **Enable change stream**.

8. Click **Create**.
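If you prefer the command line to the console, an equivalent table can likely be created from Cloud Shell with `gcloud` — this is a sketch, assuming the current `gcloud bigtable` flag names; change stream retention can be set between 1 and 7 days:

```shell
# Create the table with the cf column family and a change stream
# retained for 7 days (the maximum). BIGTABLE_INSTANCE_ID is a
# placeholder for your instance ID, as in the other commands on this page.
gcloud bigtable instances tables create change-streams-quickstart \
    --instance=BIGTABLE_INSTANCE_ID \
    --column-families=cf \
    --change-stream-retention-period=7d
```

Either way, the result should match the table configured through the console steps above.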
## Initialize a data pipeline to capture the change stream

1. In Cloud Shell, run the following commands to download the code and run it:

   ```
   git clone https://github.com/GoogleCloudPlatform/java-docs-samples.git
   cd java-docs-samples/bigtable/beam/change-streams
   mvn compile exec:java -Dexec.mainClass=ChangeStreamsHelloWorld \
   "-Dexec.args=--project=PROJECT_ID --bigtableProjectId=PROJECT_ID \
   --bigtableInstanceId=BIGTABLE_INSTANCE_ID --bigtableTableId=change-streams-quickstart \
   --runner=dataflow --region=BIGTABLE_REGION --experiments=use_runner_v2"
   ```

   Replace the following:
   - `PROJECT_ID`: the ID of the project that you are using
   - `BIGTABLE_INSTANCE_ID`: the ID of the instance that contains the new table
   - `BIGTABLE_REGION`: the region that your Bigtable instance is in, such as `us-east5`

2. In the Google Cloud console, go to the **Dataflow** page.

   [Go to Dataflow](https://console.cloud.google.com/dataflow)

3. Click the job with a name that begins with **changestreamquickstart**.

4. At the bottom of the screen, click **Show** to open the logs panel.

5. Click **Worker logs** to monitor the output of the change stream.

6. In Cloud Shell, write some data to Bigtable so that the change stream has changes to process:

   ```
   cbt -instance=BIGTABLE_INSTANCE_ID -project=PROJECT_ID \
       import change-streams-quickstart quickstart-data.csv column-family=cf
   ```

7. In the Google Cloud console, make sure that **Severity** is set to at least `Info`.
8. The worker logs contain output like this:

   ```
   Change captured: user123#2023,USER,SetCell,cf,col1,abc
   Change captured: user546#2023,USER,SetCell,cf,col1,def
   Change captured: user789#2023,USER,SetCell,cf,col1,ghi
   ```

   **Note:** The log might take a few minutes to appear while the job initializes.

## Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.

1. Disable the change stream on the table:

   ```
   gcloud bigtable instances tables update change-streams-quickstart --instance=BIGTABLE_INSTANCE_ID \
       --clear-change-stream-retention-period
   ```

2. Delete the table `change-streams-quickstart`:

   ```
   cbt -instance=BIGTABLE_INSTANCE_ID -project=PROJECT_ID deletetable change-streams-quickstart
   ```

3. Stop the change stream pipeline:

   1. In the Google Cloud console, go to the Dataflow **Jobs** page.

      [Go to Jobs](https://console.cloud.google.com/dataflow/jobs)

   2. Select your streaming job from the job list.

   3. In the navigation, click **Stop**.

   4. In the **Stop job** dialog, select **Cancel**, and then click **Stop job**.

4. Optional: Delete the instance if you created a new one for this quickstart:

   ```
   cbt deleteinstance BIGTABLE_INSTANCE_ID
   ```

## What's next

- [Read through the change streams documentation](/bigtable/docs/change-streams-overview).
- [Try a more complex change stream example](/bigtable/docs/change-streams-tutorial).
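Each `Change captured:` worker-log line shown in step 8 above packs one change into a comma-separated record: row key, change type, mutation type, column family, qualifier, and value. If you copy a few of those lines into a local file (`change-log.txt` is a hypothetical name), you can tally the captured mutation types with standard shell tools:

```shell
# Count captured mutations by type. Splitting on spaces and commas makes
# the mutation type field 5:
#   "Change captured: user123#2023,USER,SetCell,cf,col1,abc" -> SetCell
# With the three sample lines from step 8, this prints: SetCell 3
awk -F'[ ,]' '/Change captured:/ { count[$5]++ } END { for (m in count) print m, count[m] }' change-log.txt
```

This is only a local sanity check on copied log text; the pipeline itself keeps streaming regardless.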