# General best practices when using Datastream

This page describes best practices for using Datastream.
Change a stream's source database
---------------------------------
In some cases, you might have to change the source database of a stream. For example, you might have to modify the stream to replicate from a replica instead of from the primary database instance.

1. [Create a connection profile](/datastream/docs/create-connection-profiles) for the replica instance.
2. [Create a stream](/datastream/docs/create-a-stream), using the connection profile for the replica that you created and the existing connection profile for the destination.
3. [Start the stream](/datastream/docs/run-a-stream#startastream) with historical backfill disabled. When the stream is started, it brings only the data from the binary logs.
4. Optional: After the stream is running, [modify it](/datastream/docs/modify-a-stream) to enable automatic backfill.
5. [Pause the stream](/datastream/docs/run-a-stream#pauseastream) that's reading from the primary instance.
6. Optional: [Delete the stream](/datastream/docs/delete-a-stream) that was streaming data from the primary instance.
7. Optional: [Delete the connection profile](/datastream/docs/delete-a-connection-profile) for the primary instance.
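The key detail when creating the stream for the replica is that historical backfill is disabled, so the stream carries only changes read from the binary logs. As a minimal sketch, this is roughly what the stream definition could look like as a Datastream v1 REST request body; the field names follow the `Stream` resource as documented, but verify them against the API reference, and the project, profile, and location names below are hypothetical placeholders:

```python
# Sketch of a Datastream v1 Stream resource body that reads from a
# replica connection profile with historical backfill disabled.
# The project, location, and connection-profile names are placeholders.
replica_stream = {
    "displayName": "replica-stream",
    "sourceConfig": {
        # Connection profile created for the replica instance (step 1).
        "sourceConnectionProfile": (
            "projects/my-project/locations/us-central1/"
            "connectionProfiles/mysql-replica"
        ),
        "mysqlSourceConfig": {},
    },
    "destinationConfig": {
        # Reuse the existing connection profile for the destination (step 2).
        "destinationConnectionProfile": (
            "projects/my-project/locations/us-central1/"
            "connectionProfiles/existing-destination"
        ),
    },
    # "backfillNone" disables historical backfill (step 3): once started,
    # the stream brings only the data from the binary logs.
    "backfillNone": {},
}

# The backfill strategy is a oneof: backfillNone excludes backfillAll.
assert "backfillAll" not in replica_stream
```

To enable automatic backfill later (step 4), the stream would be updated to replace `backfillNone` with a `backfillAll` strategy.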
Alert and monitor in Datastream
-------------------------------

The Datastream dashboard contains a great deal of information that can be helpful for debugging. Additional information is available in the logs in Cloud Logging.

| **Tip:** Use [Google Cloud Monitoring](/monitoring) to create a custom dashboard to suit your business needs.
### Datastream alerts
There's no default alert set up for Datastream. You can create an alerting policy for the **Data freshness** metric by clicking the **Create alerting policy** link in the **Overview** tab. For the remaining metrics, follow these steps:
1. In the Google Cloud console, go to the **Alerting** page:

   [Go to **Alerting**](https://console.cloud.google.com/monitoring/alerting)

2. Click **Create policy**.
3. Click the **Select a metric** drop-down.
4. In the filter field, enter `Datastream`.
5. Optional: You might need to disable the **Active** filter to view all available metrics.
6. Search for the metric that you want to monitor under **Datastream Stream**.
7. Click **Apply**.
8. Optional: Enter the required details in the **Add filters** and **Transform data** sections. Click **Next**.
9. Enter the required information in the **Configure alert trigger** section. Click **Next**.
10. Configure your notifications in the **Configure notifications and finalize alert** section.
11. Review your alert and click **Create policy** when ready.
For detailed information about how to complete each of these steps, see [Create alerting policy](/monitoring/alerts/using-alerting-ui#create-policy).
We recommend creating alerts for the following Datastream [metrics](/datastream/docs/monitor-a-stream#monitorstreams):
- Data freshness
- Stream unsupported event count
- Stream total latencies
An alert on any of these metrics can indicate a problem with either the stream or the source database.
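Alerting policies for these metrics can also be defined programmatically through the Cloud Monitoring API rather than in the console. The sketch below builds a request body following the Monitoring v3 `AlertPolicy` resource structure; the Datastream metric type string, resource type, and threshold values are assumptions to confirm against the metrics listed under **Datastream Stream** in Metrics Explorer:

```python
# Sketch of a Cloud Monitoring v3 AlertPolicy body for a Datastream
# data-freshness alert. The metric/resource type strings and the
# threshold below are assumptions; confirm them in Metrics Explorer.
freshness_alert = {
    "displayName": "Datastream data freshness",
    "combiner": "OR",
    "conditions": [
        {
            "displayName": "Data freshness above 10 minutes",
            "conditionThreshold": {
                # Hypothetical metric filter; verify the exact metric type.
                "filter": (
                    'metric.type = "datastream.googleapis.com/stream/freshness" '
                    'AND resource.type = "datastream.googleapis.com/Stream"'
                ),
                "comparison": "COMPARISON_GT",
                "thresholdValue": 600,  # seconds of lag before alerting
                "duration": "300s",    # condition must hold for 5 minutes
            },
        }
    ],
}

# The policy fires if any condition is met.
assert freshness_alert["combiner"] == "OR"
```

The same structure applies to the other recommended metrics: only the `filter`, threshold, and display names change per policy.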
How many tables can a single stream handle?
-------------------------------------------
We recommend that a single stream includes up to 10,000 tables. There's no limit to the size of the tables. If a stream includes more tables than that, it might enter an error state. To avoid this, consider splitting the source into multiple streams.

| Keep in mind the impact on the source database. Each stream has its own limit on the number of connections and simultaneous tasks, so combining multiple streams could overwhelm the database.
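Splitting a large source can be as simple as partitioning its table list into fixed-size groups, one group per stream. A minimal, illustrative sketch (the table names are placeholders):

```python
# Partition a large table list into groups of at most 10,000 tables,
# one group per stream, to stay within the recommended per-stream limit.
MAX_TABLES_PER_STREAM = 10_000

def split_into_streams(tables, limit=MAX_TABLES_PER_STREAM):
    """Return a list of table groups, each sized for a single stream."""
    return [tables[i:i + limit] for i in range(0, len(tables), limit)]

# Example: 25,000 placeholder table names need three streams.
tables = [f"schema.table_{n}" for n in range(25_000)]
groups = split_into_streams(tables)
print(len(groups))                    # → 3 streams
print(len(groups[0]), len(groups[-1]))  # → 10000 5000
```

When sizing the groups, remember the note above: each additional stream opens its own connections against the source database.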
Last updated 2025-08-25 UTC.