Quotas and limits
Datastream limits the maximum rates of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.
Datastream has the following quota limits:
- Each Google Cloud project can have a maximum of 50 stream resources per region.
- Each Google Cloud project can have a maximum of 5 private connectivity configurations.
- Each user can make up to 1,200 API calls per minute.
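If you want to see how close a project is to the per-region stream limit before creating more streams, you can count the existing streams with the Datastream client library. The following Python sketch is illustrative only: it assumes the google-cloud-datastream package and Application Default Credentials, and the project ID and region shown are placeholders.

```python
# Minimal sketch: count Datastream streams in one region and compare the count
# against the documented limit of 50 stream resources per region.
from google.cloud.datastream_v1 import DatastreamClient

PROJECT_ID = "my-project"      # assumption: replace with your project ID
LOCATION = "us-central1"       # assumption: replace with your region
STREAMS_PER_REGION_LIMIT = 50  # limit documented above

def remaining_stream_quota(project_id: str, location: str) -> int:
    """Return how many more streams can still be created in the region."""
    client = DatastreamClient()
    parent = f"projects/{project_id}/locations/{location}"
    stream_count = sum(1 for _ in client.list_streams(parent=parent))
    return STREAMS_PER_REGION_LIMIT - stream_count

if __name__ == "__main__":
    print(f"Streams remaining in {LOCATION}: "
          f"{remaining_stream_quota(PROJECT_ID, LOCATION)}")
```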
Salesforce API quotas
Salesforce defines a limit for the total number of inbound API requests that an
org can make during a 24-hour period. We recommend carefully considering your
polling interval values and the number of objects that you want to replicate.

To find out which limits are in place for your org, use the Salesforce
Limits API.

Note: Limits are imposed on a per-org basis, meaning that they affect the entire
Salesforce org. If API usage reaches 90% of the quota, Datastream throttles your
stream and the stream temporarily enters a FAILED state. After a short while,
Datastream checks whether usage is below 90% and, if so, the stream resumes.
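For example, before tuning polling intervals you can check how much of the org's daily request budget remains by calling the Limits REST resource. The following Python sketch is illustrative only: the instance URL, access token, and API version are placeholders, and DailyApiRequests is the field that the Limits resource returns for the 24-hour inbound request quota.

```python
# Minimal sketch: query the Salesforce Limits API and report the remaining
# daily API request quota for the org.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"  # assumption: your org's instance URL
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"    # assumption: a valid OAuth access token
API_VERSION = "v60.0"                               # assumption: any supported API version

def daily_api_requests_remaining() -> tuple[int, int]:
    """Return (remaining, max) daily API requests for the org."""
    response = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/limits/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    limits = response.json()["DailyApiRequests"]
    return limits["Remaining"], limits["Max"]

if __name__ == "__main__":
    remaining, maximum = daily_api_requests_remaining()
    print(f"Daily API requests: {remaining} of {maximum} remaining "
          f"({remaining / maximum:.0%})")
```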
Write to BigQuery and Cloud Storage
When writing data into BigQuery and Cloud Storage, the quotas and limits for BigQuery and Cloud Storage apply.
Cloud Logging
Datastream saves logs in Cloud Logging. The Logging quota applies to your Datastream resources.
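Because Datastream log entries count against your Cloud Logging quota, it can be useful to inspect what the service writes. Below is a minimal Python sketch, assuming the google-cloud-logging client library and Application Default Credentials; the resource.type filter value and the project ID are assumptions for illustration, not documented values.

```python
# Minimal sketch: print recent Datastream-related log entries from Cloud Logging.
from google.cloud import logging

def print_recent_datastream_logs(project_id: str, max_entries: int = 20) -> None:
    client = logging.Client(project=project_id)
    # Assumption: Datastream stream logs use this monitored resource type.
    log_filter = 'resource.type="datastream.googleapis.com/Stream"'
    entries = client.list_entries(filter_=log_filter, order_by=logging.DESCENDING)
    for i, entry in enumerate(entries):
        if i >= max_entries:
            break
        print(entry.timestamp, entry.severity, entry.payload)

if __name__ == "__main__":
    print_recent_datastream_logs("my-project")  # placeholder project ID
```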
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-25 UTC."],[[["\u003cp\u003eDatastream limits incoming request rates and enforces quotas per project, with policies varying based on factors like resource availability and usage history.\u003c/p\u003e\n"],["\u003cp\u003eEach Google Cloud project is limited to a maximum of 50 stream resources per region and 5 private connectivity configurations, while each user can make up to 1,200 API calls per minute.\u003c/p\u003e\n"],["\u003cp\u003eSalesforce API limits apply per org, and reaching 90% of the quota will cause Datastream to throttle the stream, temporarily setting it to a \u003ccode\u003eFAILED\u003c/code\u003e state until usage decreases.\u003c/p\u003e\n"],["\u003cp\u003eWhen writing data to BigQuery and Cloud Storage, the respective quotas for those services also apply to Datastream.\u003c/p\u003e\n"],["\u003cp\u003eCloud Logging quota also applies to your Datastream resources.\u003c/p\u003e\n"]]],[],null,["# Quotas and limits\n\nDatastream limits the maximum rates of incoming requests and enforces appropriate quotas on a per-project basis. Specific policies vary depending on resource availability, user profile, service usage history, and other factors, and are subject to change without notice.\n\nDatastream has the following quota limits:\n\n- Each Google Cloud project can have a maximum of 50 stream resources per region.\n- Each Google Cloud project can have a maximum of 5 private connectivity configurations.\n- Each user can make up to 1,200 API calls per minute.\n\nSalesforce API quotas\n---------------------\n\n|\n| **Preview**\n|\n|\n| This feature is subject to the \"Pre-GA Offerings Terms\" in the General Service Terms section\n| of the [Service Specific Terms](/terms/service-terms#1).\n|\n| Pre-GA features are available \"as is\" and might have limited support.\n|\n| For more information, see the\n| [launch stage descriptions](/products#product-launch-stages).\n\nSalesforce defines a limit for the total number of inbound API requests\nduring a 24-hour period for an org. We recommend carefully considering your\npolling interval values and the number of objects that you want to replicate.\n\nTo find out what limits there are in place for your org, use the Salesforce\n[Limits API](https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_limits.htm).\n| **Note:** Limits are imposed on an org-basis, meaning that they affect the entire Salesforce org. If the API usage reaches 90% of the quota, Datastream throttles your stream, and the stream temporarily enters a `FAILED` state. After a short while, Datastream checks if the usage is under 90% and if so, the stream resumes.\n\nWrite to BigQuery and Cloud Storage\n-----------------------------------\n\nWhen writing data into BigQuery and Cloud Storage, the quotas and limits for [BigQuery](/bigquery/quotas) and [Cloud Storage](/storage/quotas) apply.\n\nCloud Logging\n-------------\n\nDatastream saves logs in [Cloud Logging](/logging). The Logging quota applies to your Datastream resources."]]