This page gives you an overview of the BigQuery destination. It
describes write behavior and known limitations of Datastream when you
replicate data to BigQuery.
Write behavior
The maximum event size when you stream data into BigQuery is 20 MB.
When you configure your stream, you can select the way that Datastream
writes your change data to BigQuery. For more information, see
Configure write mode.
Table metadata
Datastream appends a STRUCT column named datastream_metadata to each
table that's written to the BigQuery destination.
Merge write mode
If a table has a primary key at the source, then the column contains the following fields:
UUID: This field has the STRING data type.
SOURCE_TIMESTAMP: This field has the INTEGER data type.
If a table doesn't have a primary key, then the column contains an additional field: IS_DELETED. This field has the BOOLEAN data type, and it indicates whether the data that Datastream streams to the destination is associated with a DELETE operation at the source. Tables without primary keys are append-only.
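For tables without a primary key, downstream queries typically filter on the IS_DELETED field to exclude rows whose source operation was a DELETE. The following sketch shows that filter over illustrative rows; the record shape follows the datastream_metadata fields described above, but the sample values are invented for this example.

```python
# Illustrative rows from a table without a primary key (merge write mode).
# The metadata field names follow this page; the values are made up.
rows = [
    {"id": 1, "datastream_metadata": {
        "UUID": "a1", "SOURCE_TIMESTAMP": 1700000001, "IS_DELETED": False}},
    {"id": 2, "datastream_metadata": {
        "UUID": "b2", "SOURCE_TIMESTAMP": 1700000002, "IS_DELETED": True}},
]

# Keep only rows whose source operation was not a DELETE.
live_rows = [r for r in rows if not r["datastream_metadata"]["IS_DELETED"]]
print([r["id"] for r in live_rows])  # → [1]
```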
Append-only write mode
The datastream_metadata column contains the same fields for tables with and
without primary keys:
UUID: This field has the STRING data type.
SOURCE_TIMESTAMP: This field has the INTEGER data type.
CHANGE_SEQUENCE_NUMBER: This field has the STRING data type. It's an
internal sequence number used by Datastream for each change event.
CHANGE_TYPE: This field has the STRING data type. It indicates the type
of the change event: INSERT, UPDATE-INSERT, UPDATE-DELETE, or DELETE.
SORT_KEYS: This field contains an array of STRING values. You can use the
values to sort the change events.
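In append-only mode, ordering events by their SORT_KEYS array reconstructs the sequence of changes for a row. The following sketch sorts illustrative change events; the metadata field names follow this page, but the sample values are invented for this example.

```python
# Illustrative append-only change events for one source row.
# SORT_KEYS values here are made up; real values come from Datastream.
events = [
    {"id": 1, "datastream_metadata": {
        "CHANGE_TYPE": "UPDATE-INSERT", "SORT_KEYS": ["1700000002", "3"]}},
    {"id": 1, "datastream_metadata": {
        "CHANGE_TYPE": "INSERT", "SORT_KEYS": ["1700000001", "1"]}},
    {"id": 1, "datastream_metadata": {
        "CHANGE_TYPE": "DELETE", "SORT_KEYS": ["1700000003", "7"]}},
]

# Order the events by their SORT_KEYS array to recover the change history.
ordered = sorted(events, key=lambda e: e["datastream_metadata"]["SORT_KEYS"])
print([e["datastream_metadata"]["CHANGE_TYPE"] for e in ordered])
# → ['INSERT', 'UPDATE-INSERT', 'DELETE']
```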
Control BigQuery costs
BigQuery costs are charged separately from Datastream. To learn
how to control your BigQuery costs, see
BigQuery CDC pricing.
Known limitations
Known limitations for using BigQuery as a destination include:
You can only replicate data into a BigQuery dataset that resides in
the same Google Cloud project as the Datastream stream.
By default, Datastream doesn't support adding a primary key to a
table that's already replicated to BigQuery without a primary key,
or removing a primary key from a table that's replicated to BigQuery
with a primary key. If you need to perform such changes, contact Google
Support. For information about changing the primary key definition for a
source table that already has a primary key, see Diagnose
issues.
Primary keys in BigQuery must be of the following data types:
DATE
BOOL
GEOGRAPHY
INT64
NUMERIC
BIGNUMERIC
STRING
TIMESTAMP
DATETIME
Tables that contain primary keys of unsupported data types aren't replicated
by Datastream.
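Before configuring a stream, you can check source tables against the supported primary-key types listed above. This is a minimal sketch; the function name and input format are illustrative, not part of any Datastream API.

```python
# Supported BigQuery primary-key data types, per the list above.
SUPPORTED_PK_TYPES = frozenset({
    "DATE", "BOOL", "GEOGRAPHY", "INT64", "NUMERIC",
    "BIGNUMERIC", "STRING", "TIMESTAMP", "DATETIME",
})

def pk_types_supported(pk_column_types):
    """Return True if every primary-key column type can be replicated."""
    return all(t in SUPPORTED_PK_TYPES for t in pk_column_types)

print(pk_types_supported(["INT64", "STRING"]))  # → True
print(pk_types_supported(["FLOAT64"]))          # → False
```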
BigQuery doesn't support table names with ., $, /, @,
or + characters. Datastream replaces such characters with
underscores when creating destination tables.
For example, table.name in the source database becomes table_name in
BigQuery.
For more information on table names in BigQuery, see Table
naming.
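The substitution described above can be sketched as a small helper. This is an illustrative reimplementation for predicting destination table names, not Datastream's actual code.

```python
def sanitize_table_name(name: str) -> str:
    """Replace characters that BigQuery rejects in table names
    (., $, /, @, +) with underscores, mirroring Datastream's behavior."""
    for ch in ".$/@+":
        name = name.replace(ch, "_")
    return name

print(sanitize_table_name("table.name"))  # → table_name
```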
BigQuery doesn't support more than four clustering columns. When
replicating a table with more than four primary key columns,
Datastream uses four primary key columns as the clustering columns.
Datastream maps out-of-range date and time literals such as
PostgreSQL infinity date types to the following values:
Positive DATE to the value of 9999-12-31
Negative DATE to the value of 0001-01-01
Positive TIMESTAMP to the value of 9999-12-31 23:59:59.999000 UTC
Negative TIMESTAMP to the value of 0001-01-01 00:00:00 UTC
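The DATE mapping above can be sketched as follows. The function name and the string literals for PostgreSQL infinity values are assumptions for this example; the bound values are the ones documented above.

```python
import datetime

# Documented BigQuery-side bounds for out-of-range source DATE values.
MAX_DATE = datetime.date(9999, 12, 31)
MIN_DATE = datetime.date(1, 1, 1)

def map_pg_date(value):
    """Map PostgreSQL infinity date literals to the documented bounds;
    pass all other values through unchanged."""
    if value == "infinity":
        return MAX_DATE
    if value == "-infinity":
        return MIN_DATE
    return value

print(map_pg_date("infinity"))  # → 9999-12-31
```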
BigQuery doesn't support streaming to tables that have primary keys
of the FLOAT or REAL data types. Such tables aren't replicated.
To learn more about BigQuery date types and ranges, see Data
types.
If your source is Salesforce, the Dataset for each schema configuration
option isn't supported.
Last updated 2025-08-29 UTC.