public interface DataSourceDefinitionOrBuilder extends MessageOrBuilder

Implements
MessageOrBuilder

Methods
getDataSource()
public abstract DataSource getDataSource()
Data source metadata.
.google.cloud.bigquery.datatransfer.v1.DataSource data_source = 1;
Returns

Type | Description
---|---
DataSource |
getDataSourceOrBuilder()
public abstract DataSourceOrBuilder getDataSourceOrBuilder()
Data source metadata.
.google.cloud.bigquery.datatransfer.v1.DataSource data_source = 1;
Returns

Type | Description
---|---
DataSourceOrBuilder |
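
A brief, hedged sketch of how these accessors are typically used. The class name, variable names, and the getDataSourceId() call on the nested DataSource are illustrative assumptions based on the standard v1 proto, not part of this interface.

```java
import com.google.cloud.bigquery.datatransfer.v1.DataSourceDefinitionOrBuilder;
import com.google.cloud.bigquery.datatransfer.v1.DataSourceOrBuilder;

public class DataSourceMetadataCheck {

  // A read-only inspection of a data source definition's metadata.
  static void printDataSourceId(DataSourceDefinitionOrBuilder definition) {
    // data_source is a message field, so check presence first (see hasDataSource()).
    if (!definition.hasDataSource()) {
      System.out.println("no data source metadata attached");
      return;
    }
    // getDataSourceOrBuilder() exposes the nested message through its read-only view.
    DataSourceOrBuilder dataSource = definition.getDataSourceOrBuilder();
    // getDataSourceId() is assumed from the v1 DataSource proto's data_source_id field.
    System.out.println("data_source_id: " + dataSource.getDataSourceId());
  }
}
```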
getDisabled()
public abstract boolean getDisabled()
Is the data source disabled? If true, data_source is not visible, and the API will also stop returning any data transfer configs and/or runs associated with the data source. This setting has a higher priority than whitelisted_project_ids.
bool disabled = 5;
Returns

Type | Description
---|---
boolean |
getName()
public abstract String getName()
The resource name of the data source definition.
Data source definition names have the form
projects/{project_id}/locations/{location}/dataSourceDefinitions/{data_source_id}.
string name = 21;
Returns

Type | Description
---|---
String |
getNameBytes()
public abstract ByteString getNameBytes()
The resource name of the data source definition.
Data source definition names have the form
projects/{project_id}/locations/{location}/dataSourceDefinitions/{data_source_id}.
string name = 21;
Returns

Type | Description
---|---
ByteString |
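
As an illustration of the documented name format only, a hypothetical helper (not a library API) that extracts the {data_source_id} segment from a resource name:

```java
public class DataSourceDefinitionNames {

  // Hypothetical helper: pulls the trailing {data_source_id} out of a name of the form
  // projects/{project_id}/locations/{location}/dataSourceDefinitions/{data_source_id}.
  static String dataSourceIdFromName(String name) {
    String marker = "/dataSourceDefinitions/";
    int idx = name.lastIndexOf(marker);
    if (idx < 0) {
      throw new IllegalArgumentException("unexpected resource name: " + name);
    }
    return name.substring(idx + marker.length());
  }

  public static void main(String[] args) {
    // Illustrative value only; prints "my_source".
    System.out.println(
        dataSourceIdFromName("projects/my-project/locations/us/dataSourceDefinitions/my_source"));
  }
}
```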
getRunTimeOffset()
public abstract Duration getRunTimeOffset()
The duration that should be added to schedule_time to calculate run_time when a job is scheduled. Only applicable for automatically scheduled transfer runs. Used to start a run early on a data source that supports continuous data refresh, to compensate for unknown time zone offsets. Use a negative number to start a run late for data sources that do not support continuous data refresh.
.google.protobuf.Duration run_time_offset = 16;
Returns

Type | Description
---|---
Duration |
getRunTimeOffsetOrBuilder()
public abstract DurationOrBuilder getRunTimeOffsetOrBuilder()
The duration that should be added to schedule_time to calculate run_time when a job is scheduled. Only applicable for automatically scheduled transfer runs. Used to start a run early on a data source that supports continuous data refresh, to compensate for unknown time zone offsets. Use a negative number to start a run late for data sources that do not support continuous data refresh.
.google.protobuf.Duration run_time_offset = 16;
Returns

Type | Description
---|---
DurationOrBuilder |
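
A hedged sketch of the run_time calculation described above, using the protobuf-java-util Timestamps and Durations helpers. The schedule time and offset values are illustrative; in practice the offset would come from getRunTimeOffset() when hasRunTimeOffset() returns true.

```java
import com.google.protobuf.Duration;
import com.google.protobuf.Timestamp;
import com.google.protobuf.util.Durations;
import com.google.protobuf.util.Timestamps;

public class RunTimeOffsetExample {
  public static void main(String[] args) throws Exception {
    // Illustrative schedule_time; in a real transfer this is assigned by the scheduler.
    Timestamp scheduleTime = Timestamps.parse("2024-01-01T00:00:00Z");

    // Illustrative offset of 3600 seconds; the real value would be read from
    // definition.getRunTimeOffset().
    Duration runTimeOffset = Durations.fromSeconds(3600);

    // run_time = schedule_time + run_time_offset, as the field comment describes.
    Timestamp runTime = Timestamps.add(scheduleTime, runTimeOffset);
    System.out.println("run_time: " + Timestamps.toString(runTime));
  }
}
```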
getServiceAccount()
public abstract String getServiceAccount()
When a service account is specified, BigQuery will share the created dataset with the given service account. This service account will also be eligible to perform status updates and message logging for data transfer runs for the corresponding data_source_id.
string service_account = 2;
Returns

Type | Description
---|---
String |
getServiceAccountBytes()
public abstract ByteString getServiceAccountBytes()
When a service account is specified, BigQuery will share the created dataset with the given service account. This service account will also be eligible to perform status updates and message logging for data transfer runs for the corresponding data_source_id.
string service_account = 2;
Returns

Type | Description
---|---
ByteString |
getSupportEmail()
public abstract String getSupportEmail()
Support e-mail address of the OAuth client's Brand, which contains the consent screen data.
string support_email = 22;
Returns

Type | Description
---|---
String |
getSupportEmailBytes()
public abstract ByteString getSupportEmailBytes()
Support e-mail address of the OAuth client's Brand, which contains the consent screen data.
string support_email = 22;
Returns

Type | Description
---|---
ByteString |
getSupportedLocationIds(int index)
public abstract String getSupportedLocationIds(int index)
Supported location_ids, used to decide in which locations Pub/Sub topics need to be created. If custom Pub/Sub topics are used and they contain '{location}', the location_ids are used to validate the topics by replacing '{location}' with each location in the list. The valid values are the "location_id" field of the response of GET https://bigquerydatatransfer.googleapis.com/v1/{name=projects/*}/locations
In addition, if the data source needs to support all available regions, supported_location_ids can be set to "global" (a single string element). When "global" is specified:
1) the data source implementation is expected to stage the data in the proper region of the destination dataset;
2) the data source developer should be aware of the implications (e.g., network traffic latency, potential charges associated with cross-region traffic, etc.) of supporting the "global" region.
repeated string supported_location_ids = 23;
Parameter

Name | Description
---|---
index | int

Returns

Type | Description
---|---
String |
getSupportedLocationIdsBytes(int index)
public abstract ByteString getSupportedLocationIdsBytes(int index)
Supported location_ids, used to decide in which locations Pub/Sub topics need to be created. If custom Pub/Sub topics are used and they contain '{location}', the location_ids are used to validate the topics by replacing '{location}' with each location in the list. The valid values are the "location_id" field of the response of GET https://bigquerydatatransfer.googleapis.com/v1/{name=projects/*}/locations
In addition, if the data source needs to support all available regions, supported_location_ids can be set to "global" (a single string element). When "global" is specified:
1) the data source implementation is expected to stage the data in the proper region of the destination dataset;
2) the data source developer should be aware of the implications (e.g., network traffic latency, potential charges associated with cross-region traffic, etc.) of supporting the "global" region.
repeated string supported_location_ids = 23;
Parameter

Name | Description
---|---
index | int

Returns

Type | Description
---|---
ByteString |
getSupportedLocationIdsCount()
public abstract int getSupportedLocationIdsCount()
Supported location_ids, used to decide in which locations Pub/Sub topics need to be created. If custom Pub/Sub topics are used and they contain '{location}', the location_ids are used to validate the topics by replacing '{location}' with each location in the list. The valid values are the "location_id" field of the response of GET https://bigquerydatatransfer.googleapis.com/v1/{name=projects/*}/locations
In addition, if the data source needs to support all available regions, supported_location_ids can be set to "global" (a single string element). When "global" is specified:
1) the data source implementation is expected to stage the data in the proper region of the destination dataset;
2) the data source developer should be aware of the implications (e.g., network traffic latency, potential charges associated with cross-region traffic, etc.) of supporting the "global" region.
repeated string supported_location_ids = 23;
Returns

Type | Description
---|---
int |
getSupportedLocationIdsList()
public abstract List<String> getSupportedLocationIdsList()
Supported location_ids, used to decide in which locations Pub/Sub topics need to be created. If custom Pub/Sub topics are used and they contain '{location}', the location_ids are used to validate the topics by replacing '{location}' with each location in the list. The valid values are the "location_id" field of the response of GET https://bigquerydatatransfer.googleapis.com/v1/{name=projects/*}/locations
In addition, if the data source needs to support all available regions, supported_location_ids can be set to "global" (a single string element). When "global" is specified:
1) the data source implementation is expected to stage the data in the proper region of the destination dataset;
2) the data source developer should be aware of the implications (e.g., network traffic latency, potential charges associated with cross-region traffic, etc.) of supporting the "global" region.
repeated string supported_location_ids = 23;
Returns

Type | Description
---|---
List<String> |
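
A hedged sketch of the '{location}' substitution described above. The topic template, the location values, and the helper itself are illustrative only and do not reflect the service's actual validation code; the location list would normally come from getSupportedLocationIdsList().

```java
import java.util.List;

public class SupportedLocationsExample {

  // Expands a custom Pub/Sub topic template against each supported location id,
  // mirroring the "{location}" substitution described in the field comment.
  static void printExpandedTopics(String topicTemplate, List<String> supportedLocationIds) {
    if (!topicTemplate.contains("{location}")) {
      // Without the placeholder, the template is used as-is.
      System.out.println("single topic: " + topicTemplate);
      return;
    }
    for (String locationId : supportedLocationIds) {
      System.out.println(topicTemplate.replace("{location}", locationId));
    }
  }

  public static void main(String[] args) {
    // Illustrative values only.
    printExpandedTopics(
        "projects/connector/topics/scheduler-{location}",
        List.of("us", "eu", "asia-northeast1"));
  }
}
```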
getTransferConfigPubsubTopic()
public abstract String getTransferConfigPubsubTopic()
The Pub/Sub topic to use for broadcasting a message for a transfer config. If empty, no message will be broadcast. Both this topic and transfer_run_pubsub_topic are auto-generated if neither is provided when creating the definition. It is recommended to provide transfer_config_pubsub_topic if a user-owned transfer_run_pubsub_topic is provided; otherwise, it will be set to empty. If "{location}" is found in the value, it means the data source wants to handle messages separately for datasets in different regions, and "{location}" will be replaced with the actual dataset location to form the actual topic name. For example, projects/connector/topics/scheduler-{location} could become projects/connector/topics/scheduler-us. If "{location}" is not found, the input value is used as the topic name.
string transfer_config_pubsub_topic = 12;
Returns

Type | Description
---|---
String |
getTransferConfigPubsubTopicBytes()
public abstract ByteString getTransferConfigPubsubTopicBytes()
The Pub/Sub topic to use for broadcasting a message for a transfer config. If empty, no message will be broadcast. Both this topic and transfer_run_pubsub_topic are auto-generated if neither is provided when creating the definition. It is recommended to provide transfer_config_pubsub_topic if a user-owned transfer_run_pubsub_topic is provided; otherwise, it will be set to empty. If "{location}" is found in the value, it means the data source wants to handle messages separately for datasets in different regions, and "{location}" will be replaced with the actual dataset location to form the actual topic name. For example, projects/connector/topics/scheduler-{location} could become projects/connector/topics/scheduler-us. If "{location}" is not found, the input value is used as the topic name.
string transfer_config_pubsub_topic = 12;
Returns

Type | Description
---|---
ByteString |
getTransferRunPubsubTopic()
public abstract String getTransferRunPubsubTopic()
The Pub/Sub topic to be used for broadcasting a message when a transfer run is created. Both this topic and transfer_config_pubsub_topic can be set to a custom topic. By default, both topics are auto-generated if neither is provided when creating the definition. However, if one topic is manually set, the other topic has to be manually set as well. The only difference is that transfer_run_pubsub_topic must be a non-empty Pub/Sub topic, while transfer_config_pubsub_topic can be set to empty. The comments about "{location}" for transfer_config_pubsub_topic apply here too.
string transfer_run_pubsub_topic = 13;
Returns

Type | Description
---|---
String |
getTransferRunPubsubTopicBytes()
public abstract ByteString getTransferRunPubsubTopicBytes()
The Pub/Sub topic to be used for broadcasting a message when a transfer run is created. Both this topic and transfer_config_pubsub_topic can be set to a custom topic. By default, both topics are auto-generated if neither is provided when creating the definition. However, if one topic is manually set, the other topic has to be manually set as well. The only difference is that transfer_run_pubsub_topic must be a non-empty Pub/Sub topic, while transfer_config_pubsub_topic can be set to empty. The comments about "{location}" for transfer_config_pubsub_topic apply here too.
string transfer_run_pubsub_topic = 13;
Returns

Type | Description
---|---
ByteString |
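
A hedged builder sketch of the consistency rule described above (a user-owned transfer_run_pubsub_topic accompanied by an explicit transfer_config_pubsub_topic), assuming the standard protobuf-generated setters on DataSourceDefinition.Builder for these fields; the topic names are illustrative.

```java
import com.google.cloud.bigquery.datatransfer.v1.DataSourceDefinition;

public class CustomTopicsExample {

  static DataSourceDefinition withCustomTopics() {
    // Per the field comments: if transfer_run_pubsub_topic is user-owned,
    // transfer_config_pubsub_topic should be provided as well (it may be empty,
    // while transfer_run_pubsub_topic must be non-empty).
    return DataSourceDefinition.newBuilder()
        // "{location}" is replaced with the dataset location to form the real topic name.
        .setTransferRunPubsubTopic("projects/connector/topics/run-{location}")
        .setTransferConfigPubsubTopic("projects/connector/topics/config-{location}")
        .build();
  }
}
```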
hasDataSource()
public abstract boolean hasDataSource()
Data source metadata.
.google.cloud.bigquery.datatransfer.v1.DataSource data_source = 1;
Returns

Type | Description
---|---
boolean |
hasRunTimeOffset()
public abstract boolean hasRunTimeOffset()
The duration that should be added to schedule_time to calculate run_time when a job is scheduled. Only applicable for automatically scheduled transfer runs. Used to start a run early on a data source that supports continuous data refresh, to compensate for unknown time zone offsets. Use a negative number to start a run late for data sources that do not support continuous data refresh.
.google.protobuf.Duration run_time_offset = 16;
Returns

Type | Description
---|---
boolean |
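
Finally, a hedged usage sketch: following the standard protobuf codegen pattern, both the generated DataSourceDefinition message and DataSourceDefinition.Builder are expected to implement this interface, so read-only code can accept either. The class name and field values below are illustrative, and only getters documented on this page are used.

```java
import com.google.cloud.bigquery.datatransfer.v1.DataSourceDefinition;
import com.google.cloud.bigquery.datatransfer.v1.DataSourceDefinitionOrBuilder;

public class OrBuilderUsage {

  // Works for both a built DataSourceDefinition and a DataSourceDefinition.Builder.
  static void describe(DataSourceDefinitionOrBuilder definition) {
    System.out.println("name: " + definition.getName());
    System.out.println("disabled: " + definition.getDisabled());
    System.out.println("locations: " + definition.getSupportedLocationIdsList());
    if (definition.hasRunTimeOffset()) {
      System.out.println("run_time_offset seconds: " + definition.getRunTimeOffset().getSeconds());
    }
  }

  public static void main(String[] args) {
    DataSourceDefinition.Builder builder =
        DataSourceDefinition.newBuilder()
            .setName("projects/p/locations/us/dataSourceDefinitions/example"); // illustrative
    describe(builder);          // inspect the builder in place
    describe(builder.build());  // or the immutable message
  }
}
```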