public static final class DataProfileAction.Export.Builder extends GeneratedMessageV3.Builder<DataProfileAction.Export.Builder> implements DataProfileAction.ExportOrBuilder
If set, the detailed data profiles will be persisted to the location
of your choice whenever updated.
Protobuf type google.privacy.dlp.v2.DataProfileAction.Export
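A minimal usage sketch of this builder, assuming placeholder project, dataset, and table IDs (not taken from this reference): construct a BigQueryTable, set it as the profile table, and build the immutable Export message.

```java
import com.google.privacy.dlp.v2.BigQueryTable;
import com.google.privacy.dlp.v2.DataProfileAction;

public class ExportBuilderSketch {
  public static void main(String[] args) {
    // Placeholder IDs; substitute your own project, dataset, and table.
    BigQueryTable profileTable =
        BigQueryTable.newBuilder()
            .setProjectId("my-project")
            .setDatasetId("sensitive_data_protection_discovery")
            .setTableId("discovery_profiles")
            .build();

    // Build the Export action that persists profiles to that table.
    DataProfileAction.Export export =
        DataProfileAction.Export.newBuilder()
            .setProfileTable(profileTable)
            .build();

    System.out.println(export);
  }
}
```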
Inherited Members
com.google.protobuf.GeneratedMessageV3.Builder.getUnknownFieldSetBuilder()
com.google.protobuf.GeneratedMessageV3.Builder.internalGetMapFieldReflection(int)
com.google.protobuf.GeneratedMessageV3.Builder.internalGetMutableMapFieldReflection(int)
com.google.protobuf.GeneratedMessageV3.Builder.mergeUnknownLengthDelimitedField(int,com.google.protobuf.ByteString)
com.google.protobuf.GeneratedMessageV3.Builder.mergeUnknownVarintField(int,int)
com.google.protobuf.GeneratedMessageV3.Builder.parseUnknownField(com.google.protobuf.CodedInputStream,com.google.protobuf.ExtensionRegistryLite,int)
com.google.protobuf.GeneratedMessageV3.Builder.setUnknownFieldSetBuilder(com.google.protobuf.UnknownFieldSet.Builder)
Static Methods
getDescriptor()
public static final Descriptors.Descriptor getDescriptor()
Methods
addRepeatedField(Descriptors.FieldDescriptor field, Object value)
public DataProfileAction.Export.Builder addRepeatedField(Descriptors.FieldDescriptor field, Object value)
Overrides
build()
public DataProfileAction.Export build()
buildPartial()
public DataProfileAction.Export buildPartial()
clear()
public DataProfileAction.Export.Builder clear()
Overrides
clearField(Descriptors.FieldDescriptor field)
public DataProfileAction.Export.Builder clearField(Descriptors.FieldDescriptor field)
Overrides
clearOneof(Descriptors.OneofDescriptor oneof)
public DataProfileAction.Export.Builder clearOneof(Descriptors.OneofDescriptor oneof)
Overrides
clearProfileTable()
public DataProfileAction.Export.Builder clearProfileTable()
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
clearSampleFindingsTable()
public DataProfileAction.Export.Builder clearSampleFindingsTable()
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
clone()
public DataProfileAction.Export.Builder clone()
Overrides
getDefaultInstanceForType()
public DataProfileAction.Export getDefaultInstanceForType()
getDescriptorForType()
public Descriptors.Descriptor getDescriptorForType()
Overrides
getProfileTable()
public BigQueryTable getProfileTable()
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
getProfileTableBuilder()
public BigQueryTable.Builder getProfileTableBuilder()
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
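A short sketch of the nested-builder pattern this method enables; the project, dataset, and table IDs are hypothetical placeholders.

```java
import com.google.privacy.dlp.v2.DataProfileAction;

public class NestedBuilderSketch {
  public static void main(String[] args) {
    DataProfileAction.Export.Builder exportBuilder = DataProfileAction.Export.newBuilder();

    // getProfileTableBuilder() exposes a mutable builder for the nested
    // BigQueryTable message; changes made here are picked up when the
    // parent builder is built. IDs below are placeholders.
    exportBuilder.getProfileTableBuilder()
        .setProjectId("my-project")
        .setDatasetId("dlp_profiles")
        .setTableId("discovery_profiles");

    DataProfileAction.Export export = exportBuilder.build();
    System.out.println(export.getProfileTable().getTableId());
  }
}
```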
getProfileTableOrBuilder()
public BigQueryTableOrBuilder getProfileTableOrBuilder()
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
getSampleFindingsTable()
public BigQueryTable getSampleFindingsTable()
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
getSampleFindingsTableBuilder()
public BigQueryTable.Builder getSampleFindingsTableBuilder()
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
getSampleFindingsTableOrBuilder()
public BigQueryTableOrBuilder getSampleFindingsTableOrBuilder()
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
hasProfileTable()
public boolean hasProfileTable()
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
Returns
Type: boolean
Description: Whether the profileTable field is set.
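A sketch of the presence check, assuming a default (empty) Export message; the printed strings are illustrative only.

```java
import com.google.privacy.dlp.v2.BigQueryTable;
import com.google.privacy.dlp.v2.DataProfileAction;

public class HasProfileTableSketch {
  public static void main(String[] args) {
    DataProfileAction.Export export = DataProfileAction.Export.newBuilder().build();

    // profile_table is a message field, so protobuf tracks its presence.
    // getProfileTable() on an unset field returns the default instance
    // rather than null, which is why the has* check is the reliable test.
    if (export.hasProfileTable()) {
      BigQueryTable table = export.getProfileTable();
      System.out.println("Profiles export to: " + table.getTableId());
    } else {
      System.out.println("No profile table configured.");
    }
  }
}
```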
hasSampleFindingsTable()
public boolean hasSampleFindingsTable()
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
Returns
Type: boolean
Description: Whether the sampleFindingsTable field is set.
internalGetFieldAccessorTable()
protected GeneratedMessageV3.FieldAccessorTable internalGetFieldAccessorTable()
Overrides
isInitialized()
public final boolean isInitialized()
Overrides
mergeFrom(DataProfileAction.Export other)
public DataProfileAction.Export.Builder mergeFrom(DataProfileAction.Export other)
mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)
public DataProfileAction.Export.Builder mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)
Overrides
mergeFrom(Message other)
public DataProfileAction.Export.Builder mergeFrom(Message other)
Parameter
Name: other
Type: Message
Overrides
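A sketch of copying an existing Export configuration into a fresh builder via mergeFrom; the IDs are placeholders.

```java
import com.google.privacy.dlp.v2.BigQueryTable;
import com.google.privacy.dlp.v2.DataProfileAction;

public class MergeFromSketch {
  public static void main(String[] args) {
    // An existing Export configuration (placeholder IDs).
    DataProfileAction.Export existing =
        DataProfileAction.Export.newBuilder()
            .setProfileTable(
                BigQueryTable.newBuilder()
                    .setProjectId("my-project")
                    .setDatasetId("dlp_profiles")
                    .setTableId("discovery_profiles"))
            .build();

    // mergeFrom copies every set field from the source message into the
    // builder, which can then be adjusted before building a new message.
    DataProfileAction.Export copy =
        DataProfileAction.Export.newBuilder()
            .mergeFrom(existing)
            .build();

    System.out.println(copy.equals(existing)); // true
  }
}
```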
mergeProfileTable(BigQueryTable value)
public DataProfileAction.Export.Builder mergeProfileTable(BigQueryTable value)
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
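A sketch of standard protobuf merge semantics for this field, with placeholder IDs: fields set on the argument replace the corresponding fields already on the builder, while fields left unset in the argument are preserved.

```java
import com.google.privacy.dlp.v2.BigQueryTable;
import com.google.privacy.dlp.v2.DataProfileAction;

public class MergeProfileTableSketch {
  public static void main(String[] args) {
    DataProfileAction.Export.Builder builder =
        DataProfileAction.Export.newBuilder()
            .setProfileTable(
                BigQueryTable.newBuilder()
                    .setProjectId("my-project")
                    .setDatasetId("dlp_profiles"));

    // mergeProfileTable() folds the given message into the value already on
    // the builder: fields set in the argument replace the existing ones,
    // while fields left unset in the argument are kept.
    builder.mergeProfileTable(
        BigQueryTable.newBuilder().setTableId("discovery_profiles").build());

    // projectId and datasetId from the first call survive the merge.
    System.out.println(builder.getProfileTable());
  }
}
```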
mergeSampleFindingsTable(BigQueryTable value)
public DataProfileAction.Export.Builder mergeSampleFindingsTable(BigQueryTable value)
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
mergeUnknownFields(UnknownFieldSet unknownFields)
public final DataProfileAction.Export.Builder mergeUnknownFields(UnknownFieldSet unknownFields)
Overrides
setField(Descriptors.FieldDescriptor field, Object value)
public DataProfileAction.Export.Builder setField(Descriptors.FieldDescriptor field, Object value)
Overrides
setProfileTable(BigQueryTable value)
public DataProfileAction.Export.Builder setProfileTable(BigQueryTable value)
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
setProfileTable(BigQueryTable.Builder builderForValue)
public DataProfileAction.Export.Builder setProfileTable(BigQueryTable.Builder builderForValue)
Store all profiles to BigQuery.
- The system will create a new dataset and table for you if none are
provided. The dataset will be named
sensitive_data_protection_discovery
and the table will be named
discovery_profiles. This table will be placed in the same project as
the container project running the scan. After the first profile is
generated and the dataset and table are created, the discovery scan
configuration will be updated with the dataset and table names.
- See Analyze data profiles stored in
BigQuery.
- See Sample queries for your BigQuery
table.
- Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the
profile has finished.
- The Pub/Sub notification is sent before the streaming buffer is
guaranteed to be written, so data may not be instantly
visible to queries by the time your topic receives the Pub/Sub
notification.
- The best practice is to use the same table for an entire organization
so that you can take advantage of the provided Looker
reports.
If you use VPC Service Controls to define security perimeters, then
you must use a separate table for each boundary.
.google.privacy.dlp.v2.BigQueryTable profile_table = 1;
setRepeatedField(Descriptors.FieldDescriptor field, int index, Object value)
public DataProfileAction.Export.Builder setRepeatedField(Descriptors.FieldDescriptor field, int index, Object value)
Overrides
setSampleFindingsTable(BigQueryTable value)
public DataProfileAction.Export.Builder setSampleFindingsTable(BigQueryTable value)
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
setSampleFindingsTable(BigQueryTable.Builder builderForValue)
public DataProfileAction.Export.Builder setSampleFindingsTable(BigQueryTable.Builder builderForValue)
Store sample data profile
findings in an existing table
or a new table in an existing dataset. Each regeneration will result in
new rows in BigQuery. Data is inserted using streaming
insert
and so data may be in the buffer for a period of time after the profile
has finished.
.google.privacy.dlp.v2.BigQueryTable sample_findings_table = 2;
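A sketch using the Builder overload, with placeholder IDs, so no intermediate build() call is needed for the nested table.

```java
import com.google.privacy.dlp.v2.BigQueryTable;
import com.google.privacy.dlp.v2.DataProfileAction;

public class SampleFindingsTableSketch {
  public static void main(String[] args) {
    // The Builder overload accepts the nested builder directly; the
    // project, dataset, and table IDs below are placeholders.
    DataProfileAction.Export export =
        DataProfileAction.Export.newBuilder()
            .setSampleFindingsTable(
                BigQueryTable.newBuilder()
                    .setProjectId("my-project")
                    .setDatasetId("dlp_findings")
                    .setTableId("sample_findings"))
            .build();

    System.out.println(export.getSampleFindingsTable());
  }
}
```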
setUnknownFields(UnknownFieldSet unknownFields)
public final DataProfileAction.Export.Builder setUnknownFields(UnknownFieldSet unknownFields)
Overrides