Reference documentation and code samples for the Dataplex V1 API class Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec.
Job specification for a metadata import job.
You can run the following kinds of metadata import jobs:
- Full sync of entries with incremental import of their aspects. Supported for custom entries.
- Incremental import of aspects only. Supported for aspects that belong to custom entries and system entries. For custom entries, you can modify both optional aspects and required aspects. For system entries, you can modify optional aspects.
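As an illustrative sketch only, the two job kinds above correspond to choices of entry and aspect sync mode. The attribute names below come from this page, but the enum value symbols (`:FULL`, `:INCREMENTAL`) and the bucket URI are assumptions; the generated message class accepts a hash of keyword arguments like this one.

```ruby
# A full sync of entries with incremental import of their aspects,
# expressed as the keyword arguments the ImportJobSpec message accepts.
import_job_spec = {
  entry_sync_mode:    :FULL,        # full sync of entries (custom entries only)
  aspect_sync_mode:   :INCREMENTAL, # incremental import of aspects
  log_level:          :INFO,
  source_storage_uri: "gs://my-bucket/metadata-import/" # hypothetical bucket
}
```

For an aspects-only job, you would instead pair an incremental aspect sync mode with an entry sync mode that leaves entries untouched.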
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
Methods
#aspect_sync_mode
def aspect_sync_mode() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for aspects.
#aspect_sync_mode=
def aspect_sync_mode=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for aspects.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for aspects.
#entry_sync_mode
def entry_sync_mode() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for entries.
#entry_sync_mode=
def entry_sync_mode=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for entries.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::SyncMode) — Required. The sync mode for entries.
#log_level
def log_level() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel) — Optional. The level of logs to write to Cloud Logging for this job. Debug-level logs provide highly detailed information for troubleshooting, but their increased verbosity could incur additional costs that might not be merited for all jobs. If unspecified, defaults to INFO.
#log_level=
def log_level=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel) — Optional. The level of logs to write to Cloud Logging for this job. Debug-level logs provide highly detailed information for troubleshooting, but their increased verbosity could incur additional costs that might not be merited for all jobs. If unspecified, defaults to INFO.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::LogLevel) — Optional. The level of logs to write to Cloud Logging for this job. Debug-level logs provide highly detailed information for troubleshooting, but their increased verbosity could incur additional costs that might not be merited for all jobs. If unspecified, defaults to INFO.
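The documented default can be sketched in plain Ruby; the helper name is hypothetical and the enum value symbols are assumptions based on this page.

```ruby
# Sketch of the documented fallback: when no log level is specified,
# the job logs at INFO.
def effective_log_level(requested = nil)
  requested || :INFO
end
```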
#scope
def scope() -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope) — Required. A boundary on the scope of impact that the metadata import job can have.
#scope=
def scope=(value) -> ::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope
- value (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope) — Required. A boundary on the scope of impact that the metadata import job can have.
- (::Google::Cloud::Dataplex::V1::MetadataJob::ImportJobSpec::ImportJobScope) — Required. A boundary on the scope of impact that the metadata import job can have.
#source_create_time
def source_create_time() -> ::Google::Protobuf::Timestamp
- (::Google::Protobuf::Timestamp) — Optional. The time when the process that created the metadata import files began.
#source_create_time=
def source_create_time=(value) -> ::Google::Protobuf::Timestamp
- value (::Google::Protobuf::Timestamp) — Optional. The time when the process that created the metadata import files began.
- (::Google::Protobuf::Timestamp) — Optional. The time when the process that created the metadata import files began.
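As a hypothetical sketch, a `Google::Protobuf::Timestamp` field like this one can be populated from a Ruby `Time` via a hash with `:seconds` and `:nanos` keys, the standard well-known-type shape accepted by generated protobuf messages.

```ruby
# Build the timestamp hash from the time the import-file generation began.
started_at = Time.utc(2024, 6, 1, 12, 0, 0)
source_create_time = { seconds: started_at.to_i, nanos: started_at.nsec }
```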
#source_storage_uri
def source_storage_uri() -> ::String
- (::String) — Optional. The URI of a Cloud Storage bucket or folder (beginning with gs:// and ending with /) that contains the metadata import files for this job. A metadata import file defines the values to set for each of the entries and aspects in a metadata job. For more information about how to create a metadata import file and the file requirements, see Metadata import file.
You can provide multiple metadata import files in the same metadata job. The bucket or folder must contain at least one metadata import file, in JSON Lines format (either a .json or .jsonl file extension).
In FULL entry sync mode, don't save the metadata import file in a folder named SOURCE_STORAGE_URI/deletions/.
Caution: If the metadata import file contains no data, all entries and aspects that belong to the job's scope are deleted.
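A minimal sketch of the URI shape described above (begins with `gs://`, ends with `/`); the helper name is hypothetical and not part of the API.

```ruby
# Check the documented constraints on a source storage URI:
# it must be a gs:// bucket or folder path with a trailing slash.
def looks_like_source_storage_uri?(uri)
  uri.start_with?("gs://") && uri.end_with?("/")
end
```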
#source_storage_uri=
def source_storage_uri=(value) -> ::String
- value (::String) — Optional. The URI of a Cloud Storage bucket or folder (beginning with gs:// and ending with /) that contains the metadata import files for this job. A metadata import file defines the values to set for each of the entries and aspects in a metadata job. For more information about how to create a metadata import file and the file requirements, see Metadata import file.
You can provide multiple metadata import files in the same metadata job. The bucket or folder must contain at least one metadata import file, in JSON Lines format (either a .json or .jsonl file extension).
In FULL entry sync mode, don't save the metadata import file in a folder named SOURCE_STORAGE_URI/deletions/.
Caution: If the metadata import file contains no data, all entries and aspects that belong to the job's scope are deleted.
- (::String) — Optional. The URI of a Cloud Storage bucket or folder (beginning with gs:// and ending with /) that contains the metadata import files for this job. A metadata import file defines the values to set for each of the entries and aspects in a metadata job. For more information about how to create a metadata import file and the file requirements, see Metadata import file.
You can provide multiple metadata import files in the same metadata job. The bucket or folder must contain at least one metadata import file, in JSON Lines format (either a .json or .jsonl file extension).
In FULL entry sync mode, don't save the metadata import file in a folder named SOURCE_STORAGE_URI/deletions/.
Caution: If the metadata import file contains no data, all entries and aspects that belong to the job's scope are deleted.