Class Model (0.7.1)

Model(mapping=None, *, ignore_unknown_fields=False, **kwargs)

A trained machine learning Model.
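As a minimal sketch, a Model message can be constructed directly from keyword arguments; the display name, description, and Cloud Storage path below are placeholders:

    from google.cloud.aiplatform_v1beta1.types import Model

    model = Model(
        display_name="my-model",               # required, up to 128 UTF-8 characters
        description="Example classification model",
        artifact_uri="gs://my-bucket/model/",  # directory holding the model artifact
    )
    print(model.display_name)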

Attributes

Name Description
name str
The resource name of the Model.
display_name str
Required. The display name of the Model. The name can be up to 128 characters long and can consist of any UTF-8 characters.
description str
The description of the Model.
predict_schemata google.cloud.aiplatform_v1beta1.types.PredictSchemata
The schemata that describe formats of the Model's predictions and explanations as given and returned via ``PredictionService.Predict`` and ``PredictionService.Explain``.
metadata_schema_uri str
Immutable. Points to a YAML file stored on Google Cloud Storage describing additional information about the Model that is specific to it. Unset if the Model does not have any additional information. The schema is defined as an OpenAPI 3.0.2 Schema Object.
metadata google.protobuf.struct_pb2.Value
Immutable. Additional information about the Model; the schema of the metadata can be found in ``metadata_schema_uri``. Unset if the Model does not have any additional information.
supported_export_formats Sequence[google.cloud.aiplatform_v1beta1.types.Model.ExportFormat]
Output only. The formats in which this Model may be exported. If empty, this Model is not available for export.
training_pipeline str
Output only. The resource name of the TrainingPipeline that uploaded this Model, if any.
container_spec google.cloud.aiplatform_v1beta1.types.ModelContainerSpec
Input only. The specification of the container that is to be used when deploying this Model. The specification is ingested upon ``ModelService.UploadModel``, and all binaries it contains are copied and stored internally by AI Platform. Not present for AutoML Models. (See the upload sketch after this attribute list.)
artifact_uri str
Immutable. The path to the directory containing the Model artifact and any of its supporting files. Not present for AutoML Models.
supported_deployment_resources_types Sequence[google.cloud.aiplatform_v1beta1.types.Model.DeploymentResourcesType]
Output only. When this Model is deployed, its prediction resources are described by the ``prediction_resources`` field of the ``Endpoint.deployed_models`` object. Because not all Models support all resource configuration types, the configuration types this Model supports are listed here. If no configuration types are listed, the Model cannot be deployed to an ``Endpoint`` and does not support online predictions (``PredictionService.Predict`` or ``PredictionService.Explain``). Such a Model can serve predictions by using a ``BatchPredictionJob``, if it has at least one entry each in ``supported_input_storage_formats`` and ``supported_output_storage_formats``.
supported_input_storage_formats Sequence[str]
Output only. The formats this Model supports in ``BatchPredictionJob.input_config``. If ``PredictSchemata.instance_schema_uri`` exists, the instances should be given as per that schema. The possible formats are:
- ``jsonl`` The JSON Lines format, where each instance is a single line. Uses ``GcsSource``.
- ``csv`` The CSV format, where each instance is a single comma-separated line. The first line in the file is the header, containing comma-separated field names. Uses ``GcsSource``.
- ``tf-record`` The TFRecord format, where each instance is a single record in tfrecord syntax. Uses ``GcsSource``.
- ``tf-record-gzip`` Similar to ``tf-record``, but the file is gzipped. Uses ``GcsSource``.
- ``bigquery`` Each instance is a single row in BigQuery. Uses ``BigQuerySource``.
- ``file-list`` Each line of the file is the location of an instance to process. Uses the ``gcs_source`` field of the ``InputConfig`` object.
If this Model doesn't support any of these formats, it cannot be used with a ``BatchPredictionJob``. However, if it has ``supported_deployment_resources_types``, it could serve online predictions by using ``PredictionService.Predict`` or ``PredictionService.Explain``.
supported_output_storage_formats Sequence[str]
Output only. The formats this Model supports in ``BatchPredictionJob.output_config``. If both ``PredictSchemata.instance_schema_uri`` and ``PredictSchemata.prediction_schema_uri`` exist, the predictions are returned together with their instances. In other words, the prediction has the original instance data first, followed by the actual prediction content (as per the schema). The possible formats are:
- ``jsonl`` The JSON Lines format, where each prediction is a single line. Uses ``GcsDestination``.
- ``csv`` The CSV format, where each prediction is a single comma-separated line. The first line in the file is the header, containing comma-separated field names. Uses ``GcsDestination``.
- ``bigquery`` Each prediction is a single row in a BigQuery table. Uses ``BigQueryDestination``.
If this Model doesn't support any of these formats, it cannot be used with a ``BatchPredictionJob``. However, if it has ``supported_deployment_resources_types``, it could serve online predictions by using ``PredictionService.Predict`` or ``PredictionService.Explain``.
create_time google.protobuf.timestamp_pb2.Timestamp
Output only. Timestamp when this Model was uploaded into AI Platform.
update_time google.protobuf.timestamp_pb2.Timestamp
Output only. Timestamp when this Model was most recently updated.
deployed_models Sequence[google.cloud.aiplatform_v1beta1.types.DeployedModelRef]
Output only. The pointers to DeployedModels created from this Model. Note that the Model could have been deployed to Endpoints in different Locations.
explanation_spec google.cloud.aiplatform_v1beta1.types.ExplanationSpec
The default explanation specification for this Model. The Model can be used for [requesting explanation][PredictionService.Explain] after being ``deployed`` if it is populated. The Model can be used for [batch explanation][BatchPredictionJob.generate_explanation] if it is populated. All fields of the explanation_spec can be overridden by ``explanation_spec`` of ``DeployModelRequest.deployed_model``, or ``explanation_spec`` of ``BatchPredictionJob``. If the default explanation specification is not set for this Model, this Model can still be used for [requesting explanation][PredictionService.Explain] by setting ``explanation_spec`` of ``DeployModelRequest.deployed_model`` and for [batch explanation][BatchPredictionJob.generate_explanation] by setting ``explanation_spec`` of ``BatchPredictionJob``.
etag str
Used to perform consistent read-modify-write updates. If not set, a blind "overwrite" update happens.
labels Sequence[google.cloud.aiplatform_v1beta1.types.Model.LabelsEntry]
The labels with user-defined metadata to organize your Models. Label keys and values can be no longer than 64 characters (Unicode codepoints), can only contain lowercase letters, numeric characters, underscores and dashes. International characters are allowed. See https://goo.gl/xmQnxf for more information and examples of labels.
encryption_spec google.cloud.aiplatform_v1beta1.types.EncryptionSpec
Customer-managed encryption key spec for a Model. If set, this Model and all sub-resources of this Model will be secured by this key.
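
The sketch below shows how ``container_spec`` and ``artifact_uri`` might be supplied when uploading a custom-trained Model via ``ModelService.UploadModel``. It is illustrative only; the project, region, bucket, and serving-image names are placeholders, and the regional API endpoint is assumed:

    from google.cloud import aiplatform_v1beta1

    client = aiplatform_v1beta1.ModelServiceClient(
        client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
    )

    model = aiplatform_v1beta1.Model(
        display_name="my-model",
        artifact_uri="gs://my-bucket/model/",
        container_spec=aiplatform_v1beta1.ModelContainerSpec(
            image_uri="us-docker.pkg.dev/my-project/my-repo/my-serving-image",
        ),
    )

    # upload_model returns a long-running operation; result() blocks until done.
    operation = client.upload_model(
        parent="projects/my-project/locations/us-central1",
        model=model,
    )
    response = operation.result()
    print(response.model)  # resource name of the uploaded Model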

Inheritance

builtins.object > proto.message.Message > Model

Classes

DeploymentResourcesType

DeploymentResourcesType(value)

Identifies a type of the Model's prediction resources.
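
A brief sketch (assuming the v1beta1 enum member name ``DEDICATED_RESOURCES``) of inspecting what a retrieved Model supports:

    from google.cloud.aiplatform_v1beta1.types import Model

    def supports_online_prediction(model: Model) -> bool:
        # A Model with no supported deployment resource types cannot be
        # deployed to an Endpoint for online prediction.
        return len(model.supported_deployment_resources_types) > 0

    def supports_dedicated_resources(model: Model) -> bool:
        return (
            Model.DeploymentResourcesType.DEDICATED_RESOURCES
            in model.supported_deployment_resources_types
        )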

ExportFormat

ExportFormat(mapping=None, *, ignore_unknown_fields=False, **kwargs)

Represents an export format supported by the Model. All formats export to Google Cloud Storage.

LabelsEntry

LabelsEntry(mapping=None, *, ignore_unknown_fields=False, **kwargs)

The abstract base class for a message.

Parameters
Name Description
kwargs dict

Keys and values corresponding to the fields of the message.

mapping Union[dict, `.Message`]

A dictionary or message to be used to determine the values for this message.

ignore_unknown_fields Optional[bool]

If True, do not raise errors for unknown fields. Only applied if mapping is a mapping type or there are keyword parameters.
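
As a minimal sketch of these constructor parameters, the Model class itself (like any proto-plus message) accepts either a mapping or keyword arguments; the ``not_a_field`` key below is a deliberately unknown field used only to illustrate ``ignore_unknown_fields``:

    from google.cloud.aiplatform_v1beta1.types import Model

    # Construct from keyword arguments.
    model = Model(display_name="my-model")

    # Construct from a mapping; the unknown key is silently dropped because
    # ignore_unknown_fields=True (it would raise an error otherwise).
    same_model = Model(
        {"display_name": "my-model", "not_a_field": 123},
        ignore_unknown_fields=True,
    )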