Class EvaluationClassificationMetric (1.54.0)

EvaluationClassificationMetric(
    label_name: typing.Optional[str] = None,
    auPrc: typing.Optional[float] = None,
    auRoc: typing.Optional[float] = None,
    logLoss: typing.Optional[float] = None,
    confidenceMetrics: typing.Optional[
        typing.List[typing.Dict[str, typing.Any]]
    ] = None,
    confusionMatrix: typing.Optional[typing.Dict[str, typing.Any]] = None,
)

The evaluation metric response for a classification evaluation task.

Parameters

label_name str

Optional. The name of the label associated with the metrics. This is only returned when only_summary_metrics=False is passed to evaluate().

auPrc float

Optional. The area under the precision-recall curve.

auRoc float

Optional. The area under the receiver operating characteristic curve.

logLoss float

Optional. Logarithmic loss.

confidenceMetrics List[Dict[str, Any]]

Optional. Metrics computed at different confidence thresholds. This is only returned when only_summary_metrics=False is passed to evaluate().

confusionMatrix Dict[str, Any]

Optional. The confusion matrix for the classification task. This is only returned when only_summary_metrics=False is passed to evaluate(); see the sketch below.
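
The detailed fields above (label_name, confidenceMetrics, confusionMatrix) are only populated when only_summary_metrics=False is passed to evaluate(). A sketch reusing the model and task_spec from the previous example and assuming the same evaluate() signature; depending on the SDK version, a detailed classification evaluation may return a single metric or a list with one entry per label.

# Request per-threshold and per-label details in addition to the summary metrics.
detailed = model.evaluate(task_spec=task_spec, only_summary_metrics=False)

# Normalize the return value to a list so both shapes are handled.
for metric in (detailed if isinstance(detailed, list) else [detailed]):
    print("label:", metric.label_name)
    print("confusion matrix:", metric.confusionMatrix)
    # confidenceMetrics holds one dict per confidence threshold.
    for entry in metric.confidenceMetrics or []:
        print(entry)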

Properties

input_dataset_paths

The Google Cloud Storage paths to the dataset used for this evaluation.

task_name

The type of evaluation task for this evaluation.
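
A short illustration of reading these properties from a metric object returned by evaluate(), as in the sketches above; the printed values depend entirely on the evaluation run.

print(metric.task_name)            # type of evaluation task that was run
print(metric.input_dataset_paths)  # Cloud Storage paths of the evaluation dataset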