Vertex AI can evaluate models trained either through AutoML or through custom training. For the Google Cloud console guide, you need a trained model imported into the Vertex AI Model Registry.
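If the model is not yet in the Model Registry, it can be imported with the Python SDK. A minimal sketch follows; the project, artifact path, and prebuilt serving container are placeholders for illustration, not values from this guide:

```python
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Import (upload) a custom-trained model into the Vertex AI Model Registry.
# The artifact path and serving image below are assumed placeholders.
model = aiplatform.Model.upload(
    display_name="my-classification-model",
    artifact_uri="gs://my-bucket/model-artifacts/",
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest",
)
print(model.resource_name)  # projects/.../locations/us-central1/models/...
```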
Upload your test dataset to BigQuery or Cloud Storage. The test dataset must contain the ground truth, which is the actual result expected for an inference. Obtain the link to the file or the dataset ID.
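For example, a small test set with a ground-truth column can be staged in Cloud Storage with the google-cloud-storage client. The bucket, object path, and column name below are assumptions:

```python
from google.cloud import storage

client = storage.Client(project="my-project")
bucket = client.bucket("my-eval-bucket")

# test_set.csv is assumed to contain the feature columns the model expects
# plus a ground-truth column (for example, "species") with the expected results.
blob = bucket.blob("eval-data/test_set.csv")
blob.upload_from_filename("test_set.csv")

print(f"Test dataset uploaded to gs://{bucket.name}/{blob.name}")
```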
Select an Objective, such as classification or regression.
Enter the Evaluation target column name, which is the column from the training data that the model is trained to predict.
For Select source, select the source of your test dataset.
For BigQuery table, enter the BigQuery path.
For File on Cloud Storage, enter the Cloud Storage path.
For Batch prediction output, select an output format (one way to produce this output is sketched after these steps).
Enter the BigQuery path or Cloud Storage URI.
Click Start Evaluation.
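The batch prediction output referenced in the steps above can be produced ahead of time with the SDK's batch_predict call. The model ID, bucket paths, and formats here are assumptions, not values from this guide:

```python
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

model = aiplatform.Model("MODEL_ID")  # or the full model resource name

# Run a batch prediction over the staged test set; the evaluation reads this output.
batch_job = model.batch_predict(
    job_display_name="eval-batch-prediction",
    gcs_source="gs://my-eval-bucket/eval-data/test_set.csv",
    gcs_destination_prefix="gs://my-eval-bucket/batch-output/",
    instances_format="csv",
    predictions_format="jsonl",
)
batch_job.wait()
print(batch_job.output_info)  # where the prediction files were written
```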
Python
To view the Vertex AI API model evaluation workflow in Vertex AI Pipelines, see the example notebooks for the following model types:
AutoML tabular classification
AutoML tabular regression
AutoML video classification
Custom tabular classification
Custom tabular regression
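A typical way to run such a pipeline is to submit a compiled pipeline spec with aiplatform.PipelineJob. The template file, pipeline root, and parameter names below are placeholders for illustration and depend on the pipeline you compile, not on the notebooks' exact values:

```python
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# "evaluation_pipeline.json" stands in for a compiled evaluation pipeline spec;
# the parameter names depend on how that pipeline is defined.
job = aiplatform.PipelineJob(
    display_name="model-evaluation-run",
    template_path="evaluation_pipeline.json",
    pipeline_root="gs://my-eval-bucket/pipeline-root",
    parameter_values={
        "model_name": "projects/PROJECT_ID/locations/us-central1/models/MODEL_ID",
        "target_field_name": "species",
        "batch_predict_gcs_source_uris": ["gs://my-eval-bucket/eval-data/test_set.csv"],
    },
)
job.submit()  # runs asynchronously; use job.run() to block until completion
```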
Python SDK
The SDK for evaluating models with Vertex AI is experimental. To sign up for the experimental release, fill out the onboarding form. Vertex AI automatically sends an email notification when a model evaluation job is complete.
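Under the experimental SDK, evaluation is started directly from a Model object. The method name, arguments, and return behavior below are assumptions based on the preview and may differ from the current experimental surface:

```python
from google.cloud import aiplatform

aiplatform.init(
    project="my-project",
    location="us-central1",
    staging_bucket="gs://my-eval-bucket",
)

model = aiplatform.Model("MODEL_ID")

# Assumed experimental call: runs an evaluation pipeline against the test set.
eval_job = model.evaluate(
    prediction_type="classification",
    target_field_name="species",
    gcs_source_uris=["gs://my-eval-bucket/eval-data/test_set.csv"],
)

# Once the underlying pipeline completes, the resulting metrics are attached to
# the model and can be read back with the stable list_model_evaluations() call.
for evaluation in model.list_model_evaluations():
    print(evaluation.metrics)
```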
Compare evaluation metrics
You can compare evaluation results across different models, model versions, and evaluation jobs. For more information about model versioning, see Versioning in Model Registry.
You can only compare models of the same type, such as classification, regression, or forecasting. When comparing different models, all the model versions must be of the same type.
You can only compare five or fewer evaluations at a time.
Go to the Vertex AI Model Registry in the Google Cloud console.
Navigate to your model or model version:
To compare across different models, on the Models page, select the checkboxes next to the names of the models you want to compare.
To compare across model versions, click the name of your model on the Models page to open the list of versions, then select the checkboxes next to the versions you want to compare.
To compare across evaluation jobs for the same model version, click the name of your model, click the version number, then select the checkboxes next to the evaluation jobs you want to compare.
Click Compare.
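The same comparison can be assembled programmatically by pulling each version's evaluations with the SDK. The model and version IDs below are hypothetical:

```python
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Hypothetical versions of the same classification model; the console "Compare"
# button shows the equivalent metrics side by side.
version_names = ["1234567890@1", "1234567890@2"]

for name in version_names:
    model = aiplatform.Model(model_name=name)
    for evaluation in model.list_model_evaluations():
        print(name, evaluation.resource_name)
        print(evaluation.metrics)
```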
[[["Fácil de entender","easyToUnderstand","thumb-up"],["Meu problema foi resolvido","solvedMyProblem","thumb-up"],["Outro","otherUp","thumb-up"]],[["Difícil de entender","hardToUnderstand","thumb-down"],["Informações incorretas ou exemplo de código","incorrectInformationOrSampleCode","thumb-down"],["Não contém as informações/amostras de que eu preciso","missingTheInformationSamplesINeed","thumb-down"],["Problema na tradução","translationIssue","thumb-down"],["Outro","otherDown","thumb-down"]],["Última atualização 2025-09-04 UTC."],[],[],null,["# Evaluate models using Vertex AI\n\nThis page describes how to evaluate models using Vertex AI. For\nan overview, see [model evaluation in Vertex AI](/vertex-ai/docs/evaluation/introduction).\n\nPrerequisites\n-------------\n\n1. Follow the steps at [Set up a project and a development environment](/vertex-ai/docs/start/cloud-environment).\n In addition, enable the following services:\n\n - [Compute Engine API](https://console.cloud.google.com/flows/enableapi?apiid=compute.googleapis.com)\n - [Dataflow API](https://console.cloud.google.com/flows/enableapi?apiid=dataflow.googleapis.com)\n2. Vertex AI can evaluate models that are trained either\n through AutoML or custom training. For the Google Cloud console\n guide, you should have a trained model [imported to\n Vertex AI Model Registry](/vertex-ai/docs/model-registry/import-model).\n\n3. Upload your test dataset to [BigQuery](/bigquery/docs/loading-data) or\n [Cloud Storage](/storage/docs/uploading-objects). The test dataset should contain the ground\n truth, which is the actual result expected for an inference. Obtain the link\n to the file or the dataset ID.\n\n4. Have a [batch inference output](/vertex-ai/docs/predictions/batch-predictions) in the form of a\n BigQuery table or Cloud Storage URI.\n\n5. Make sure your [default Compute Engine service account](/iam/docs/service-account-types#default) has the\n following [IAM permissions](/vertex-ai/docs/general/iam-permissions):\n\n - Vertex AI Administrator (`aiplatform.admin`)\n - Vertex AI Service Agent (`aiplatform.serviceAgent`)\n - Storage Object Admin (`storage.objectAdmin`)\n - Dataflow Worker (`dataflow.worker`)\n - BigQuery Data Editor (`bigquery.dataEditor`) (only required if you are providing data in the form of BigQuery tables)\n\nCreate an evaluation\n--------------------\n\n### Console\n\n1. In the Google Cloud console, go to the Vertex AI Models page.\n\n [Go to the Models page](https://console.cloud.google.com/vertex-ai/models)\n2. Click the name of the model you want to evaluate.\n\n3. Click the version number for the model.\n\n4. On the **Evaluate** tab, click **Create Evaluation**.\n\n5. Enter an **Evaluation name**.\n\n6. Select an **Objective**, such as classification or regression.\n\n7. Enter the **Evaluation target column name**, which is the column from the\n training data that the model is trained to predict.\n\n8. For **Select source**, select the source for your test dataset.\n\n 1. For **BigQuery table** , enter the **BigQuery path**.\n\n 2. For **File on Cloud Storage** , enter the **Cloud Storage path**.\n\n9. For **Batch prediction output**, select an output format.\n\n 1. Enter the BigQuery path or Cloud Storage URI.\n10. 
Click **Start Evaluation**.\n\n### Python\n\nTo view the Vertex AI API model evaluation workflow in\nVertex AI Pipelines, see the example notebooks for the following model\ntypes:\n\n- [AutoML tabular classification](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/automl_tabular_classification_model_evaluation.ipynb)\n\n- [AutoML tabular regression](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/automl_tabular_regression_model_evaluation.ipynb)\n\n- [AutoML video classification](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/automl_video_classification_model_evaluation.ipynb)\n\n- [Custom tabular classification](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/custom_tabular_classification_model_evaluation.ipynb)\n\n- [Custom tabular regression](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/custom_tabular_regression_model_evaluation.ipynb)\n\n### Python SDK\n\nThe SDK for evaluating models with Vertex AI is in\nExperimental. To sign up for the Experimental, fill out the\n[onboarding form](https://docs.google.com/forms/d/159DJxDx8cQpsjwsNkS7j-qCwsz2uTDVwVQPv4ZfWM50/viewform?edit_requested=true).\n\nVertex AI automatically sends an email notification when\na model evaluation job is complete.\n\nView evaluation metrics\n-----------------------\n\n**Note:** For [BigQuery ML models](/bigquery/docs/model_eval) that are registered to Model Registry, Vertex AI only shows evaluation metrics for regression and binary classification models. \n\n### Console\n\n1. In the Google Cloud console, go to the Vertex AI Models page.\n\n [Go to the Models page](https://console.cloud.google.com/vertex-ai/models)\n2. Navigate to the model version.\n\n3. View metrics in the **Evaluate** tab.\n\n### Python\n\nTo view the Vertex AI API model evaluation workflow in\nVertex AI Pipelines, see the example notebooks for the following model\ntypes:\n\n- [AutoML tabular classification](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/automl_tabular_classification_model_evaluation.ipynb)\n\n- [AutoML tabular regression](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/automl_tabular_regression_model_evaluation.ipynb)\n\n- [AutoML video classification](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/automl_video_classification_model_evaluation.ipynb)\n\n- [Custom tabular classification](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/custom_tabular_classification_model_evaluation.ipynb)\n\n- [Custom tabular regression](https://colab.sandbox.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/model_evaluation/custom_tabular_regression_model_evaluation.ipynb)\n\n### Python SDK\n\nThe SDK for evaluating models with Vertex AI is in\nExperimental. 
To sign up for the Experimental, fill out the\n[onboarding form](https://docs.google.com/forms/d/159DJxDx8cQpsjwsNkS7j-qCwsz2uTDVwVQPv4ZfWM50/viewform?edit_requested=true).\n\nCompare evaluation metrics\n--------------------------\n\nYou can compare evaluation results across different models, model versions, and\nevaluation jobs. For more information about model versioning, see [Versioning in\nModel Registry](/vertex-ai/docs/model-registry/versioning).\n\nYou can only compare models of the same type, such as classification,\nregression, or forecasting. When comparing different models, all the\nmodel versions must be the same type.\n\nYou can only compare 5 or fewer evaluations at a time.\n\n1. Go to the Vertex AI Model Registry in the Google Cloud console:\n\n [Go to the Models page](https://console.cloud.google.com/vertex-ai/models)\n2. Navigate to your model or model version:\n\n - To compare across different models on the **Models** page, select the\n checkboxes next to the names of the models you want to compare.\n\n - To compare across different model versions:\n\n 1. Click the name of your model on the **Models** page to open the list\n of model versions.\n\n 2. Select the checkboxes next to the versions you want to compare.\n\n - To compare across evaluation jobs for the same model version:\n\n 1. Click the name of your model on the **Models** page to open the list\n of model versions.\n\n 2. Click the version number.\n\n 3. Select the checkboxes next to the evaluation jobs you want to compare.\n\n3. Click **Compare**.\n\nWhat's next\n-----------\n\n- Learn how to [iterate on your model](/vertex-ai/docs/training/evaluating-automl-models#iterate)."]]