Delete an online prediction model

Preview: Online Prediction is a Preview feature that is available as-is and is
not recommended for production environments. Google provides no service-level
agreements (SLA) or technical support commitments for Preview features. For
more information, see GDC's feature stages.
This page describes the process to delete an online prediction model and all the
resources associated with it.
Before you begin
To get the permissions that you need to access Online Prediction,
ask your Project IAM Admin to grant you the Vertex AI
Prediction User (`vertex-ai-prediction-user`) role. For information about this
role, see Prepare IAM permissions.

Additionally, to get the permissions that you need to delete objects in a
bucket, ask your Project IAM Admin to grant you the Project Bucket Object Admin
(`project-bucket-object-admin`) role in the project.
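How these roles are granted depends on how your organization administers IAM,
and the grant is performed by your Project IAM Admin rather than by you. Purely
as an illustrative sketch, assuming that the predefined role is exposed as a
Kubernetes ClusterRole and that project-level grants are expressed as
RoleBindings in the project namespace (assumptions not confirmed by this page),
such a grant might look like the following:

    # Illustrative sketch only. The granting mechanism, the namespace, and all
    # names here are assumptions; your Project IAM Admin follows the supported
    # IAM procedure for your environment.
    apiVersion: rbac.authorization.k8s.io/v1
    kind: RoleBinding
    metadata:
      name: vertex-ai-prediction-user-grant   # hypothetical binding name
      namespace: my-project                   # hypothetical project namespace
    roleRef:
      apiGroup: rbac.authorization.k8s.io
      kind: ClusterRole
      name: vertex-ai-prediction-user
    subjects:
    - apiGroup: rbac.authorization.k8s.io
      kind: User
      name: operator@example.com              # hypothetical user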
Delete resources
If you want to delete an online prediction model and all the resources
associated with it, perform the following steps:
1. Delete the `DeployedModel` custom resource associated with your model
   on the prediction cluster (for a sample invocation, see the first sketch
   after these steps):

       kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG delete -f DEPLOYED_MODEL_NAME.yaml

   Replace the following:

   - PREDICTION_CLUSTER_KUBECONFIG: the path to the kubeconfig file of the
     prediction cluster.
   - DEPLOYED_MODEL_NAME: the name of the `DeployedModel` definition file.

2. Edit the `Endpoint` custom resource in one of the following ways:

   - If the endpoint that the `DeployedModel` uses doesn't host other models,
     delete the `Endpoint` custom resource on the prediction cluster:

         kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG delete -f ENDPOINT_NAME.yaml

     Replace ENDPOINT_NAME with the name of the `Endpoint` definition file.

   - If the endpoint that the `DeployedModel` uses hosts other models, perform
     the following steps (the second sketch after these steps shows what the
     edited manifest might look like):

     1. Open the `Endpoint` custom resource for editing on the prediction
        cluster:

            kubectl --kubeconfig PREDICTION_CLUSTER_KUBECONFIG edit -f ENDPOINT_NAME.yaml

        Replace ENDPOINT_NAME with the name of the `Endpoint` definition file.

     2. In the YAML file, manually delete the `serviceRef` object that contains
        the reference to the `DeployedModel` you deleted in the previous step.

     3. Save the changes to the YAML file.

3. Delete your model from the storage bucket. For more information about how to
   delete objects from storage buckets, see Delete storage objects in projects.
   The final sketch after these steps shows one possible way to do this.
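The following is a sample invocation of the command in step 1. The kubeconfig
path and the definition file name are hypothetical placeholders; substitute the
values for your environment.

    # Hypothetical paths and file names; adjust them to your environment.
    kubectl --kubeconfig /home/user/prediction-cluster-kubeconfig.yaml \
        delete -f my-deployed-model.yaml

    # Optionally confirm the deletion; this is expected to return a
    # "not found" error once the resource is gone.
    kubectl --kubeconfig /home/user/prediction-cluster-kubeconfig.yaml \
        get -f my-deployed-model.yaml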
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-29 UTC."],[[["\u003cp\u003eOnline Prediction is a Preview feature not recommended for production environments and lacks service-level agreements or technical support.\u003c/p\u003e\n"],["\u003cp\u003eDeleting an online prediction model involves removing the associated \u003ccode\u003eDeployedModel\u003c/code\u003e custom resource from the prediction cluster using \u003ccode\u003ekubectl\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eDepending on whether the \u003ccode\u003eEndpoint\u003c/code\u003e hosts other models, you must either delete the entire \u003ccode\u003eEndpoint\u003c/code\u003e custom resource or edit it to remove the deleted \u003ccode\u003eDeployedModel\u003c/code\u003e's \u003ccode\u003eserviceRef\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eAfter removing the \u003ccode\u003eDeployedModel\u003c/code\u003e and adjusting the \u003ccode\u003eEndpoint\u003c/code\u003e, the final step is to delete the model from its storage bucket.\u003c/p\u003e\n"]]],[],null,["# Delete an online prediction model\n\n| **Preview:** Online Prediction is a Preview feature that is available as-is and is not recommended for production environments. Google provides no service-level agreements (SLA) or technical support commitments for Preview features. For more information, see GDC's [feature stages](/distributed-cloud/hosted/docs/latest/gdch/resources/feature-stages).\n\nThis page describes the process to delete an online prediction model and all the\nresources associated with it.\n\nBefore you begin\n----------------\n\nTo get the permissions that you need to access Online Prediction,\nask your Project IAM Admin to grant you the Vertex AI\nPrediction User (`vertex-ai-prediction-user`) role.\n\nFor information about this role, see\n[Prepare IAM permissions](/distributed-cloud/hosted/docs/latest/gdch/application/ao-user/vertex-ai-ao-permissions).\n\nAdditionally, to get the permissions that you need to delete objects in a\nbucket, ask your Project IAM Admin to grant you the Project Bucket Object Admin\n(`project-bucket-object-admin`) role in the project.\n\nDelete resources\n----------------\n\nIf you want to delete an online prediction model and all the resources\nassociated with it, perform the following steps:\n\n1. Delete the `DeployedModel` custom resource associated with your model\n on the prediction cluster:\n\n kubectl --kubeconfig \u003cvar translate=\"no\"\u003ePREDICTION_CLUSTER_KUBECONFIG\u003c/var\u003e delete -f \u003cvar translate=\"no\"\u003eDEPLOYED_MODEL_NAME\u003c/var\u003e.yaml\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003ePREDICTION_CLUSTER_KUBECONFIG\u003c/var\u003e: the path to the kubeconfig file in the prediction cluster.\n - \u003cvar translate=\"no\"\u003eDEPLOYED_MODEL_NAME\u003c/var\u003e: the name of the `DeployedModel` definition file.\n2. 
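As a rough sketch of step 3, assuming that the bucket is reachable through an
S3-compatible endpoint and that credentials for it are already configured
(assumptions not covered on this page), removing the model artifacts could look
like the following. Follow Delete storage objects in projects for the supported
procedure.

    # Rough sketch only: the endpoint URL, bucket name, and object prefix are
    # hypothetical. See "Delete storage objects in projects" for the supported
    # procedure.
    aws s3 rm s3://my-models-bucket/models/my-model/ \
        --recursive \
        --endpoint-url https://objectstorage.my-org.example.com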