(::Google::Cloud::AutoML::V1::GcsDestination) — Required. The Google Cloud Storage location where the model is to be
written to. This location may only be set for the following model
formats:
"tflite", "edgetpu_tflite", "tf_saved_model", "tf_js", "core_ml".
Under the directory given as the destination a new one with name
"model-export-
#gcs_destination=
def gcs_destination=(value) -> ::Google::Cloud::AutoML::V1::GcsDestination
Parameter
value (::Google::Cloud::AutoML::V1::GcsDestination) — Required. The Google Cloud Storage location where the model is to be
written to. This location may only be set for the following model
formats:
"tflite", "edgetpu_tflite", "tf_saved_model", "tf_js", "core_ml".
Under the directory given as the destination a new one with name
"model-export-
Returns
(::Google::Cloud::AutoML::V1::GcsDestination) — Required. The Google Cloud Storage location where the model is to be
written to. This location may only be set for the following model
formats:
"tflite", "edgetpu_tflite", "tf_saved_model", "tf_js", "core_ml".
Under the directory given as the destination a new one with name
"model-export-
#model_format
def model_format() -> ::String
Returns
(::String) —
The format in which the model must be exported. The available, and default,
formats depend on the problem and model type (if a given problem and type
combination doesn't have a format listed, it means its models are not
exportable):
For Image Classification mobile-low-latency-1, mobile-versatile-1,
mobile-high-accuracy-1:
"tflite" (default), "edgetpu_tflite", "tf_saved_model", "tf_js", "docker".
For Image Classification mobile-core-ml-low-latency-1,
mobile-core-ml-versatile-1, mobile-core-ml-high-accuracy-1:
"core_ml" (default).
For Image Object Detection mobile-low-latency-1, mobile-versatile-1,
mobile-high-accuracy-1:
"tflite", "tf_saved_model", "tf_js".
Formats description:
tflite - Used for Android mobile devices.
edgetpu_tflite - Used for Edge TPU (https://cloud.google.com/edge-tpu/) devices.
tf_saved_model - A TensorFlow model in SavedModel format.
tf_js - A TensorFlow.js (https://www.tensorflow.org/js) model that can
be used in the browser and in Node.js using JavaScript.
docker - Used for Docker containers. Use the params field to customize
the container. The container is verified to work correctly on the
Ubuntu 16.04 operating system. See more at the containers quickstart
(https://cloud.google.com/vision/automl/docs/containers-gcs-quickstart).
core_ml - Used for iOS mobile devices.
#model_format=
def model_format=(value) -> ::String
Parameter
value (::String) —
The format in which the model must be exported. The available, and default,
formats depend on the problem and model type (if a given problem and type
combination doesn't have a format listed, it means its models are not
exportable):
For Image Classification mobile-low-latency-1, mobile-versatile-1,
mobile-high-accuracy-1:
"tflite" (default), "edgetpu_tflite", "tf_saved_model", "tf_js", "docker".
For Image Classification mobile-core-ml-low-latency-1,
mobile-core-ml-versatile-1, mobile-core-ml-high-accuracy-1:
"core_ml" (default).
For Image Object Detection mobile-low-latency-1, mobile-versatile-1,
mobile-high-accuracy-1:
"tflite", "tf_saved_model", "tf_js".
Formats description:
tflite - Used for Android mobile devices.
edgetpu_tflite - Used for Edge TPU (https://cloud.google.com/edge-tpu/) devices.
tf_saved_model - A TensorFlow model in SavedModel format.
tf_js - A TensorFlow.js (https://www.tensorflow.org/js) model that can
be used in the browser and in Node.js using JavaScript.
docker - Used for Docker containers. Use the params field to customize
the container. The container is verified to work correctly on the
Ubuntu 16.04 operating system. See more at the containers quickstart
(https://cloud.google.com/vision/automl/docs/containers-gcs-quickstart).
core_ml - Used for iOS mobile devices.
Returns
(::String) —
The format in which the model must be exported. The available, and default,
formats depend on the problem and model type (if a given problem and type
combination doesn't have a format listed, it means its models are not
exportable):
For Image Classification mobile-low-latency-1, mobile-versatile-1,
mobile-high-accuracy-1:
"tflite" (default), "edgetpu_tflite", "tf_saved_model", "tf_js", "docker".
For Image Classification mobile-core-ml-low-latency-1,
mobile-core-ml-versatile-1, mobile-core-ml-high-accuracy-1:
"core_ml" (default).
For Image Object Detection mobile-low-latency-1, mobile-versatile-1,
mobile-high-accuracy-1:
"tflite", "tf_saved_model", "tf_js".
Formats description:
tflite - Used for Android mobile devices.
edgetpu_tflite - Used for Edge TPU (https://cloud.google.com/edge-tpu/) devices.
tf_saved_model - A TensorFlow model in SavedModel format.
tf_js - A TensorFlow.js (https://www.tensorflow.org/js) model that can
be used in the browser and in Node.js using JavaScript.
docker - Used for Docker containers. Use the params field to customize
the container. The container is verified to work correctly on the
Ubuntu 16.04 operating system. See more at the containers quickstart
(https://cloud.google.com/vision/automl/docs/containers-gcs-quickstart).
core_ml - Used for iOS mobile devices.
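Continuing the sketch from the #gcs_destination section, the #model_format= setter takes one of the format strings listed above; the value below assumes a model type that supports "tf_js".

# Assumes output_config was built as in the earlier sketch and that the
# model's problem and type support the "tf_js" format.
output_config.model_format = "tf_js"
output_config.model_format # => "tf_js"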
#params
def params() -> ::Google::Protobuf::Map{::String => ::String}
Returns
(::Google::Protobuf::Map{::String => ::String}) —
Additional model-type and format specific parameters describing the
requirements for the model files to be exported; any string must be up to
25000 characters long.
For "docker" format: cpu_architecture - (string) "x86_64" (default).
gpu_architecture - (string) "none" (default), "nvidia".
#params=
def params=(value) -> ::Google::Protobuf::Map{::String => ::String}
Parameter
value (::Google::Protobuf::Map{::String => ::String}) —
Additional model-type and format specific parameters describing the
requirements for the model files to be exported; any string must be up to
25000 characters long.
For "docker" format: cpu_architecture - (string) "x86_64" (default).
gpu_architecture - (string) "none" (default), "nvidia".
Returns
(::Google::Protobuf::Map{::String => ::String}) —
Additional model-type and format specific parameters describing the
requirements for the model files to be exported; any string must be up to
25000 characters long.
For "docker" format: cpu_architecture - (string) "x86_64" (default).
gpu_architecture - (string) "none" (default), "nvidia".
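A hedged sketch of a full export request using the "docker" format, assuming the generated AutoML client exposes export_model; the project, location, and model IDs are hypothetical placeholders.

require "google/cloud/automl/v1"

# For the "docker" format, params can pin the container architectures using
# the keys listed above.
output_config = Google::Cloud::AutoML::V1::ModelExportOutputConfig.new(
  gcs_destination: Google::Cloud::AutoML::V1::GcsDestination.new(
    output_uri_prefix: "gs://my-bucket/model-exports/"
  ),
  model_format: "docker",
  params: { "cpu_architecture" => "x86_64", "gpu_architecture" => "nvidia" }
)

# Assumption: the generated client's export_model accepts the model resource
# name and this output config, returning a long-running operation.
client = Google::Cloud::AutoML::V1::AutoML::Client.new
operation = client.export_model(
  name: "projects/my-project/locations/us-central1/models/MODEL_ID",
  output_config: output_config
)
operation.wait_until_done!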
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-28 UTC."],[],[],null,["# Cloud AutoML V1 API - Class Google::Cloud::AutoML::V1::ModelExportOutputConfig (v1.3.1)\n\nVersion latestkeyboard_arrow_down\n\n- [1.3.1 (latest)](/ruby/docs/reference/google-cloud-automl-v1/latest/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [1.3.0](/ruby/docs/reference/google-cloud-automl-v1/1.3.0/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [1.2.1](/ruby/docs/reference/google-cloud-automl-v1/1.2.1/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [1.1.0](/ruby/docs/reference/google-cloud-automl-v1/1.1.0/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [1.0.1](/ruby/docs/reference/google-cloud-automl-v1/1.0.1/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [0.10.0](/ruby/docs/reference/google-cloud-automl-v1/0.10.0/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [0.9.2](/ruby/docs/reference/google-cloud-automl-v1/0.9.2/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [0.8.0](/ruby/docs/reference/google-cloud-automl-v1/0.8.0/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [0.7.0](/ruby/docs/reference/google-cloud-automl-v1/0.7.0/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [0.6.0](/ruby/docs/reference/google-cloud-automl-v1/0.6.0/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [0.5.1](/ruby/docs/reference/google-cloud-automl-v1/0.5.1/Google-Cloud-AutoML-V1-ModelExportOutputConfig)\n- [0.4.8](/ruby/docs/reference/google-cloud-automl-v1/0.4.8/Google-Cloud-AutoML-V1-ModelExportOutputConfig) \nReference documentation and code samples for the Cloud AutoML V1 API class Google::Cloud::AutoML::V1::ModelExportOutputConfig.\n\nOutput configuration for ModelExport Action. \n\nInherits\n--------\n\n- Object \n\nExtended By\n-----------\n\n- Google::Protobuf::MessageExts::ClassMethods \n\nIncludes\n--------\n\n- Google::Protobuf::MessageExts\n\nMethods\n-------\n\n### #gcs_destination\n\n def gcs_destination() -\u003e ::Google::Cloud::AutoML::V1::GcsDestination\n\n**Returns**\n\n- ([::Google::Cloud::AutoML::V1::GcsDestination](./Google-Cloud-AutoML-V1-GcsDestination)) --- Required. The Google Cloud Storage location where the model is to be written to. This location may only be set for the following model formats: \"tflite\", \"edgetpu_tflite\", \"tf_saved_model\", \"tf_js\", \"core_ml\".\n\n\n Under the directory given as the destination a new one with name\n \"model-export-\n\n### #gcs_destination=\n\n def gcs_destination=(value) -\u003e ::Google::Cloud::AutoML::V1::GcsDestination\n\n**Parameter**\n\n- **value** ([::Google::Cloud::AutoML::V1::GcsDestination](./Google-Cloud-AutoML-V1-GcsDestination)) --- Required. The Google Cloud Storage location where the model is to be written to. This location may only be set for the following model formats: \"tflite\", \"edgetpu_tflite\", \"tf_saved_model\", \"tf_js\", \"core_ml\".\n\n\n Under the directory given as the destination a new one with name\n\"model-export- \n**Returns**\n\n- ([::Google::Cloud::AutoML::V1::GcsDestination](./Google-Cloud-AutoML-V1-GcsDestination)) --- Required. The Google Cloud Storage location where the model is to be written to. 
This location may only be set for the following model formats: \"tflite\", \"edgetpu_tflite\", \"tf_saved_model\", \"tf_js\", \"core_ml\".\n\n\n Under the directory given as the destination a new one with name\n \"model-export-\n\n### #model_format\n\n def model_format() -\u003e ::String\n\n**Returns**\n\n- (::String) --- The format in which the model must be exported. The available, and default,\n formats depend on the problem and model type (if given problem and type\n combination doesn't have a format listed, it means its models are not\n exportable):\n\n - For Image Classification mobile-low-latency-1, mobile-versatile-1,\n mobile-high-accuracy-1:\n \"tflite\" (default), \"edgetpu_tflite\", \"tf_saved_model\", \"tf_js\",\n \"docker\".\n\n - For Image Classification mobile-core-ml-low-latency-1,\n mobile-core-ml-versatile-1, mobile-core-ml-high-accuracy-1:\n \"core_ml\" (default).\n\n - For Image Object Detection mobile-low-latency-1, mobile-versatile-1,\n mobile-high-accuracy-1:\n \"tflite\", \"tf_saved_model\", \"tf_js\".\n Formats description:\n\n - tflite - Used for Android mobile devices.\n\n - edgetpu_tflite - Used for [Edge TPU](https://cloud.google.com/edge-tpu/)\n devices.\n\n - tf_saved_model - A tensorflow model in SavedModel format.\n\n - tf_js - A [TensorFlow.js](https://www.tensorflow.org/js) model that can\n be used in the browser and in Node.js using JavaScript.\n\n - docker - Used for Docker containers. Use the params field to customize\n the container. The container is verified to work correctly on\n ubuntu 16.04 operating system. See more at\n [containers\n quickstart](https://cloud.google.com/vision/automl/docs/containers-gcs-quickstart)\n\n - core_ml - Used for iOS mobile devices.\n\n### #model_format=\n\n def model_format=(value) -\u003e ::String\n\n**Parameter**\n\n- **value** (::String) ---\n\n The format in which the model must be exported. The available, and default,\n formats depend on the problem and model type (if given problem and type\n combination doesn't have a format listed, it means its models are not\n exportable):\n - For Image Classification mobile-low-latency-1, mobile-versatile-1,\n mobile-high-accuracy-1:\n \"tflite\" (default), \"edgetpu_tflite\", \"tf_saved_model\", \"tf_js\",\n \"docker\".\n\n - For Image Classification mobile-core-ml-low-latency-1,\n mobile-core-ml-versatile-1, mobile-core-ml-high-accuracy-1:\n \"core_ml\" (default).\n\n - For Image Object Detection mobile-low-latency-1, mobile-versatile-1,\n mobile-high-accuracy-1:\n \"tflite\", \"tf_saved_model\", \"tf_js\".\n Formats description:\n\n - tflite - Used for Android mobile devices.\n\n - edgetpu_tflite - Used for [Edge TPU](https://cloud.google.com/edge-tpu/)\n devices.\n\n - tf_saved_model - A tensorflow model in SavedModel format.\n\n - tf_js - A [TensorFlow.js](https://www.tensorflow.org/js) model that can\n be used in the browser and in Node.js using JavaScript.\n\n - docker - Used for Docker containers. Use the params field to customize\n the container. The container is verified to work correctly on\n ubuntu 16.04 operating system. See more at\n [containers\n quickstart](https://cloud.google.com/vision/automl/docs/containers-gcs-quickstart)\n\n - core_ml - Used for iOS mobile devices.\n\n**Returns**\n\n- (::String) --- The format in which the model must be exported. 
The available, and default,\n formats depend on the problem and model type (if given problem and type\n combination doesn't have a format listed, it means its models are not\n exportable):\n\n - For Image Classification mobile-low-latency-1, mobile-versatile-1,\n mobile-high-accuracy-1:\n \"tflite\" (default), \"edgetpu_tflite\", \"tf_saved_model\", \"tf_js\",\n \"docker\".\n\n - For Image Classification mobile-core-ml-low-latency-1,\n mobile-core-ml-versatile-1, mobile-core-ml-high-accuracy-1:\n \"core_ml\" (default).\n\n - For Image Object Detection mobile-low-latency-1, mobile-versatile-1,\n mobile-high-accuracy-1:\n \"tflite\", \"tf_saved_model\", \"tf_js\".\n Formats description:\n\n - tflite - Used for Android mobile devices.\n\n - edgetpu_tflite - Used for [Edge TPU](https://cloud.google.com/edge-tpu/)\n devices.\n\n - tf_saved_model - A tensorflow model in SavedModel format.\n\n - tf_js - A [TensorFlow.js](https://www.tensorflow.org/js) model that can\n be used in the browser and in Node.js using JavaScript.\n\n - docker - Used for Docker containers. Use the params field to customize\n the container. The container is verified to work correctly on\n ubuntu 16.04 operating system. See more at\n [containers\n quickstart](https://cloud.google.com/vision/automl/docs/containers-gcs-quickstart)\n\n - core_ml - Used for iOS mobile devices.\n\n### #params\n\n def params() -\u003e ::Google::Protobuf::Map{::String =\u003e ::String}\n\n**Returns**\n\n- (::Google::Protobuf::Map{::String =\\\u003e ::String}) --- Additional model-type and format specific parameters describing the\n requirements for the to be exported model files, any string must be up to\n 25000 characters long.\n\n - For `docker` format: `cpu_architecture` - (string) \"x86_64\" (default). `gpu_architecture` - (string) \"none\" (default), \"nvidia\".\n\n### #params=\n\n def params=(value) -\u003e ::Google::Protobuf::Map{::String =\u003e ::String}\n\n**Parameter**\n\n- **value** (::Google::Protobuf::Map{::String =\\\u003e ::String}) ---\n\n Additional model-type and format specific parameters describing the\n requirements for the to be exported model files, any string must be up to\n 25000 characters long.\n- For `docker` format: `cpu_architecture` - (string) \"x86_64\" (default). `gpu_architecture` - (string) \"none\" (default), \"nvidia\". \n**Returns**\n\n- (::Google::Protobuf::Map{::String =\\\u003e ::String}) --- Additional model-type and format specific parameters describing the\n requirements for the to be exported model files, any string must be up to\n 25000 characters long.\n\n - For `docker` format: `cpu_architecture` - (string) \"x86_64\" (default). `gpu_architecture` - (string) \"none\" (default), \"nvidia\"."]]