Gemini for AutoML text users

This page provides comparisons between AutoML text and Gemini to help AutoML text users understand how to use Gemini.

Check the comparisons that apply to your use case, and review the changes that are likely to affect your workflow.

General usage

These differences are applicable to all Gemini users.

Training data formats
AutoML text: You can use CSV or JSON Lines files, except for text entity extraction, to include inline text snippets or to reference documents of type TXT. Entity extraction supports only JSON Lines files.
Gemini: You can use only JSON Lines files, where each line represents a single training example (see the example after this list). You can download a sample dataset for fine-tuning Gemini models. Files must be stored in Cloud Storage.

Dataset annotation
AutoML text: Annotations are grouped together as an AnnotationSet object. You can use different annotation sets with the same dataset.
Gemini: Dataset annotations are not applicable with Gemini.

Dataset import
AutoML text: You specify ML use values in an optional column for CSV, in the same row as the data, or as a tag in JSON Lines in the same JSON object as the data. If you don't specify ML use values, your data is split automatically for training, testing, and validation. For sentiment analysis, CSV files must include the sentiment max value in the last column of each row.
Gemini: You provide one JSONL file for training and, optionally, a separate JSONL file for validation. The validation file should contain 10 to 256 examples.

Storage costs
AutoML text: When you create a dataset, your data is loaded into Cloud Storage in your project. You are charged for this storage.
Gemini: When you create a dataset, your data is loaded into Cloud Storage in your project. You are charged for this storage.

Data labeling
AutoML text: You provide labeling instructions by using a URL. Annotations are part of the Dataset object and can't be manipulated by using the API.
Gemini: Data labeling is not applicable with Gemini.

Model deployment
AutoML text: You create an Endpoint object, which provides resources for serving online predictions. You then deploy the model to the endpoint and request predictions by calling the predict() method.
Gemini: After fine-tuning, the model is stored in Vertex AI Model Registry and an endpoint is created automatically. You can request online predictions from the tuned model by using the Python SDK, the REST API, or the console: fetch the tuned endpoint and then call the generate_content() method.

Using project number or project ID
AutoML text: Both project-number and project-id work in Vertex AI.
Gemini: Gemini uses project-id.

Confidence scores
AutoML text: AutoML text supports confidence scores.
Gemini: Gemini doesn't support confidence scores.
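
To make the Gemini format concrete, here is a minimal sketch of how a training file might be assembled. The prompt and response pairs, file name, and bucket path are placeholders, and the contents/role/parts field layout is an assumption based on the Gemini API content format; check the current tuning dataset schema before relying on it.

```python
import json

# Hypothetical training examples. Each JSON Lines row pairs a user prompt
# with the response the tuned model should learn to produce, using the
# Gemini content format (a "contents" list with "role" and "parts").
examples = [
    {
        "contents": [
            {"role": "user", "parts": [{"text": "Classify the sentiment: 'The service was excellent.'"}]},
            {"role": "model", "parts": [{"text": "positive"}]},
        ]
    },
    {
        "contents": [
            {"role": "user", "parts": [{"text": "Classify the sentiment: 'My order arrived late and damaged.'"}]},
            {"role": "model", "parts": [{"text": "negative"}]},
        ]
    },
]

# Write one JSON object per line, then upload the file to Cloud Storage,
# for example: gsutil cp train.jsonl gs://YOUR_BUCKET/train.jsonl
with open("train.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```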

API users

For detailed information about the API, see the Vertex AI Generative AI Tuning API reference documentation.

Model creation
AutoML text: You create a TrainingPipeline object, which returns a training job.
Gemini: You create a supervised fine-tuning job, which returns the tuning job (see the first example after this list).

Using the client library
AutoML text: There are different API clients for each API resource.
Gemini: You can create a supervised fine-tuning job for Gemini by using the Python SDK, the REST API, or the console.

Requesting predictions
AutoML text: You request predictions by calling the predict() method on the Endpoint resource.
Gemini: You request predictions by first fetching the tuned endpoint and then calling the generate_content() method (see the second example after this list).

Online prediction endpoint
AutoML text: REGION-aiplatform.googleapis.com, where REGION is the region that your prediction model is in. For example: us-central1-aiplatform.googleapis.com
Gemini: TUNING_JOB_REGION-aiplatform.googleapis.com, where TUNING_JOB_REGION is the region where your tuning job runs. For example: us-central1-aiplatform.googleapis.com

Schema and definition files
AutoML text: Some request and response fields are defined in schema and definition files, and data formats are defined by using predefined schema files. This provides flexibility for the API and data formats.
Gemini: The request body, model parameters, and response body are the same as for the untuned Gemini models. See sample requests.

Hostname
AutoML text: aiplatform.googleapis.com
Gemini: aiplatform.googleapis.com

Regional hostname
AutoML text: Required. For example: us-central1-aiplatform.googleapis.com
Gemini: Required. For example: us-central1-aiplatform.googleapis.com
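
As a rough illustration of the workflow, the following sketch starts a supervised fine-tuning job with the Vertex AI Python SDK. The project ID, region, Cloud Storage paths, and base model version are placeholder assumptions; adjust them for your environment and confirm the supported base models in the tuning documentation.

```python
import time

import vertexai
from vertexai.tuning import sft

# Placeholder project and region. The region determines the regional
# endpoint, for example us-central1-aiplatform.googleapis.com.
vertexai.init(project="YOUR_PROJECT_ID", location="us-central1")

# Start a supervised fine-tuning job from JSONL files in Cloud Storage.
# The validation dataset is optional.
tuning_job = sft.train(
    source_model="gemini-1.0-pro-002",  # placeholder base model version
    train_dataset="gs://YOUR_BUCKET/train.jsonl",
    validation_dataset="gs://YOUR_BUCKET/validation.jsonl",
    tuned_model_display_name="my-tuned-gemini",
)

# Poll until the tuning job finishes, then print the tuned model endpoint.
while not tuning_job.has_ended:
    time.sleep(60)
    tuning_job.refresh()

print(tuning_job.tuned_model_endpoint_name)
```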

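A companion sketch requests predictions from the tuned model: fetch the tuned endpoint created by the job above and call generate_content() on it. The endpoint resource name shown here is a placeholder; in practice you would use the value of tuning_job.tuned_model_endpoint_name.

```python
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="YOUR_PROJECT_ID", location="us-central1")

# Placeholder endpoint resource name; use the value returned by the
# tuning job (tuning_job.tuned_model_endpoint_name).
tuned_endpoint = (
    "projects/YOUR_PROJECT_NUMBER/locations/us-central1/endpoints/YOUR_ENDPOINT_ID"
)

# The request and response work the same way as for an untuned Gemini model.
model = GenerativeModel(tuned_endpoint)
response = model.generate_content("Classify the sentiment: 'The service was excellent.'")
print(response.text)
```
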
What's next