Utilizing a pretrained LLM

This page describes how to utilize a textual large language model (LLM).

New LLM textual features

While it is possible to get text embeddings by manually configuring a Vertex AI generative model, you may want coverage that is more specific to your retail use case. For this, you can use Vertex AI generative models pretrained on retail metadata to improve the performance of your recommendation models.
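
For reference, the following is a minimal sketch of the manual approach, calling a Vertex AI text embedding model directly from the Python SDK. The project ID, region, model name, and sample text are placeholder assumptions for illustration only; substitute values from your own environment.

  # Minimal sketch of manually getting text embeddings from a Vertex AI
  # embedding model via the Python SDK. The project ID, region, model name,
  # and sample text below are placeholder assumptions.
  import vertexai
  from vertexai.language_models import TextEmbeddingModel

  vertexai.init(project="your-project-id", location="us-central1")
  model = TextEmbeddingModel.from_pretrained("text-embedding-004")

  # Embed a piece of long-form retail metadata, such as a product description.
  texts = ["Lightweight waterproof hiking jacket with adjustable hood and zip pockets."]
  embeddings = model.get_embeddings(texts)

  for embedding in embeddings:
      print(len(embedding.values))  # dimensionality of the returned embedding vector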

The text embeddings are longer, more descriptive, and not repetitive, and they have multilingual interpretation capabilities. This feature is allowlisted; to enable it, contact support.

There is no additional charge for using the text embeddings; they are included in Vertex AI Search pricing.

The LLM-pretrained embeddings improve semantic understanding of long-form text in searches, such as product descriptions.
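
As a rough illustration of this kind of semantic matching, the sketch below ranks two hypothetical product descriptions against a long-form query by cosine similarity of their embeddings. The model name, project values, and sample texts are illustrative assumptions and are not part of the pretrained retail feature itself.

  # Illustrative sketch only: ranks hypothetical product descriptions against a
  # long-form query by cosine similarity of their embeddings.
  import numpy as np
  import vertexai
  from vertexai.language_models import TextEmbeddingModel

  vertexai.init(project="your-project-id", location="us-central1")  # placeholder values
  model = TextEmbeddingModel.from_pretrained("text-embedding-004")   # assumed model name

  query = "warm jacket I can wear while hiking in the rain"
  descriptions = [
      "Lightweight waterproof hiking jacket with adjustable hood and taped seams.",
      "Cotton summer t-shirt with a printed logo.",
  ]

  # Embed the query and the descriptions in a single call.
  vectors = [np.array(e.values) for e in model.get_embeddings([query] + descriptions)]
  query_vec, desc_vecs = vectors[0], vectors[1:]

  def cosine(a, b):
      # Cosine similarity: closer to 1.0 means more semantically similar.
      return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

  for text, vec in zip(descriptions, desc_vecs):
      print(f"{cosine(query_vec, vec):.3f}  {text}")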

Model compatibility

The LLM feature is compatible with all ML model types and objectives, including:

  • Others you may like (OYML)
  • Frequently bought together (FBT)
  • and more.