Implement Vertex AI Search for retail

You can implement Vertex AI Search for retail for your ecommerce application.

When you use recommendations or search, you ingest user event and catalog data to serve predictions or search results on your site.

The same data is used for both recommendations and search, so if you use both, you don't need to ingest the same data twice.

If you use recommendation models, User event data requirements lists additional requirements depending on your model type and optimization objective. These requirements help Vertex AI Search for retail generate quality results.

The average integration time is on the order of weeks. For search, the actual duration depends heavily on the quality and quantity of the data you ingest.

If you are using Google Tag Manager or Google Merchant Center, you can implement Vertex AI Search for retail with Google tools.

You can get personalized results for your website whether or not you are using additional Google tools. If you are not, see Implement Vertex AI Search for retail without Google tools.

Take implementation steps

If you're using Tag Manager and Merchant Center, follow the steps in the With Google tools tab to integrate Vertex AI Search for retail into your website. If you're not using Tag Manager and Merchant Center, follow the steps in the Without Google tools tab to integrate Vertex AI Search for retail into your website.

With Google tools

Step Description
1. Set up a Google Cloud project You can use an existing Google Cloud project if you have one already. Otherwise, follow this guide to set up a new project.
2a. Import your product catalog using Merchant Center

You can also import your product catalog directly, but linking to Merchant Center reduces the steps needed to import your catalog. This turnkey solution works well with Google Ads, replicates quickly into Vertex AI Search for retail, and can be up and running in a few clicks. However, it is not ideal if you want to use facets.

Note that Merchant Center does not support the collections product type. Before importing, make sure to review Merchant Center limitations to check if it meets your catalog needs.

2b. Configure Tag Manager to record user events User events track user actions such as clicking on a product, adding an item to a shopping cart, or purchasing an item. You can start recording user events in parallel with the catalog import. After the catalog import is complete, rejoin any events that were uploaded before the import completed. If you are already using Google Tag Manager, this is the recommended method because it integrates directly with Vertex AI Search for retail.
3. Import historical user events

Providing historical user event data lets you start model training without having to wait months for enough user event data to be collected from your site. To learn how to import user data, refer to the Import user events documentation on importing Google Analytics 360 and GA4 events from BigQuery. Your models need sufficient training data before they can provide accurate predictions. To learn how much data to use, understand the requirements for each model.

Without Google tools

Step Description
1. Set up a Google Cloud project

Create a Google Cloud project and create authentication credentials including an API key and an OAuth token (either using a user account or a service account) to access the project.

2a. Import your product catalog

You can add items to your product catalog individually by using the Products.create method. For large product catalogs, we recommend that you add items in bulk using the Products.import method. Bulk import provides more configurability and is a good option for businesses that want to run a pilot.
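As a sketch of what a bulk import might look like, the snippet below builds a JSON request body for a Products.import call using an inline product source. The project path, SKU, and product values are hypothetical placeholders; check the Retail API reference for the authoritative request schema before sending anything.

```python
import json

# Hypothetical branch path; replace PROJECT_ID with your own project.
PARENT = ("projects/PROJECT_ID/locations/global/catalogs/"
          "default_catalog/branches/default_branch")

def build_import_request(products):
    """Build the JSON body for a products.import call (inline source)."""
    return {
        "inputConfig": {
            "productInlineSource": {"products": products}
        }
    }

# A minimal product entry; field names follow the public Product schema.
product = {
    "id": "sku-12345",                       # your catalog item ID
    "title": "Organic cotton t-shirt",
    "categories": ["Apparel > Shirts"],
    "priceInfo": {"price": 19.99, "currencyCode": "USD"},
    "availability": "IN_STOCK",
}

body = build_import_request([product])
print(json.dumps(body, indent=2))
```

You would POST this body to the `products:import` endpoint under `PARENT`; for one-off additions, the same product object can be sent to Products.create instead.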

2b. Record user events

User events track user actions such as clicking on a product, adding an item to a shopping cart, or purchasing an item. User event data is needed to generate personalized results. User events need to be ingested in real time to accurately reflect the behavior of your users.

You can start recording user events in parallel with the catalog import. After the catalog import is complete, rejoin any events that were uploaded before the import completed. You will need to write a tracking pixel.
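To illustrate the shape of a single event, here is a minimal sketch of a detail-page-view event body such as your tracking pixel might send to the userEvents.write method. The visitor and product IDs are hypothetical; consult the user events reference for the full set of required fields per event type.

```python
import json
from datetime import datetime, timezone

def detail_page_view(visitor_id, product_id):
    """Build a detail-page-view user event body (sketch)."""
    return {
        "eventType": "detail-page-view",
        "visitorId": visitor_id,   # stable per-browser identifier
        # RFC 3339 timestamp; real-time events should use the current time.
        "eventTime": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "productDetails": [{"product": {"id": product_id}}],
    }

event = detail_page_view("visitor-abc", "sku-12345")
print(json.dumps(event))
```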

3. Import historical user events

Providing historical user event data lets you start model training without having to wait months for enough user event data to be collected from your site. To learn how to import user data, refer to the Import user events documentation on importing events from Cloud Storage or BigQuery, or importing events inline using the userEvents.import method. Your models need sufficient training data before they can provide accurate predictions. Then, learn more about the import requirements for each model type.
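For the inline option, a userEvents.import request body might look like the sketch below, which wraps historical events in an inline source. The event values are hypothetical placeholders; verify the request schema against the Retail API reference.

```python
import json

def build_event_import_request(events):
    """Build the JSON body for a userEvents.import call (inline source)."""
    return {"inputConfig": {"userEventInlineSource": {"userEvents": events}}}

# A historical purchase event with a past timestamp (RFC 3339).
historical_event = {
    "eventType": "purchase-complete",
    "visitorId": "visitor-abc",
    "eventTime": "2024-01-15T10:30:00Z",
    "productDetails": [{"product": {"id": "sku-12345"}, "quantity": 1}],
    "purchaseTransaction": {"revenue": 19.99, "currencyCode": "USD"},
}

body = build_event_import_request([historical_event])
print(json.dumps(body, indent=2))
```

For large backfills, importing from Cloud Storage or BigQuery is usually more practical than the inline source shown here.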

Follow these steps for both onboarding pathways

Step Description
4. Set up monitoring and alerts

Set up monitoring and alerts.

5. Create your serving config, model, and controls

Decide if you want to use recommendations, search, or both. Then, get familiar with formats for user events. A serving config is an entity that associates a model and, optionally, controls. A serving config is used like a container when generating your search or recommendation results.

If you're using recommendations, you can create a model and your controls at the same time that you create a serving config, or create them separately. Choose a model type based on the location of your serving config and its objectives. Review the available recommendation types, optimization objectives, and other model tuning options to determine the best options for your business objectives. (For search serving configs, a default model is automatically created.)
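As an illustration, a recommendations serving config body might look like the following sketch. The display name and model ID are hypothetical; check the serving config reference for the full field list and valid values.

```python
# Sketch of a serving config body for a recommendations use case.
# "my_recommendation_model" is a hypothetical model ID; use the ID of a
# model you have created in your catalog.
serving_config = {
    "displayName": "Homepage recommendations",
    "modelId": "my_recommendation_model",
    "solutionTypes": ["SOLUTION_TYPE_RECOMMENDATION"],
}

# Serving configs live under:
# projects/PROJECT_ID/locations/global/catalogs/default_catalog/servingConfigs
print(serving_config["displayName"])
```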

6. Allow time for model training and tuning

Serving configs are test versions of configurations. They are used like a workspace to test the differences between optimization objectives or controls. You can stage one serving config to test it against the production one, for example, and point the application to one or the other for troubleshooting.

If you are using search, training and tuning is automatic, provided you have met the minimum data requirements. Refer to the requirements for user events for each model and each product to determine how many and what type of user events to train and tune the models on.

If you are using recommendations, creating a model initiates training and tuning. Initial model training and tuning takes 2-5 days to complete, but can take longer for large datasets.

7. Preview and test your serving config

After your model has been activated, preview and test your serving config's recommendations or search results to ensure your setup is functioning as expected. You can create new controls or use existing controls to add new serving configs and point the application to the test version to compare performance. You can exclude or include rules and split test the production versus another test serving config. You can then simulate searches using these variations in the Evaluations page of the console.

8. Set up an A/B experiment (Optional)

You can use an A/B experiment to compare the performance of your website with and without Vertex AI Search for retail.
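A common way to run such an experiment is to assign each visitor deterministically to a control or treatment group so they always see the same experience. The sketch below is one generic approach (not an official Vertex AI Search for retail feature): hash the visitor ID into a uniform bucket and compare it against the treatment fraction.

```python
import hashlib

def assign_arm(visitor_id: str, treatment_fraction: float = 0.5) -> str:
    """Deterministically assign a visitor to 'control' or 'treatment'.

    The same visitor_id always maps to the same arm, so users get a
    consistent experience across sessions.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64  # uniform in [0, 1)
    return "treatment" if bucket < treatment_fraction else "control"

# Assignment is stable for a given visitor.
assert assign_arm("visitor-abc") == assign_arm("visitor-abc")
```

Visitors in the treatment arm would be served results from your Vertex AI Search for retail serving config, while the control arm keeps your existing experience.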

9. Evaluate your configuration

Assess the metrics provided by Vertex AI Search for retail to help you determine how your business is affected by incorporating it.

View the metrics for your project on the Analytics page of the Search for Retail console.

Terms of Service

Product usage is governed by Google Cloud's Terms and Conditions or the relevant offline variant. The Google Cloud Privacy Notice explains how we collect and process your personal information relating to the use of Google Cloud and other Google Cloud services.

For quality assurance, a small sample of search queries and search results from the logs, which include customer data, is sent for human rating to third-party vendors disclosed as Third-Party Subprocessors for search. Additional tests, using search queries and search results from Google Search logs (publicly collected datasets), are sent for human rating to different third-party vendors for quality assurance. The Google Search logs are not categorized as customer data.