Install and configure the Vertex AI SDK for ABAP

This document describes how to install and configure the Vertex AI SDK for ABAP in your SAP environment.

Installation

When you install version 1.8 or later of the on-premises or any cloud edition of ABAP SDK for Google Cloud, the Vertex AI SDK for ABAP is installed for you. For information about the installation steps, see Install and configure the on-premises or any cloud edition of ABAP SDK for Google Cloud.

If you're already using version 1.7 or earlier of the on-premises or any cloud edition of ABAP SDK for Google Cloud, then update your SDK to the latest version to get the Vertex AI SDK for ABAP. For more information, see Update ABAP SDK for Google Cloud.

We understand that access to Vertex AI and cloud resources might be limited for some developers. To enable prototyping and experimentation with minimal setup, see Quick prototyping with Gemini.

Enable the Vertex AI API

Enable the Vertex AI API in your Google Cloud project.

For information about how to enable Google Cloud APIs, see Enabling APIs.

Authentication

After you set up authentication to access Google Cloud APIs in the on-premises or any cloud edition of ABAP SDK for Google Cloud, the Vertex AI SDK for ABAP uses the same authentication method to access the Vertex AI API. For information about how to set up authentication in the on-premises or any cloud edition of ABAP SDK for Google Cloud, see Authentication overview.

Make a note of the client key that you've created as part of the authentication setup. You use this client key when configuring AI model generation parameters and search parameters.

IAM permissions

Ensure that the dedicated service account for API access that you've configured in the client key table has access to the Vertex AI resources.

Vertex AI

To use Vertex AI resources, you must grant the Vertex AI User (roles/aiplatform.user) role to the dedicated service account that you've configured for access to the Vertex AI API.

If the service account needs permissions to create, modify, or deploy artifacts, then grant the appropriate Vertex AI IAM permissions.

Vertex AI Feature Store

To use the Vertex AI Feature Store, you must grant the following roles to the service account:

AI capability Required IAM roles
Vertex AI Feature Store

Configure the model generation parameters

Large language models (LLMs) are deep learning models trained on massive amounts of text data. A model includes parameter values that control how the model generates a response. The model can generate different results for different parameter values.

To define the generation parameters for a model, the Vertex AI SDK for ABAP uses the table /GOOG/AI_CONFIG.

To update the table /GOOG/AI_CONFIG, perform the following steps:

  1. In SAP GUI, execute the transaction code /GOOG/SDK_IMG.

    Alternatively, execute the transaction code SPRO, and then click SAP Reference IMG.

  2. Click ABAP SDK for Google Cloud > Basic Settings > Vertex AI SDK: Configure Model Generation Parameters.

  3. Click New Entries.

  4. Enter values for the following fields:

    Model Key (String): A unique name that you specify to identify the model configuration, such as Gemini. You use this model key when you instantiate the generative model class or the embeddings class, to specify which generation configuration to apply.

    Model ID (String): The model ID of the LLM, such as gemini-1.5-flash-001. For information about Vertex AI model versions, see Model versions and lifecycle.

    Google Cloud Key Name (String): The client key that you've configured for authentication to Google Cloud during the authentication setup.

    Google Cloud Region Location ID (String): The location ID of the Google Cloud region where the Vertex AI features that you want to use are available. Typically, you use the region closest to your physical location or to the physical location of your intended users. For more information, see Vertex AI locations.

    Publisher ID of the LLM (String): Optional. The publisher of the LLM, such as google.

    Response MIME type (String): Optional. The MIME type of the generated candidate text. The model needs to be prompted to output the appropriate response type; otherwise, the behavior is undefined. Supported MIME types:
    • text/plain (default): Text output.
    • application/json: JSON response in the candidates.

    Randomness temperature (String): Optional. Controls the randomness of predictions. Range: [0.0, 1.0]. For more information, see Temperature.

    Top-K Sampling (Float): Optional. Top-K changes how the model selects tokens for output. Specify a lower value for less random responses and a higher value for more random responses. Range: [1, 40]. For more information, see Top-K.

    Top-P Sampling (Float): Optional. Top-P changes how the model selects tokens for output. Specify a lower value for less random responses and a higher value for more random responses. Range: [0.0, 1.0]. For more information, see Top-P.

    Maximum number of output tokens per msg (Integer): Optional. The maximum number of tokens that can be generated in the response. A token is approximately four characters; 100 tokens correspond to roughly 60-80 words. Specify a lower value for shorter responses and a higher value for potentially longer responses.

    Positive Penalties (Float): Optional. Positive values penalize tokens that have already appeared in the generated text, increasing the probability of generating more diverse topics. Range: [-2.0, 2.0]

    Frequency Penalties (Float): Optional. Positive values penalize tokens that repeatedly appear in the generated text, decreasing the probability of repeating the same content. Range: [-2.0, 2.0]

    If you don't provide a value for an optional parameter, then the SDK uses the default value of the parameter specific to the model version configured in Model ID.

  5. Save the new entry.
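
Once the entry is saved, the model key is how your ABAP code picks up this configuration. The following is a minimal sketch of that flow, assuming the generative model class /GOOG/CL_GENERATIVE_MODEL and its GENERATE_CONTENT and GET_TEXT methods as shown in the SDK's published code samples; verify the class and method names against the SDK reference in your system, and treat the prompt text as an illustration only.

    TRY.
        " Instantiate the generative model class with the model key that you
        " configured in table /GOOG/AI_CONFIG, for example 'Gemini'.
        DATA(lo_model) = NEW /goog/cl_generative_model( iv_model_key = 'Gemini' ).

        " Send a prompt. The generation parameters maintained for the model key
        " (temperature, Top-K, Top-P, maximum output tokens, and so on) are
        " applied to this call.
        DATA(lv_response) = lo_model->generate_content(
                                'Summarize the key features of Vertex AI.'
                            )->get_text( ).
        cl_demo_output=>display( lv_response ).

      CATCH /goog/cx_sdk INTO DATA(lo_exception).
        " Handle SDK errors, for example a missing or misconfigured model key.
        cl_demo_output=>display( lo_exception->get_text( ) ).
    ENDTRY.

The embeddings class mentioned in the Model Key description consumes the same model key in the same way.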

Configure the Vector Search parameters

To define Vector Search configurations, the Vertex AI SDK for ABAP uses the table /GOOG/SEARCHCONF.

To update the table /GOOG/SEARCHCONF, perform the following steps:

  1. In SAP GUI, execute the transaction code /GOOG/SDK_IMG.

    Alternatively, execute the transaction code SPRO, and then click SAP Reference IMG.

  2. Click ABAP SDK for Google Cloud > Basic Settings > Vertex AI SDK: Configure Vector Search Parameters.

  3. Click New Entries.

  4. Enter values for the following fields:

    Search Key (String): A unique name that you specify to identify the search configuration.

    Google Cloud Key Name (String): The client key that you've configured for authentication to Google Cloud during the authentication setup.

    Google Cloud Region Location ID (String): The location ID of the Google Cloud region where the Vertex AI features that you want to use are available. Typically, you use the region closest to your physical location or to the physical location of your intended users. For more information, see Vertex AI locations.

    Deployment ID of Vector Index (String): The deployment ID of the index. When you deploy an index to an endpoint, you assign it a unique deployment ID. For information about index deployment, see Deploy a vector index to an index endpoint.

    Vector Index Endpoint ID (String): The ID of the index endpoint to which the index is deployed. For information about index endpoints, see Create a vector index endpoint.

  5. Save the new entry.
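
After you save the entry, the search key is what connects your ABAP code to the deployed index and its endpoint. The sketch below is illustrative only: it assumes a vector search class named /GOOG/CL_VECTOR_SEARCH that accepts the search key in its constructor, following the SDK's naming pattern; check the SDK reference in your system for the actual class and method names before using it.

    TRY.
        " Instantiate the vector search class with the search key that you
        " configured in table /GOOG/SEARCHCONF. The index deployment ID and
        " index endpoint ID are resolved from that configuration.
        " Class name assumed from the SDK's naming pattern; verify in your system.
        DATA(lo_vector_search) = NEW /goog/cl_vector_search( iv_search_key = 'DEMO_SEARCH' ).

        " The query methods of this class (for example, a nearest-neighbor
        " search) then run against the deployed index; see the SDK reference
        " for the exact method signatures.

      CATCH /goog/cx_sdk INTO DATA(lo_exception).
        " Handle SDK errors, for example a missing search key configuration.
        cl_demo_output=>display( lo_exception->get_text( ) ).
    ENDTRY.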

What's next