Check grounding with RAG

As part of your Retrieval Augmented Generation (RAG) experience in Vertex AI Agent Builder, you can check grounding to determine how grounded a piece of text (called an answer candidate) is in a given set of reference texts (called facts).

The check grounding API returns an overall support score from 0 to 1, which indicates how closely the answer candidate agrees with the given facts. The response also includes citations to the facts that support each claim in the answer candidate.

Additionally, you can enable a claim-level support score, as an Experimental feature, to get a support score that indicates how grounded each claim in the answer candidate is.

Perfect grounding requires that every claim in the answer candidate be supported by one or more of the given facts; in other words, each claim must be wholly entailed by the facts. If a claim is only partially entailed, it isn't considered grounded. For example, the claim "Google was founded by Larry Page and Sergey Brin in 1975" is only partially correct: the names of the founders are correct, but the date is wrong. As such, the whole claim is considered ungrounded. In this version of the check grounding API, a sentence is considered a single claim.

You can also enable anti-citations, as an Experimental feature, to get a contradiction score, which indicates how much the answer candidate contradicts the given facts. The response also includes anti-citations to the contradicting facts for each claim.

You can use the check grounding API to check any piece of text, whether it's a human-written blurb or a machine-generated response. A typical use case is to check an LLM-generated response against a given set of facts. The check grounding API is designed to be fast, with latency of less than 500 ms. This speed allows chatbots to call the check grounding API during each inference without incurring a significant slowdown. The check grounding API can also provide references to support its findings, so that users can tell which parts of the generated response are reliable. The API also provides a support score to indicate the overall accuracy of the response. By setting a citation threshold, chatbots can filter out responses at inference time that are likely to contain hallucinated claims.
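
For example, a chatbot can gate each generated response on the overall support score before returning it to the user. The following minimal sketch illustrates that pattern with the Python client shown later on this page; the MIN_SUPPORT_SCORE cutoff and the fallback message are hypothetical choices, and the client, grounding config, answer, and facts are assumed to be prepared as in the Python sample below.

from google.cloud import discoveryengine_v1 as discoveryengine

# Hypothetical cutoff; tune it for your application.
MIN_SUPPORT_SCORE = 0.6

def grounded_or_fallback(client, grounding_config, answer, facts):
    """Return the answer only if its overall support score clears the cutoff."""
    response = client.check_grounding(
        request=discoveryengine.CheckGroundingRequest(
            grounding_config=grounding_config,
            answer_candidate=answer,
            facts=facts,
        )
    )
    if response.support_score >= MIN_SUPPORT_SCORE:
        return answer
    # The answer likely contains ungrounded claims; don't surface it.
    return "Sorry, I couldn't find a well-supported answer to that."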

This page describes how to check grounding using the check grounding API.

Before you begin

The check grounding API is generally available.

Additionally, there are two features that are Experimental:

  • The anti-citations feature that generates the contradiction score.
  • The claim-level score feature that returns per-claim support scores.

To try the Experimental features, contact your Google account team and ask to be added to the allowlist.

Terms defined and explained

Before you use the check grounding API, it helps to understand the inputs and outputs, and how to structure your grounding facts for best results.

Input data

The check grounding API requires the following inputs in the request.

  • Answer candidate: An answer candidate can be any piece of text whose grounding you want to check. For example, in the context of Vertex AI Search, the answer candidate might be the generated search summary that answers a query. The API would then determine how grounded the summary is in the input facts. An answer candidate can have a maximum length of 4096 tokens, where a token is defined as a word in a sentence or a period (a punctuation mark used to end the sentence). For example, the sentence "They wore off-the-rack clothes in 2024." is seven tokens long, including six words and a period.

  • Facts: A set of text segments to be used as references for grounding. A set of metadata attributes (key-value pairs) can be supplied with each text segment. For example, "Author" and "Title" are typical attribute keys.

    The service supports up to 200 facts, each with a maximum length of 10,000 characters.

    Google recommends against supplying one very large fact that contains all of the information. Instead, you can get better results by breaking large facts into smaller facts and supplying appropriate attributes for the smaller facts. For example, you can break up a large fact by title, author, or URL, and supply this information in attributes, as shown in the sketch after this list.

  • Citation threshold: A float value from 0 to 1 that controls the confidence for the citations that support the answer candidate. A higher threshold imposes stricter confidence. Therefore, a higher threshold yields fewer but stronger citations.
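
For example, the following sketch (referenced from the facts guidance above) builds one smaller fact per section of a hypothetical pre-split source document, with the section title and URL carried as attributes. The sections structure and the attribute keys are illustrative, not part of the API.

from google.cloud import discoveryengine_v1 as discoveryengine

# Hypothetical pre-split source document: one entry per section.
sections = [
    {"title": "Production", "url": "https://example.com/titanic#production", "text": "..."},
    {"title": "Reception", "url": "https://example.com/titanic#reception", "text": "..."},
]

# One smaller fact per section, each carrying identifying attributes,
# instead of a single fact that contains the whole document.
facts = [
    discoveryengine.GroundingFact(
        fact_text=section["text"],
        attributes={"title": section["title"], "url": section["url"]},
    )
    for section in sections
]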

Output data

The check grounding API returns the following for an answer candidate:

  • Support score: The support score is a number from 0 to 1 that indicates how grounded an answer candidate is in the provided set of facts. It loosely approximates the fraction of claims in the answer candidate that were found to be grounded in one or more of the given facts.

  • Cited chunks: Cited chunks are portions of the input facts that support the answer candidate.

  • Claims and citations: The claims and citations connect a claim (typically a sentence) of the answer candidate to one or more of the cited chunks that corroborate the claim.

    When the claim-level score is enabled, with each claim, a support score is returned as a number from 0 to 1 that indicates how grounded the claim is in the provided set of facts. For more information, see Obtain claim-level scores for an answer candidate.

  • Grounding check required: With each claim, a grounding-check-required boolean is returned. When this is false, the system deems that the claim doesn't require grounding and, therefore, doesn't return citations or anti-citations for it. For example, a sentence like "Here is what I found." isn't a fact by itself and, thus, doesn't require a grounding check.

    When the grounding-check-required boolean is true, a grounding check was performed, and support scores, citations, and anti-citations, if any, are returned.

Obtain a support score for an answer candidate

To find out how grounded an answer candidate is in a set of facts, follow these steps:

  1. Prepare your set of facts. For more information and examples, see Terms defined and explained.

  2. Call the check method using the following code:

REST

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-H "X-Goog-User-Project: PROJECT_ID" \
"https://discoveryengine.googleapis.com/v1/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
-d '{
  "answerCandidate": "CANDIDATE",
  "facts": [
  {
    "factText": "TEXT_0",
    "attributes": {"ATTRIBUTE_A": "VALUE_A0","ATTRIBUTE_B": "VALUE_B0"}
  },
  {
    "factText": "TEXT_1",
    "attributes": {"ATTRIBUTE_A": "VALUE_A1","ATTRIBUTE_B": "VALUE_B1"}
  },
  {
    "factText": "TEXT_2",
    "attributes": {"ATTRIBUTE_A": "VALUE_A2","ATTRIBUTE_B": "VALUE_B2"}
  }
  ],
  "groundingSpec": {
    "citationThreshold": "CITATION_THRESHOLD"
  }
}'

Replace the following:

  • PROJECT_ID: the project number or ID of your Google Cloud project.

  • CANDIDATE: the answer candidate string for which you want to get a support score—for example, Titanic was directed by James Cameron. It was released in 1997. An answer candidate can have a maximum length of 4096 tokens, where a token is defined as a word in a sentence or a period (a punctuation mark used to end the sentence). For example, the sentence "They wore off-the-rack clothes in 2024." is seven tokens long, including six words and a period.

  • TEXT: the text segment to be used for grounding—for example, Titanic is a 1997 American epic... Academy Awards. (See the full text in Examples of facts.)

  • ATTRIBUTE: the name of a metadata attribute associated with the fact—for example, author or title. This is a user-defined label to add more information to the fact text. For example, if the fact text Toronto is the capital of Ontario has an author attribute with its value as Wikipedia, then the following claims are considered grounded in the fact:

    • Wikipedia cites that Toronto is the capital of Ontario
    • Toronto is the capital of Ontario

      However, the claim Government of Ontario states that Toronto is the capital of Ontario is not as well-grounded as the first two claims.

  • VALUE: the value for the attribute—for example, Simple Wikipedia or Titanic (1997 film).

  • CITATION_THRESHOLD: a float value from 0 through 1 that determines whether a fact must be cited for a claim in the answer candidate. A higher threshold leads to fewer but stronger citations, and a lower threshold leads to more but weaker citations. If unset, the default threshold value is 0.6.

Python

For more information, see the Vertex AI Agent Builder Python API reference documentation.

To authenticate to Vertex AI Agent Builder, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

from google.cloud import discoveryengine_v1 as discoveryengine

# TODO(developer): Uncomment these variables before running the sample.
# project_id = "YOUR_PROJECT_ID"

client = discoveryengine.GroundedGenerationServiceClient()

# The full resource name of the grounding config.
# Format: projects/{project_id}/locations/{location}/groundingConfigs/default_grounding_config
grounding_config = client.grounding_config_path(
    project=project_id,
    location="global",
    grounding_config="default_grounding_config",
)

request = discoveryengine.CheckGroundingRequest(
    grounding_config=grounding_config,
    answer_candidate="Titanic was directed by James Cameron. It was released in 1997.",
    facts=[
        discoveryengine.GroundingFact(
            fact_text=(
                "Titanic is a 1997 American epic romantic disaster movie. It was directed, written,"
                " and co-produced by James Cameron. The movie is about the 1912 sinking of the"
                " RMS Titanic. It stars Kate Winslet and Leonardo DiCaprio. The movie was released"
                " on December 19, 1997. It received positive critical reviews. The movie won 11 Academy"
                " Awards, and was nominated for fourteen total Academy Awards."
            ),
            attributes={"author": "Simple Wikipedia"},
        ),
        discoveryengine.GroundingFact(
            fact_text=(
                'James Cameron\'s "Titanic" is an epic, action-packed romance'
                " set against the ill-fated maiden voyage of the R.M.S. Titanic;"
                " the pride and joy of the White Star Line and, at the time,"
                " the largest moving object ever built. "
                'She was the most luxurious liner of her era -- the "ship of dreams" -- '
                "which ultimately carried over 1,500 people to their death in the "
                "ice cold waters of the North Atlantic in the early hours of April 15, 1912."
            ),
            attributes={"author": "Simple Wikipedia"},
        ),
    ],
    grounding_spec=discoveryengine.CheckGroundingSpec(citation_threshold=0.6),
)

response = client.check_grounding(request=request)

# Handle the response
print(response)
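
Rather than printing the raw response, you can iterate over its fields. The following is a minimal sketch, assuming the response shape described in Terms defined and explained: an overall support_score, a list of cited_chunks, and a list of claims.

# Overall support score for the whole answer candidate.
print(f"Support score: {response.support_score:.2f}")

# Portions of the input facts that were cited.
for i, chunk in enumerate(response.cited_chunks):
    print(f"Cited chunk {i}: {chunk.chunk_text}")

# Claims, with citation indices that point into the cited chunks above.
for claim in response.claims:
    if not claim.grounding_check_required:
        print(f"No grounding check required: {claim.claim_text}")
    else:
        print(f"{claim.claim_text} -> citations: {list(claim.citation_indices)}")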

Examples of facts

The following are a couple of examples of facts and their attributes. These examples help you understand the grounding response and the format of the curl command.

  • Fact 0

    • Text: "Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. The movie is about the 1912 sinking of the RMS Titanic. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards."

    • Attributes: {"Author": "Simple Wikipedia"}

  • Fact 1

    • Text: "James Cameron's "Titanic" is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era -- the "ship of dreams" -- which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912."

    • Attributes: {"Author": "Rotten Tomatoes"}

Example request

After preparing the facts, you can send the following request, replacing the CANDIDATE field with different strings whose grounding you want to check.

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://discoveryengine.googleapis.com/v1/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
    -d '{
    "answerCandidate": "CANDIDATE",
    "facts": [
     {
      "factText": "Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. The movie is about the 1912 sinking of the RMS Titanic. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.",
      "attributes": {"author":"Simple Wikipedia"}
     },
     {
      "factText": "James Cameron's \"Titanic\" is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era -- the \"ship of dreams\" -- which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912.",
      "attributes": {"author":"Simple Wikipedia"}
     }
    ],
    "groundingSpec": {
      "citationThreshold": "0.6"
    }
    }'

Examples of answer candidates and grounding responses

The following examples show different answer candidates and the check grounding responses that you get when you send the example request with the example facts.

Answer candidate: Here is what I found. Titanic was directed by James Cameron.

Check grounding response:

Support score: 0.99

Cited chunks:
  1. [From FACT 0]....Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.....
Claims and citations:
  • 0. Here is what I found.
    Grounding check required: false
  • 1. Titanic was directed by James Cameron.
    Citations: [0]
    Grounding check required: true

Answer candidate: Titanic was directed by James Cameron. It was released in 1997.

Check grounding response:

Support score: 0.99

Cited chunks:
  1. [From FACT 0]....Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.....
  2. [From FACT 1]...Titanic is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912....
Claims and citations:
  • 0. Titanic was directed by James Cameron.
    Citations: [0]
    Grounding check required: true
  • 1. It was released in 1997.
    Citations: [1]
    Grounding check required: true

Answer candidate: Titanic was directed by James Cameron. It was based on the sinking of the RMS Titanic that led to the death of 1500 people.

Check grounding response:

Support score: 0.95

Cited chunks:
  1. [From FACT 0]....Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.....
  2. [From FACT 1]...Titanic is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912....
Claims and citations:
  • 0. Titanic was directed by James Cameron.
    Citations: [0]
    Grounding check required: true
  • 1. It was based on the sinking of the RMS Titanic that led to the death of 1500 people.
    Citations: [1]
    Grounding check required: true

Answer candidate: Titanic was directed by James Cameron. It starred Brad Pitt and Kate Winslet

Check grounding response:

Support score: 0.54

Cited chunks:
  1. [From FACT 0]....Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.....
  2. [From FACT 1]...Titanic is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912....
Claims and citations:
  • 0. Titanic was directed by James Cameron.
    Citations: [0]
    Grounding check required: true
  • 1. It starred Brad Pitt and Kate Winslet
    Citations: []
    Grounding check required: true
Note: Even though Kate Winslet starred in the movie, the claim "It starred Brad Pitt and Kate Winslet" is not wholly true, so it gets no citations. In this case, you can call the method with anti-citations enabled to get a contradiction score. For more information, see Obtain a contradiction score for an answer candidate.

Obtain a contradiction score for an answer candidate

Along with the support score, you can also obtain a contradiction score. The contradiction score loosely approximates the fraction of claims that contradict the provided facts.

To try this Experimental feature, contact your Google account team and ask to be added to the allowlist.

Get a contradiction score

To obtain the contradiction score, follow these steps:

  1. Prepare your set of facts. For more information and examples, see Terms defined and explained.

  2. Call the check method, using the following curl command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://discoveryengine.googleapis.com/v1alpha/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
    -d '{
    "answerCandidate": "CANDIDATE",
    "facts": [
     {
      "factText": "TEXT_0",
      "attributes": {"ATTRIBUTE_A": "VALUE_A0","ATTRIBUTE_B": "VALUE_B0"}
     },
     {
      "factText": "TEXT_1",
      "attributes": {"ATTRIBUTE_A": "VALUE_A1","ATTRIBUTE_B": "VALUE_B1"}
     },
     {
      "factText": "TEXT_2",
      "attributes": {"ATTRIBUTE_A": "VALUE_A2","ATTRIBUTE_B": "VALUE_B2"}
     }
    ],
    "groundingSpec": {
      "citationThreshold": "CITATION_THRESHOLD",
      "enableAntiCitations": "ENABLE_ANTI_CITATION",
      "antiCitationThreshold": "ANTI_CITATION_THRESHOLD",
    }
    }'
    

    Replace the following:

    • PROJECT_ID: the project number or ID of your Google Cloud project.

    • CANDIDATE: the answer candidate string for which you want to get a support score—for example, Titanic was directed by James Cameron. It was released in 1997. An answer candidate can have a maximum length of 4096 tokens, where a token is defined as a word in a sentence or a period (a punctuation mark used to end the sentence). For example, the sentence "They wore off-the-rack clothes in 2024." is seven tokens long, including six words and a period.

    • TEXT: the text segment to be used for grounding—for example, Titanic is a 1997 American epic... Academy Awards. (See the full text in Examples of facts.)

    • ATTRIBUTE: the name of a metadata attribute associated with the fact—for example, author or title. It is a user-defined label to add more information to the fact text. For example, if the fact text Toronto is the capital of Ontario has an author attribute with its value as Wikipedia, then the following claims are well-grounded:

      • Wikipedia cites that Toronto is the capital of Ontario
      • Toronto is the capital of Ontario

      However, the claim that Government of Ontario claims that Toronto is the capital of Ontario is not as well-grounded.

    • VALUE: the value for the attribute—for example, Simple Wikipedia or Titanic (1997 film).

    • CITATION_THRESHOLD: a float value from 0 through 1 that determines whether a fact must be cited for a claim in the answer candidate. A higher threshold leads to fewer but stronger citations to support the claim, and a lower threshold leads to more but weaker citations to support the claim. If unset, the default threshold value is 0.6.

    • ENABLE_ANTI_CITATION: a boolean value. Set this field to true to enable the experimental feature to evaluate the contradiction score. Either remove this field or set this field to false to turn off this feature.

    • ANTI_CITATION_THRESHOLD: a float value from 0 through 1 that determines whether a fact must be cited as contradicting a claim in the answer candidate. A higher threshold leads to fewer but stronger citations that contradict the claim, and a lower threshold leads to more but weaker citations that contradict the claim. If unset, the default threshold value is 0.8.
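
    If you prefer to call the v1alpha endpoint from Python instead of curl, the following sketch builds the same request body and sends it with the requests library. It assumes that Application Default Credentials are configured; the placeholder values are the same ones used in the curl command.

    import google.auth
    import google.auth.transport.requests
    import requests

    # Obtain an access token from Application Default Credentials.
    credentials, _ = google.auth.default()
    credentials.refresh(google.auth.transport.requests.Request())

    url = (
        "https://discoveryengine.googleapis.com/v1alpha/"
        "projects/PROJECT_ID/locations/global/"
        "groundingConfigs/default_grounding_config:check"
    )

    body = {
        "answerCandidate": "CANDIDATE",
        "facts": [
            {"factText": "TEXT_0", "attributes": {"ATTRIBUTE_A": "VALUE_A0"}},
        ],
        "groundingSpec": {
            "citationThreshold": 0.6,
            "enableAntiCitations": True,  # Experimental; requires allowlisting.
            "antiCitationThreshold": 0.8,
        },
    }

    response = requests.post(
        url,
        json=body,
        headers={"Authorization": f"Bearer {credentials.token}"},
    )
    print(response.json())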

Example request

Using the example facts from the previous section, you can send the following request. Replace the CANDIDATE field with different strings whose grounding and contradictions you want to check.

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://discoveryengine.googleapis.com/v1alpha/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
    -d '{
    "answerCandidate": "CANDIDATE",
    "facts": [
     {
      "factText": "Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. The movie is about the 1912 sinking of the RMS Titanic. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.",
      "attributes": {"author":"Simple Wikipedia"}
     },
     {
      "factText": "James Cameron's \"Titanic\" is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era -- the \"ship of dreams\" -- which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912.",
      "attributes": {"author":"Simple Wikipedia"}
     }
    ],
    "groundingSpec": {
      "citationThreshold": "0.6",
      "enableAntiCitations": true,
      "antiCitationThreshold": "0.8",
    }
    }'

Example of responses with contradictions

The following example shows an answer candidate and its check grounding response when you send the example request with the example facts.

Answer candidate: Titanic was directed by James Cameron. It starred Brad Pitt and Kate Winslet

Check grounding response:

Support score: 0.36

Contradiction score: 0.49

Cited chunks:
  1. [From FACT 0]....Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.....
  2. [From FACT 1]...Titanic is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912....
Claims and citations:
  • 0. Titanic was directed by James Cameron.
    Citations: [0] Anti-citations: []
    Grounding check required: true
  • 1. It starred Brad Pitt and Kate Winslet
    Citations: [] Anti-citations: [0]
    Grounding check required: true

Obtain a helpfulness score for an answer candidate

To try this Experimental feature, contact your Google account team and ask to be added to the allowlist.

In addition to the support score and the contradiction score, the check grounding API can provide a helpfulness score. A helpful response is one that effectively fulfills the user's request (as stated in the prompt) in an informative way. The helpfulness score is a measure of how well the response does the following:

  • Addresses the core intent of the prompt
  • Provides complete details while being concise
  • Directly answers the question asked or completes the task requested in the prompt
  • Offers relevant information
  • Is clear and straightforward
  • Avoids unnecessary details and jargon

To obtain a helpfulness score alongside the grounding score, you must provide a prompt together with the answer candidate and facts. The check grounding API reviews the answer candidate against the prompt and gives a score that indicates how helpfully the answer candidate answers the prompt. The score is in the range [0,1], where a larger score indicates a more helpful answer.

Get a helpfulness score

To obtain the helpfulness score, follow these steps:

  1. Prepare your prompt and answer candidate.

  2. Call the check method, using the following curl command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://discoveryengine.googleapis.com/v1alpha/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
    -d '{
    "answerCandidate": "CANDIDATE",
    "facts": [
    {
      "factText": "TEXT_0",
      "attributes": {"ATTRIBUTE_A": "VALUE_A0","ATTRIBUTE_B": "VALUE_B0"}
    },
    {
      "factText": "TEXT_1",
      "attributes": {"ATTRIBUTE_A": "VALUE_A1","ATTRIBUTE_B": "VALUE_B1"}
    },
    {
      "factText": "TEXT_2",
      "attributes": {"ATTRIBUTE_A": "VALUE_A2","ATTRIBUTE_B": "VALUE_B2"}
    }
    ],
    "groundingSpec": {
      "enableHelpfulnessScore": true
    },
    "prompt": "PROMPT",
    }'
    

    Replace the following:

    • PROJECT_ID: the project number or ID of your Google Cloud project.

    • CANDIDATE: the answer candidate string for which you want to get a helpfulness score—for example, Titanic was directed by James Cameron. It was released in 1997. An answer candidate can have a maximum length of 4096 tokens.

    • TEXT: the text segment to be used for grounding—for example, Titanic is a 1997 American epic... Academy Awards. (See the full text in Examples of facts.)

    • ATTRIBUTE: the name of a metadata attribute associated with the fact—for example, author or title. It is a user-defined label to add more information to the fact text. For example, if the fact text Toronto is the capital of Ontario has an author attribute with its value as Wikipedia, then the following claims are well-grounded:

      • Wikipedia cites that Toronto is the capital of Ontario
      • Toronto is the capital of Ontario

      However, the claim that Government of Ontario claims that Toronto is the capital of Ontario is not as well-grounded.

    • VALUE: the value for the attribute—for example, Simple Wikipedia or Titanic (1997 film).

    • PROMPT: The prompt is the query that the answer candidate has been generated to answer—for example, Who directed and starred in the movie Titanic?.
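
    The helpfulness request body differs from the contradiction-score sketch earlier on this page only in its groundingSpec and the added prompt field. Reusing the url, credentials, and requests setup from that sketch:

    # Same endpoint and auth as the earlier sketch; only the body changes.
    body = {
        "answerCandidate": "CANDIDATE",
        "facts": [{"factText": "TEXT_0", "attributes": {"author": "Simple Wikipedia"}}],
        "groundingSpec": {"enableHelpfulnessScore": True},
        "prompt": "Who directed and starred in the movie Titanic?",
    }

    result = requests.post(
        url, json=body, headers={"Authorization": f"Bearer {credentials.token}"}
    )
    # This document doesn't name the response field that carries the
    # helpfulness score, so print the whole body and inspect it.
    print(result.json())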

Example request

Using the example facts from the previous section, you can send the following request. Replace the CANDIDATE field with different answer candidates to get a helpfulness score for the answer.

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://discoveryengine.googleapis.com/v1alpha/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
    -d '{
    "answerCandidate": "CANDIDATE",
    "facts": [
     {
      "factText": "Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. The movie is about the 1912 sinking of the RMS Titanic. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.",
      "attributes": {"author":"Simple Wikipedia"}
     },
     {
      "factText": "James Cameron's \"Titanic\" is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era -- the \"ship of dreams\" -- which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912.",
      "attributes": {"author":"Simple Wikipedia"}
     }
    ],
    "groundingSpec": {
      "enableHelpfulnessScore": true
    },
    "prompt": "Who directed and starred in the movie Titanic?"
    }'

Example of responses with helpfulness scores

The following examples show answer candidates with their helpfulness scores. In each case, the prompt is Who directed and starred in the movie Titanic?

  • Answer candidate: Titanic was directed by James Cameron. It starred Leonardo DiCaprio and Kate Winslet.
    Helpfulness score: 0.980
    Score explained: Concise and complete

  • Answer candidate: Cameron, DiCaprio and Winslet.
    Helpfulness score: 0.947
    Score explained: Incomplete

  • Answer candidate: James Cameron's 1997 masterpiece, Titanic, captured the hearts of audiences worldwide with its tragic love story set against the backdrop of the ill-fated maiden voyage of the "unsinkable" ship. The film, a mesmerizing blend of historical drama and fictional romance, starred Leonardo DiCaprio as Jack Dawson, a penniless artist who falls for Rose DeWitt Bukater, a young woman trapped by her social standing and played exquisitely by Kate Winslet. Their passionate love affair unfolds amidst the grandeur and opulence of the Titanic, a floating palace of dreams that ultimately succumbs to a devastating fate.
    Helpfulness score: 0.738
    Score explained: Not concise

Obtain claim-level scores for an answer candidate

In addition to the answer-level support score, you can obtain a claim-level support score for each claim in an answer candidate.

To try this Experimental feature, contact your Google account team and ask to be added to the allowlist.

To obtain the claim-level scores, follow these steps:

  1. Prepare your set of facts. For more information and examples, see Terms defined and explained.

  2. Call the check method, using the following curl command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://discoveryengine.googleapis.com/v1alpha/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
    -d '{
    "answerCandidate": "CANDIDATE",
    "facts": [
     {
      "factText": "TEXT_0",
      "attributes": {"ATTRIBUTE_A": "VALUE_A0","ATTRIBUTE_B": "VALUE_B0"}
     },
     {
      "factText": "TEXT_1",
      "attributes": {"ATTRIBUTE_A": "VALUE_A1","ATTRIBUTE_B": "VALUE_B1"}
     },
     {
      "factText": "TEXT_2",
      "attributes": {"ATTRIBUTE_A": "VALUE_A2","ATTRIBUTE_B": "VALUE_B2"}
     }
    ],
    "groundingSpec": {
      "citationThreshold": "CITATION_THRESHOLD",
      "enableClaimLevelScore": "ENABLE_CLAIM_LEVEL_SCORE",
    }
    }'
    

    Replace the following:

    • PROJECT_ID: the project number or ID of your Google Cloud project.

    • CANDIDATE: the answer candidate string for which you want to get a support score—for example, Titanic was directed by James Cameron. It was released in 1997. An answer candidate can have a maximum length of 4096 tokens, where a token is defined as a word in a sentence or a period (a punctuation mark used to end the sentence). For example, the sentence "They wore off-the-rack clothes in 2024." is seven tokens long, including six words and a period.

    • TEXT: the text segment to be used for grounding—for example, Titanic is a 1997 American epic... Academy Awards. (See the full text in Examples of facts.)

    • ATTRIBUTE: the name of a metadata attribute associated with the fact—for example, author or title. It is a user-defined label to add more information to the fact text. For example, if the fact text Toronto is the capital of Ontario has an author attribute with its value as Wikipedia, then the following claims are well-grounded:

      • Wikipedia cites that Toronto is the capital of Ontario
      • Toronto is the capital of Ontario

      However, the claim that Government of Ontario claims that Toronto is the capital of Ontario is not as well-grounded.

    • VALUE: the value for the attribute—for example, Simple Wikipedia or Titanic (1997 film).

    • CITATION_THRESHOLD: a float value from 0 through 1 that determines whether a fact must be cited for a claim in the answer candidate. A higher threshold leads to fewer but stronger citations to support the claim, and a lower threshold leads to more but weaker citations to support the claim. If unset, the default threshold value is 0.6.

    • ENABLE_CLAIM_LEVEL_SCORE: a boolean value. Set this field to true to enable the claim-level score feature. To turn off this feature, remove this field or set this field to false.
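
    After the response comes back, you can extract the per-claim scores. The following is a minimal parsing sketch over the JSON body that the curl command returns. The field names supportScore, claims, claimText, citationIndices, and groundingCheckRequired follow the fields used elsewhere on this page; the per-claim score field is an assumption based on the example responses below.

    import json

    raw_json = '{"supportScore": 0.99, "claims": []}'  # replace with the actual response body

    response = json.loads(raw_json)
    print(f"Answer-level support score: {response['supportScore']}")
    for claim in response.get("claims", []):
        if not claim.get("groundingCheckRequired"):
            print(f"No grounding check required: {claim['claimText']}")
            continue
        # Per-claim score, returned when enableClaimLevelScore is true
        # (field name assumed from the example responses below).
        print(
            f"{claim['claimText']} -> citations {claim.get('citationIndices', [])},"
            f" score {claim.get('score')}"
        )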

Example request

Using the example facts from the previous section, you can send the following request. Replace the CANDIDATE field with different strings whose per-claim grounding you want to check.

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://discoveryengine.googleapis.com/v1alpha/projects/PROJECT_ID/locations/global/groundingConfigs/default_grounding_config:check" \
    -d '{
    "answerCandidate": "CANDIDATE",
    "facts": [
     {
      "factText": "Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. The movie is about the 1912 sinking of the RMS Titanic. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.",
      "attributes": {"author":"Simple Wikipedia"}
     },
     {
      "factText": "James Cameron's \"Titanic\" is an epic, action-packed romance set against the ill-fated maiden voyage of the R.M.S. Titanic; the pride and joy of the White Star Line and, at the time, the largest moving object ever built. She was the most luxurious liner of her era -- the \"ship of dreams\" -- which ultimately carried over 1,500 people to their death in the ice cold waters of the North Atlantic in the early hours of April 15, 1912.",
      "attributes": {"author":"Simple Wikipedia"}
     }
    ],
    "groundingSpec": {
      "citationThreshold": "0.6",
      "enableClaimLevelScore": true,
    }
    }'

Example of responses with claim-level scores

The following example shows an answer candidate and its check grounding response when you send the example request with the example facts.

Answer candidate: Here is what I found. Titanic was directed by James Cameron. It starred Kate Winslet and Leonardo DiCaprio.

Check grounding response:

Support score: 0.99

Cited chunks:
  1. [From FACT 0]....Titanic is a 1997 American epic romantic disaster movie. It was directed, written, and co-produced by James Cameron. It stars Kate Winslet and Leonardo DiCaprio. The movie was released on December 19, 1997. It received positive critical reviews. The movie won 11 Academy Awards, and was nominated for fourteen total Academy Awards.....
Claims and citations:
  • 0. Here is what I found.
    Grounding check required: false
  • 1. Titanic was directed by James Cameron.
    Citations: [0]
    Grounding check required: true
    Score: 0.99
  • 2. It starred Kate Winslet and Leonardo DiCaprio.
    Citations: [0]
    Grounding check required: true
    Score: 0.99