This document describes how to generate Gemini Code Assist metrics. For
example, you can generate metrics that report daily active usage or the
acceptance of code recommendations by using a variety of Google Cloud
products, including Cloud Logging, the Google Cloud CLI,
Cloud Monitoring, and BigQuery.
Before you begin

Ensure that you have set up Gemini Code Assist in your project and that
you have enabled Gemini for Google Cloud logging in your project. If you
need to enable and view Gemini for Google Cloud prompt, response, and
metadata logs, see View Gemini for Google Cloud logs.

In the Google Cloud console, activate Cloud Shell. At the bottom of the
Google Cloud console, a Cloud Shell session starts and displays a
command-line prompt. Cloud Shell is a shell environment with the
Google Cloud CLI already installed and with values already set for
your current project. It can take a few seconds for the session to initialize.
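Before running the commands in the following sections, you can confirm which project the Cloud Shell session is configured to use. This quick check is not part of the original steps, just a common sanity check:

```shell
# Print the project ID that the gcloud CLI is currently configured to use:
gcloud config get-value project

# Show the full active configuration, including the account in use:
gcloud config list
```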
List the number of unique users
The following instructions describe how to use the gcloud CLI to list
the number of unique users of Gemini Code Assist in the most
recent 28-day period:
In a shell environment, ensure that you have updated all installed components
of the gcloud CLI to the latest version:
gcloud components update
Read the log entries for Gemini Code Assist users and usage:

gcloud logging read 'resource.type=cloudaicompanion.googleapis.com/Instance labels.product=~"code_assist"' \
    --freshness 28d \
    --project PROJECT_ID \
    --format "csv(timestamp.date('%Y-%m-%d'),labels.user_id)"

Replace PROJECT_ID with your Google Cloud project ID.

You can use the Unix command uniq to uniquely identify users on a per-day
basis.

The output is similar to the following:

2024-10-30,user1@company.com
2024-10-29,user2@company.com
2024-10-29,user2@company.com
2024-10-29,user2@company.com
2024-10-29,user1@company.com
2024-10-28,user1@company.com
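The date,user_id CSV rows that the gcloud logging read command emits can be reduced to per-day counts with standard Unix tools. The following sketch uses hypothetical sample rows (not real log output) to show both the unique-user count and the total request count per day:

```shell
# Hypothetical sample of the date,user_id CSV rows from `gcloud logging read`:
ROWS='2024-10-30,user1@company.com
2024-10-29,user2@company.com
2024-10-29,user2@company.com
2024-10-29,user2@company.com
2024-10-29,user1@company.com
2024-10-28,user1@company.com'

# Unique users per day: drop duplicate (date,user) pairs, then count per date.
printf '%s\n' "$ROWS" | sort -u | cut -d, -f1 | uniq -c

# Total requests per day: every row is one logged request.
printf '%s\n' "$ROWS" | cut -d, -f1 | sort | uniq -c
```

For the sample rows above, the first pipeline reports two unique users on 2024-10-29 and one on each of the other days, while the second reports four requests on 2024-10-29.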
Create a chart that displays daily usage

The following steps show how to use Monitoring to create daily usage
graphs that show the aggregate total of daily active Gemini Code Assist
users and the number of their requests per day.
Create a Monitoring metric from your log data that records
the number of Gemini Code Assist users:
In the Google Cloud console, go to the Logs Explorer page:
If you use the search bar to find this page, then select the result whose subheading is
Logging.
In the query pane, enter the following query, and then click
Run query:
resource.type="cloudaicompanion.googleapis.com/Instance" AND labels.product="code_assist" AND jsonPayload.@type="type.googleapis.com/google.cloud.cloudaicompanion.logging.v1.ResponseLog"
In the toolbar, click Actions, and then select Create metric.
The Create log-based metric dialog appears.
Configure the following metric details:
Ensure the Metric Type is set to Counter.
Name the metric code_assist_example.
Ensure Filter selection is set to point to
the location where your logs are being stored, either Project or
Bucket.
Click Create metric. A success banner is displayed, confirming that the
metric was created.
In the success banner, click View in Metrics explorer. Metrics Explorer
opens and displays a preconfigured chart. It can take up to 10 minutes
for data to populate the chart.
Save the chart to a dashboard: in the toolbar, click Save chart,
optionally update the chart title, use the Dashboard menu to select an
existing custom dashboard or to create a new dashboard, and then click
Save chart.
Analyze usage by using BigQuery

The following steps show how to use BigQuery to analyze your log data.
There are two approaches that you can use:

Create a log sink and export your log data to a BigQuery dataset.
Upgrade the log bucket that stores your log data to use
Log Analytics,
and then create a linked BigQuery dataset.
With both approaches, you can use SQL to query and analyze your log data, and
you can chart the results of those queries. If you use Log Analytics,
then you can save your charts to a custom dashboard. However, there are
differences in pricing; for details, see
Log Analytics pricing and
BigQuery pricing.

This section describes how to create a log sink to export select log
entries to BigQuery, and it provides a list of sample queries.
Create a log sink

In the Google Cloud console, go to the Log Router page. If you use the
search bar to find this page, then select the result whose subheading is
Logging.

Select the Google Cloud project in which the log entries that you want
to route originate.

Select Create sink.

In the Sink details panel, enter the following details:

For Sink name, provide an identifier for the sink. After you create the
sink, you can't rename it, but you can delete it and create a new sink.

For Sink description, describe the purpose or use case for the sink.

In the Sink destination panel, configure the following details:

For Select sink service, select BigQuery dataset.

For Select BigQuery dataset, create a new BigQuery dataset and name it
code_assist_bq.

Open the Choose logs to include in sink panel, and in the
Build inclusion filter field, enter the following:

resource.type="cloudaicompanion.googleapis.com/Instance" AND labels.product="code_assist"

Optional: To verify that you entered the correct filter, select
Preview logs. The Logs Explorer opens in a new tab with the filter
pre-populated.

Click Create sink.
Authorize the log sink to write log entries to the dataset
When you have Owner access to the BigQuery dataset,
Cloud Logging grants the log sink the necessary permissions to write log
data.
If you don't have Owner access or if you don't see any entries in your
dataset, then the log sink might not have the required permissions. To resolve
this failure, follow the instructions in
Set destination permissions.
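If you need to grant the permission manually, one possible sequence is to look up the sink's writer identity and grant it the BigQuery Data Editor role. This is a sketch: SINK_NAME and PROJECT_ID are placeholders for your own values, not names from this document.

```shell
# Find the service account that the sink writes as (its writer identity).
# SINK_NAME and PROJECT_ID are placeholders; substitute your own values.
WRITER_IDENTITY=$(gcloud logging sinks describe SINK_NAME \
    --project=PROJECT_ID \
    --format='value(writerIdentity)')

# Grant that identity permission to write log data into BigQuery.
# This binds the role at the project level; you can also scope the grant
# to the single dataset through the BigQuery console instead.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="$WRITER_IDENTITY" \
    --role='roles/bigquery.dataEditor'
```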
Queries
You can use the following sample BigQuery queries to generate
user- and aggregate-level data for daily active use and suggestions generated.
Before using the following sample queries, you must obtain the fully qualified
path for the newly created sink. To obtain the path, do the following:
In the Google Cloud console, go to the BigQuery page.
In the resources list, locate the dataset named code_assist_bq. This
dataset is the sink destination.

Expand the code_assist_bq dataset, select the responses table, click
the More options (more_vert) icon, and then click Copy ID. Make note of
the copied table ID so that you can use it in the following sections as
the GENERATED_BIGQUERY_TABLE value.
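With the table ID in hand, you can also run ad hoc SQL directly from Cloud Shell with the bq command-line tool. The table path below is hypothetical; replace it with the ID you just copied. This sketch counts daily active users:

```shell
# PROJECT_ID.code_assist_bq.responses stands in for the fully qualified
# table ID copied in the previous step; replace it with your own value.
bq query --use_legacy_sql=false '
SELECT
  COUNT(DISTINCT labels.user_id) AS total_users,
  DATE(timestamp) AS use_date
FROM `PROJECT_ID.code_assist_bq.responses`
GROUP BY use_date
ORDER BY use_date'
```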
List individual users by day

SELECT DISTINCT labels.user_id as user, DATE(timestamp) as use_date
FROM GENERATED_BIGQUERY_TABLE
ORDER BY use_date

Replace GENERATED_BIGQUERY_TABLE with the fully qualified path of the
BigQuery responses table that you noted in the previous steps for
creating a sink.

List aggregate users by day

SELECT COUNT(DISTINCT labels.user_id) as total_users, DATE(timestamp) as use_date
FROM GENERATED_BIGQUERY_TABLE
GROUP BY use_date
ORDER BY use_date

List individual requests per day by user

SELECT COUNT(*), DATE(timestamp) as use_date, labels.user_id as user
FROM GENERATED_BIGQUERY_TABLE
GROUP BY use_date, user
ORDER BY use_date

List aggregate requests per day

SELECT COUNT(*), DATE(timestamp) as use_date
FROM GENERATED_BIGQUERY_TABLE
GROUP BY use_date
ORDER BY use_date

What's next

Learn more about Gemini for Google Cloud logging.
Learn more about Gemini for Google Cloud monitoring.

Last updated 2025-09-03 UTC.