Get data insights from a contribution analysis model using a summable metric
In this tutorial, you use a contribution analysis model to analyze sales changes between 2020 and 2021 in the Iowa liquor sales dataset. This tutorial guides you through performing the following tasks:
- Create an input table based on publicly available Iowa liquor data.
- Create a contribution analysis model that uses a summable metric. This type of model summarizes a given metric for a combination of one or more dimensions in the data, to determine how those dimensions contribute to the metric value.
- Get the metric insights from the model by using the ML.GET_INSIGHTS function.
Before starting this tutorial, you should be familiar with the contribution analysis use case.
Required permissions
To create the dataset, you need the bigquery.datasets.create Identity and Access Management (IAM) permission.

To create the model, you need the following permissions:

- bigquery.jobs.create
- bigquery.models.create
- bigquery.models.getData
- bigquery.models.updateData

To run inference, you need the following permissions:

- bigquery.models.getData
- bigquery.jobs.create
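One way to get these permissions is through predefined IAM roles; for example, the roles/bigquery.dataEditor and roles/bigquery.jobUser roles together should cover the permissions listed above. The following is a minimal sketch using the gcloud CLI, where PROJECT_ID and USER_EMAIL are placeholders for your own project ID and user account:

# Grants dataset and model permissions, including bigquery.datasets.create
# and the bigquery.models.* permissions (PROJECT_ID and USER_EMAIL are placeholders).
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/bigquery.dataEditor"

# Grants permission to run query jobs (bigquery.jobs.create).
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/bigquery.jobUser"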
Costs
In this document, you use the following billable components of Google Cloud:
- BigQuery ML: You incur costs for the data that you process in BigQuery.
To generate a cost estimate based on your projected usage, use the pricing calculator.
For more information about BigQuery pricing, see BigQuery pricing in the BigQuery documentation.
Before you begin
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- Enable the BigQuery API.
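If you prefer to work from the command line, you can also enable the API with the gcloud CLI. This is an optional alternative to the console, and it assumes that the gcloud CLI is installed and authenticated against your project:

# Enables the BigQuery API for the currently configured project.
gcloud services enable bigquery.googleapis.com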
Create a dataset
Create a BigQuery dataset to store your ML model.
Console
In the Google Cloud console, go to the BigQuery page.
In the Explorer pane, click your project name.
Click View actions > Create dataset.

On the Create dataset page, do the following:

For Dataset ID, enter bqml_tutorial.

For Location type, select Multi-region, and then select US (multiple regions in United States).
Leave the remaining default settings as they are, and click Create dataset.
bq
To create a new dataset, use the bq mk command with the --location flag. For a full list of possible parameters, see the bq mk --dataset command reference.

Create a dataset named bqml_tutorial with the data location set to US and a description of BigQuery ML tutorial dataset:

bq --location=US mk -d \
    --description "BigQuery ML tutorial dataset." \
    bqml_tutorial

Instead of using the --dataset flag, the command uses the -d shortcut. If you omit -d and --dataset, the command defaults to creating a dataset.

Confirm that the dataset was created:
bq ls
API
Call the datasets.insert method with a defined dataset resource.

{
  "datasetReference": {
    "datasetId": "bqml_tutorial"
  }
}
Create a table of input data
Create a table that contains the test and control data to analyze. The test data contains liquor sales from 2021, and the control data contains liquor sales from 2020. The following query combines the test and control data into a single input table:
In the Google Cloud console, go to the BigQuery page.
In the query editor, run the following statement:
CREATE OR REPLACE TABLE bqml_tutorial.iowa_liquor_sales_sum_data AS
(
  (
    SELECT
      store_name,
      city,
      vendor_name,
      category_name,
      item_description,
      SUM(sale_dollars) AS total_sales,
      FALSE AS is_test
    FROM `bigquery-public-data.iowa_liquor_sales.sales`
    WHERE EXTRACT(YEAR FROM date) = 2020
    GROUP BY store_name, city, vendor_name, category_name, item_description, is_test
  )
  UNION ALL
  (
    SELECT
      store_name,
      city,
      vendor_name,
      category_name,
      item_description,
      SUM(sale_dollars) AS total_sales,
      TRUE AS is_test
    FROM `bigquery-public-data.iowa_liquor_sales.sales`
    WHERE EXTRACT(YEAR FROM date) = 2021
    GROUP BY store_name, city, vendor_name, category_name, item_description, is_test
  )
);
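Optionally, you can sanity-check the combined table before you create the model. The following query is a suggestion rather than part of the original tutorial; it compares row counts and total sales for the control (2020) and test (2021) portions of the input table:

-- Compares the control (is_test = FALSE, 2020) and test (is_test = TRUE, 2021) data.
SELECT
  is_test,
  COUNT(*) AS num_rows,
  SUM(total_sales) AS total_sales
FROM bqml_tutorial.iowa_liquor_sales_sum_data
GROUP BY is_test
ORDER BY is_test;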
Create the model
Create a contribution analysis model:
In the Google Cloud console, go to the BigQuery page.
In the query editor, run the following statement:
CREATE OR REPLACE MODEL bqml_tutorial.iowa_liquor_sales_sum_model
  OPTIONS(
    model_type = 'CONTRIBUTION_ANALYSIS',
    contribution_metric = 'sum(total_sales)',
    dimension_id_cols = ['store_name', 'city', 'vendor_name', 'category_name', 'item_description'],
    is_test_col = 'is_test',
    min_apriori_support = 0.05
  ) AS
SELECT * FROM bqml_tutorial.iowa_liquor_sales_sum_data;
The query takes approximately 60 seconds to complete, after which the model iowa_liquor_sales_sum_model appears in the bqml_tutorial dataset in the Explorer pane. Because the query uses a CREATE MODEL statement to create a model, there are no query results.
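You can also confirm from the command line that the model was created. This is a minimal sketch, assuming the bq CLI is configured for the same project:

# Lists the models in the bqml_tutorial dataset; iowa_liquor_sales_sum_model should appear.
bq ls --models bqml_tutorial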
Get insights from the model
Get insights generated by the contribution analysis model by using the ML.GET_INSIGHTS function.
In the Google Cloud console, go to the BigQuery page.
In the query editor, run the following statement to select columns from the output for a summable metric contribution analysis model:
SELECT
  contributors,
  metric_test,
  metric_control,
  difference,
  relative_difference,
  unexpected_difference,
  relative_unexpected_difference,
  apriori_support,
  contribution
FROM
  ML.GET_INSIGHTS(
    MODEL `bqml_tutorial.iowa_liquor_sales_sum_model`);
The first several rows of the output should look similar to the following. The values are truncated to improve readability.
| contributors | metric_test | metric_control | difference | relative_difference | unexpected_difference | relative_unexpected_difference | apriori_support | contribution |
|---|---|---|---|---|---|---|---|---|
| all | 428068179 | 396472956 | 31595222 | 0.079 | 31595222 | 0.079 | 1.0 | 31595222 |
| vendor_name=SAZERAC COMPANY INC | 52327307 | 38864734 | 13462573 | 0.346 | 11491923 | 0.281 | 0.122 | 13462573 |
| city=DES MOINES | 49521322 | 41746773 | 7774549 | 0.186 | 4971158 | 0.111 | 0.115 | 7774549 |
| vendor_name=DIAGEO AMERICAS | 84681073 | 77259259 | 7421814 | 0.096 | 1571126 | 0.018 | 0.197 | 7421814 |
| category_name=100% AGAVE TEQUILA | 23915100 | 17252174 | 6662926 | 0.386 | 5528662 | 0.3 | 0.055 | 6662926 |
The output is automatically sorted by contribution, or ABS(difference), in descending order. In the all row, the difference column shows that there was a $31,595,222 increase in total sales from 2020 to 2021, a 7.9% increase as indicated by the relative_difference column. In the second row, for vendor_name=SAZERAC COMPANY INC, there was an unexpected_difference of $11,491,923, meaning that this segment grew about 28% more than expected based on the growth rate of the data as a whole, as shown in the relative_unexpected_difference column.
For more information, see the summable metric output columns.
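As a further example that isn't part of the original tutorial, you can filter and reorder the insights in SQL. The following query keeps only the rows with a positive unexpected_difference and sorts them by the size of that unexpected growth:

-- Keeps rows whose test metric exceeded the value expected from the overall trend.
SELECT
  contributors,
  difference,
  relative_difference,
  unexpected_difference,
  relative_unexpected_difference
FROM
  ML.GET_INSIGHTS(
    MODEL `bqml_tutorial.iowa_liquor_sales_sum_model`)
WHERE unexpected_difference > 0
ORDER BY unexpected_difference DESC;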
Clean up
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.