Make predictions with imported TensorFlow models

This page shows you how to import TensorFlow models into a BigQuery ML dataset and use them to make predictions from a SQL query. You can import TensorFlow models using these interfaces:

  • The Google Cloud console
  • The bq query command in the bq command-line tool
  • The BigQuery API

For more information about importing TensorFlow models into BigQuery ML, including format and storage requirements, see The CREATE MODEL statement for importing TensorFlow models.

Import TensorFlow models

To import TensorFlow models into a dataset, follow these steps:

Console

  1. In the Google Cloud console, go to the BigQuery page.

    Go to the BigQuery page

  2. In the query editor, enter a CREATE MODEL statement like the following.

     CREATE OR REPLACE MODEL `example_dataset.imported_tf_model`
      OPTIONS (MODEL_TYPE='TENSORFLOW',
       MODEL_PATH='gs://cloud-training-demos/txtclass/export/exporter/1549825580/*')

    The preceding query imports a model located at gs://cloud-training-demos/txtclass/export/exporter/1549825580/* as a BigQuery ML model named imported_tf_model. The Cloud Storage URI ends in a wildcard character (*) so that BigQuery ML also imports any assets associated with the model. The imported model is a TensorFlow text classifier model that predicts which website published a given article title.

  3. Your new model should now appear in the Resources panel. As you expand each of the datasets in a project, models are listed along with the other BigQuery resources in the datasets. Models are indicated by the model icon.

  4. If you select the new model in the Resources panel, information about the model appears below the Query editor.

    TensorFlow model info

bq

To import a TensorFlow model from Cloud Storage, run a query by entering a command like the following:

bq query \
--use_legacy_sql=false \
"CREATE MODEL
  `mydataset.mymodel`
OPTIONS
  (MODEL_TYPE='TENSORFLOW',
   MODEL_PATH='gs://bucket/path/to/saved_model/*')"

For example:

bq query --use_legacy_sql=false \
"CREATE OR REPLACE MODEL
  `example_dataset.imported_tf_model`
OPTIONS
  (MODEL_TYPE='TENSORFLOW',
    MODEL_PATH='gs://cloud-training-demos/txtclass/export/exporter/1549825580/*')"

After you import the model, it appears in the output of bq ls [dataset_name]:

$ bq ls example_dataset

       tableId        Type    Labels   Time Partitioning
 ------------------- ------- -------- -------------------
  imported_tf_model   MODEL

API

Insert a new job and populate the jobs#configuration.query property as in the following request body:

{
  "query": "CREATE MODEL `project_id:mydataset.mymodel` OPTIONS(MODEL_TYPE='TENSORFLOW' MODEL_PATH='gs://bucket/path/to/saved_model/*')"
}
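Because the statement in the request body is ordinary SQL text, you can assemble it programmatically before inserting the job. The following Python sketch builds an equivalent request body; the project, dataset, model, and bucket names are placeholders, not values from this page:

```python
import json

# Hypothetical identifiers -- replace with your own project, dataset, and bucket.
project_id = "my_project"
dataset = "mydataset"
model = "mymodel"
model_path = "gs://bucket/path/to/saved_model/*"

# Build the CREATE MODEL statement; note the comma between the two OPTIONS entries.
query = (
    f"CREATE MODEL `{project_id}.{dataset}.{model}` "
    f"OPTIONS(MODEL_TYPE='TENSORFLOW', MODEL_PATH='{model_path}')"
)

# The jobs.insert request body is a JSON object with the statement in the
# query property (shown flattened here, as in the example above).
request_body = json.dumps({"query": query})
print(request_body)
```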

BigQuery DataFrames

Before trying this sample, follow the BigQuery DataFrames setup instructions in the BigQuery quickstart using BigQuery DataFrames. For more information, see the BigQuery DataFrames reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Import the model by using the TensorFlowModel object.

import bigframes
from bigframes.ml.imported import TensorFlowModel

bigframes.options.bigquery.project = PROJECT_ID
# You can change the location to one of the valid locations: https://cloud.google.com/bigquery/docs/locations#supported_locations
bigframes.options.bigquery.location = "US"

imported_tensorflow_model = TensorFlowModel(
    model_path="gs://cloud-training-demos/txtclass/export/exporter/1549825580/*"
)

Make predictions with imported TensorFlow models

To make predictions with imported TensorFlow models, follow these steps. The following examples assume that you've imported the TensorFlow model as described in the preceding example.

Console

  1. In the Google Cloud console, go to the BigQuery page.

    Go to the BigQuery page

  2. In the query editor, enter a query using ML.PREDICT like the following.

     SELECT *
       FROM ML.PREDICT(MODEL `example_dataset.imported_tf_model`,
         (
          SELECT title AS input
          FROM `bigquery-public-data.hacker_news.full`
         )
     )
     

    The preceding query uses the model named imported_tf_model in the dataset example_dataset in the current project to make predictions from input data in the public table bigquery-public-data.hacker_news.full. In this case, the TensorFlow model's serving_input_fn specifies that the model expects a single input string named input, so the subquery assigns the alias input to the column in the subquery's SELECT statement.

    This query outputs results like the following. In this example, the model outputs the column dense_1, which contains an array of probability values, as well as an input column, which contains the corresponding string values from the input table. Each array element value represents the probability that the corresponding input string is an article title from a particular publication.

    Query results
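Because ML.PREDICT scores every row that the input subquery returns, you can keep query costs down while experimenting by limiting the input. For example, a sketch that scores only a sample of titles:

```sql
SELECT *
  FROM ML.PREDICT(MODEL `example_dataset.imported_tf_model`,
    (
     SELECT title AS input
     FROM `bigquery-public-data.hacker_news.full`
     LIMIT 100
    )
)
```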

bq

To make predictions from input data in the table input_data, enter a command like the following, using the imported TensorFlow model my_model:

bq query \
--use_legacy_sql=false \
'SELECT *
 FROM ML.PREDICT(
   MODEL `my_project.my_dataset.my_model`,
   (SELECT * FROM input_data))'

For example:

bq query \
--use_legacy_sql=false \
'SELECT *
FROM ML.PREDICT(
  MODEL `tensorflow_sample.imported_tf_model`,
  (SELECT title AS input FROM `bigquery-public-data.hacker_news.full`))'

This example returns results like the following:

    +------------------------------------------------------------------------+----------------------------------------------------------------------------------+
    |                               dense_1                                  |                                       input                                      |
    +------------------------------------------------------------------------+----------------------------------------------------------------------------------+
    |   ["0.6251608729362488","0.2989124357700348","0.07592673599720001"]    | How Red Hat Decides Which Open Source Companies t...                             |
    |   ["0.014276246540248394","0.972910463809967","0.01281337533146143"]   | Ask HN: Toronto/GTA mastermind around side income for big corp. dev?             |
    |   ["0.9821603298187256","1.8601855117594823E-5","0.01782100833952427"] | Ask HN: What are good resources on strategy and decision making for your career? |
    |   ["0.8611106276512146","0.06648492068052292","0.07240450382232666"]   | Forget about promises, use harvests                                              |
    +------------------------------------------------------------------------+----------------------------------------------------------------------------------+
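If you post-process these results outside of SQL, note that each dense_1 array holds one probability per class, and the predicted class is the position of the largest value. A minimal Python sketch using two of the rows above (the mapping from array index to publication depends on how the model was trained and isn't listed on this page):

```python
# Each dense_1 value is an array of per-class probabilities, returned as strings.
rows = [
    (["0.6251608729362488", "0.2989124357700348", "0.07592673599720001"],
     "How Red Hat Decides Which Open Source Companies t..."),
    (["0.014276246540248394", "0.972910463809967", "0.01281337533146143"],
     "Ask HN: Toronto/GTA mastermind around side income for big corp. dev?"),
]

for probs, title in rows:
    values = [float(p) for p in probs]
    # The predicted class is the index of the largest probability.
    predicted = values.index(max(values))
    print(predicted, title)
```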

API

Insert a new job and populate the jobs#configuration.query property as in the following request body:

{
  "query": "SELECT * FROM ML.PREDICT(MODEL `my_project.my_dataset.my_model`, (SELECT * FROM input_data))"
}

BigQuery DataFrames

Before trying this sample, follow the BigQuery DataFrames setup instructions in the BigQuery quickstart using BigQuery DataFrames. For more information, see the BigQuery DataFrames reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Use the predict function to run the remote model:

import bigframes.pandas as bpd

df = bpd.read_gbq("bigquery-public-data.hacker_news.full")
df_pred = df.rename(columns={"title": "input"})
predictions = imported_tensorflow_model.predict(df_pred)
predictions.head(5)

The result is similar to the following: Result visualization

What's next