Publish events to a BigQuery table

This quickstart shows you how to publish and receive event messages by creating an Eventarc Advanced bus and enrollment in your Google Cloud project.

  • A bus lets you centralize the flow of messages through your system and acts as a router. It receives event messages emitted by a message source or published by a provider, and evaluates them according to an enrollment.

  • An enrollment identifies a subscription to a particular bus and defines the matching criteria for messages, so that matching messages are routed to one or more destinations.

In this quickstart, you:

  1. Create a BigQuery table.

  2. Create an Eventarc Advanced bus.

  3. Create an Eventarc Advanced enrollment.

  4. Publish an event message to the bus.

  5. View the event data in the BigQuery table.

You can complete this quickstart using the gcloud CLI and the bq command-line tool.

Before you begin

Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. Install the Google Cloud CLI.

  3. If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.

  4. To initialize the gcloud CLI, run the following command:

    gcloud init
  5. Create or select a Google Cloud project.

    • Create a Google Cloud project:

      gcloud projects create PROJECT_ID

      Replace PROJECT_ID with a name for the Google Cloud project you are creating.

    • Select the Google Cloud project that you created:

      gcloud config set project PROJECT_ID

      Replace PROJECT_ID with your Google Cloud project name.

  6. Verify that billing is enabled for your Google Cloud project.

  7. Enable the BigQuery and Eventarc APIs:

    gcloud services enable bigquery.googleapis.com eventarc.googleapis.com eventarcpublishing.googleapis.com
  8. Update gcloud components:

    gcloud components update
  9. Sign in using your account:

    gcloud auth login
  10. Set the configuration variable used in this quickstart:

    REGION=REGION

    Replace REGION with a supported location for the bus—for example, us-central1.

  11. If you are the project creator, you are granted the basic Owner role (roles/owner). By default, this Identity and Access Management (IAM) role includes the permissions necessary for full access to most Google Cloud resources, and you can skip this step.

    If you are not the project creator, the required permissions must be granted on the project to the appropriate principal. For example, a principal can be a Google Account (for end users) or a service account (for applications and compute workloads).

    Required permissions

    To get the permissions that you need to complete this quickstart, ask your administrator to grant you the required IAM roles on your project.

    For more information about granting roles, see Manage access to projects, folders, and organizations.

    You might also be able to get the required permissions through custom roles or other predefined roles.

  12. To give Eventarc Advanced the necessary permissions to update BigQuery table properties, ask your administrator to grant the BigQuery Data Editor (roles/bigquery.dataEditor) IAM role on your Google Cloud project to a service account:
    1. Create a service account. For testing purposes, you attach this service account to an Eventarc Advanced pipeline to represent the identity of the pipeline.

      gcloud iam service-accounts create SERVICE_ACCOUNT_NAME

      Replace SERVICE_ACCOUNT_NAME with a name for your service account.
    2. Grant the roles/bigquery.dataEditor IAM role to the service account:

      gcloud projects add-iam-policy-binding PROJECT_ID \
          --member="serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com" \
          --role=roles/bigquery.dataEditor

Create a BigQuery table

Create a BigQuery table as your event destination. Other event destinations are also supported, such as a Pub/Sub topic, a Workflows workflow, or another HTTP endpoint. For more information, see Event providers and destinations.

Before creating a BigQuery table, create a dataset, which acts as a top-level container for the table, and define a table schema.

  1. To create a new dataset, use the bq mk command with the --dataset flag.

    bq --location=$REGION mk --dataset DATASET_ID

    Replace DATASET_ID with a unique name for the BigQuery dataset—for example, my_dataset.

  2. In your terminal, create a new file called my-schema.json.

  3. Copy and paste the following schema into the new file, and then save the file.

    [
        {
            "name": "name",
            "type": "STRING",
            "mode": "REQUIRED"
        },
        {
            "name": "age",
            "type": "INTEGER",
            "mode": "NULLABLE"
        }
    ]
  4. To create a table, use the bq mk command with the --table flag.

    bq mk --table PROJECT_ID:DATASET_ID.TABLE_ID my-schema.json

    Replace TABLE_ID with a unique name for the BigQuery table—for example, my-table.
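Rows written to this table must match my-schema.json. As an optional local sanity check before publishing, you can validate a candidate event payload against that schema; the following is a minimal sketch in plain Python (the check_row helper is hypothetical, not part of the bq tooling), assuming the two-field schema shown above.

```python
# The same schema saved in my-schema.json (see the steps above).
SCHEMA = [
    {"name": "name", "type": "STRING", "mode": "REQUIRED"},
    {"name": "age", "type": "INTEGER", "mode": "NULLABLE"},
]

def check_row(row: dict) -> bool:
    """Hypothetical local check: does a payload dict satisfy the schema?

    BigQuery's insertAll accepts INTEGER values as JSON numbers or as
    numeric strings, so both forms are allowed here.
    """
    known = {field["name"] for field in SCHEMA}
    if set(row) - known:
        return False  # unknown field not in the schema
    for field in SCHEMA:
        value = row.get(field["name"])
        if value is None:
            if field["mode"] == "REQUIRED":
                return False  # missing a REQUIRED field
            continue  # NULLABLE fields may be absent
        if field["type"] == "STRING" and not isinstance(value, str):
            return False
        if field["type"] == "INTEGER":
            try:
                int(value)
            except (TypeError, ValueError):
                return False
    return True

print(check_row({"name": "my-name", "age": "20"}))  # the quickstart's payload
print(check_row({"age": 20}))  # rejected: missing REQUIRED "name"
```

A payload that fails a check like this would also fail server-side validation when the pipeline calls insertAll.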

Create an Eventarc Advanced bus

A bus receives event messages emitted by a message source or published by a provider, and acts as a message router.

For more information, see Create a bus to route messages.

Create an Eventarc Advanced bus in your project by using the gcloud eventarc message-buses create command:

gcloud eventarc message-buses create BUS_NAME \
    --location=$REGION

Replace BUS_NAME with the ID of your bus or a fully qualified name—for example, my-bus.

Create an Eventarc Advanced enrollment

An enrollment determines which messages are routed to a destination, and specifies the pipeline that configures that destination for the event messages. In this case, the target destination is a BigQuery API endpoint.

For more information, see Create an enrollment to receive events.

When using the gcloud CLI, you first create a pipeline, and then create an enrollment:

  1. Create a pipeline by using the gcloud eventarc pipelines create command:

    gcloud eventarc pipelines create PIPELINE_NAME \
        --destinations=http_endpoint_uri='https://bigquery.googleapis.com/bigquery/v2/projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID/insertAll',http_endpoint_message_binding_template='{"headers": headers.merge({"content-type":"application/json"}), "body": {"rows":[{"json":message.data}]}}',oauth_token_authentication_service_account=SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com \
        --input-payload-format-json= \
        --location=$REGION

    Replace PIPELINE_NAME with the ID of the pipeline or a fully qualified name—for example, my-pipeline.

    Note the following:

    • The http_endpoint_message_binding_template key transforms the event into the format expected by the API. When defining a message binding, you must configure an input format so that the binding can access the payload.
    • The oauth_token_authentication_service_account key specifies a service account email that is used to generate an OAuth token. OAuth tokens should generally be used only when calling Google APIs hosted on *.googleapis.com.
    • The --input-payload-format-json flag specifies that the pipeline's input payload format is JSON; any messages that don't match this format are treated as persistent errors.
  2. Create an enrollment by using the gcloud eventarc enrollments create command:

    gcloud eventarc enrollments create ENROLLMENT_NAME \
        --cel-match=MATCH_EXPRESSION \
        --destination-pipeline=PIPELINE_NAME \
        --message-bus=BUS_NAME \
        --message-bus-project=PROJECT_ID \
        --location=$REGION

    Replace the following:

    • ENROLLMENT_NAME: the ID of the enrollment or a fully qualified name—for example, my-enrollment.
    • MATCH_EXPRESSION: the matching expression for this enrollment using CEL—for example:

      "message.type == 'hello-world-type'"
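To make the enrollment and pipeline behavior concrete, the following is a minimal sketch in plain Python (not actual CEL, and not the Eventarc runtime; both function names are hypothetical) of what happens to a published event: the enrollment's match expression selects events by type, and the pipeline's message binding wraps the event data into the request body that BigQuery's insertAll method expects.

```python
import json

def matches_enrollment(message: dict) -> bool:
    # Emulates the CEL expression "message.type == 'hello-world-type'".
    return message.get("type") == "hello-world-type"

def bind_for_insert_all(message: dict) -> dict:
    # Emulates the http_endpoint_message_binding_template: the event's
    # data payload becomes one row in a BigQuery insertAll request body.
    return {"rows": [{"json": message["data"]}]}

# A minimal CloudEvents-shaped message, matching the examples below.
event = {
    "id": "hello-world-id-1234",
    "type": "hello-world-type",
    "source": "hello-world-source",
    "specversion": "1.0",
    "data": {"name": "my-name", "age": "20"},
}

if matches_enrollment(event):
    print(json.dumps(bind_for_insert_all(event)))
```

An event whose type attribute is anything other than hello-world-type would not match this enrollment and would never reach the pipeline.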
      

Publish an event message to the bus

To directly publish a message to your bus, you can use the gcloud eventarc message-buses publish command or send a request to the Eventarc Publishing REST API. For more information, see Publish events directly.

The message must be in the CloudEvents format, a specification for describing event data in a common way. The data element is the payload of your event, and it must ultimately match the schema of your BigQuery table. Any well-formed JSON can go in this field. For more information about CloudEvents context attributes, see Event format.
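It can help to see the envelope as data before publishing. The following sketch in plain Python (the make_cloud_event helper is hypothetical) assembles a minimal CloudEvents 1.0 JSON envelope of the kind shown in the examples below, and checks that the required context attributes (id, source, specversion, type) are present.

```python
import json

# Context attributes required by the CloudEvents 1.0 specification.
REQUIRED_ATTRIBUTES = {"id", "source", "specversion", "type"}

def make_cloud_event(event_id: str, source: str, event_type: str, data: dict) -> str:
    """Hypothetical helper: build a minimal CloudEvents 1.0 JSON envelope."""
    event = {
        "id": event_id,
        "type": event_type,
        "source": source,
        "specversion": "1.0",
        "data": data,
    }
    missing = REQUIRED_ATTRIBUTES - set(event)
    if missing:
        raise ValueError(f"missing required attributes: {missing}")
    return json.dumps(event)

message = make_cloud_event(
    "hello-world-id-1234",
    "hello-world-source",
    "hello-world-type",
    {"name": "my-name", "age": "20"},  # must match the BigQuery table schema
)
print(message)
```

A string built this way is the kind of value you could pass to the --json-message flag shown in Example 2.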

The following are examples of directly publishing an event to an Eventarc Advanced bus:

Example 1

You can publish an event to a bus using the gcloud CLI with the --event-data flag and other event attribute flags:

gcloud eventarc message-buses publish BUS_NAME \
    --event-data='{"name": "my-name", "age": "20"}' \
    --event-id=hello-world-id-1234 \
    --event-source=hello-world-source \
    --event-type=hello-world-type \
    --event-attributes="datacontenttype=application/json" \
    --location=$REGION

Example 2

You can publish an event to a bus as a JSON message using the gcloud CLI and the --json-message flag:

gcloud eventarc message-buses publish BUS_NAME \
    --location=$REGION \
    --json-message='{"id": "hello-world-id-1234", "type": "hello-world-type", "source": "hello-world-source", "specversion": "1.0", "data": {"name": "my-name", "age": "20"}}'

After publishing an event, you should receive an "Event published successfully" message.

View the event data in the BigQuery table

After publishing an event to your Eventarc Advanced bus, you can use the bq query command to confirm that a new row was added to your BigQuery table.

bq query \
    --use_legacy_sql=false \
    'SELECT
      *
    FROM
      `PROJECT_ID.DATASET_ID.TABLE_ID`
    LIMIT
      10;'

You have successfully created an Eventarc Advanced bus and enrollment, published an event message to the bus, and verified the expected outcome by querying the BigQuery table.

Clean up

When you finish the tasks that are described in this quickstart, you can avoid continued billing by deleting the resources that you created:

  1. Delete a BigQuery table.

  2. Delete a BigQuery dataset.

  3. Delete Eventarc Advanced resources:

    1. Delete an enrollment.

    2. Delete a pipeline.

    3. Delete a bus.

Alternatively, you can delete your Google Cloud project to avoid incurring charges. Deleting your Google Cloud project stops billing for all the resources used within that project.

Delete a Google Cloud project:

gcloud projects delete PROJECT_ID

What's next