Publish events to a BigQuery table
This quickstart shows you how to publish and receive event messages by creating an Eventarc Advanced bus and enrollment in your Google Cloud project.
A bus lets you centralize the flow of messages through your system, and acts as a router. It receives event messages from a message source or published by a provider, and evaluates them according to an enrollment.
An enrollment identifies a subscription to a particular bus, and defines the matching criteria for messages, causing them to be routed accordingly to one or more destinations.
In this quickstart, you:
Create a BigQuery table.
Create an Eventarc Advanced bus.
Create an Eventarc Advanced enrollment.
Publish an event message to the bus.
View the event data in the BigQuery table.
You can complete this quickstart using the gcloud CLI and the bq command-line tool.
Before you begin
Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- Install the Google Cloud CLI.
- If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.
- To initialize the gcloud CLI, run the following command:

  gcloud init

- Create or select a Google Cloud project.
  - Create a Google Cloud project:

    gcloud projects create PROJECT_ID

    Replace PROJECT_ID with a name for the Google Cloud project you are creating.
  - Select the Google Cloud project that you created:

    gcloud config set project PROJECT_ID

    Replace PROJECT_ID with your Google Cloud project name.
- Verify that billing is enabled for your Google Cloud project.
- Enable the BigQuery and Eventarc APIs:

  gcloud services enable bigquery.googleapis.com eventarc.googleapis.com eventarcpublishing.googleapis.com
- Update gcloud components:

  gcloud components update
- Sign in using your account:

  gcloud auth login

- Set the configuration variable used in this quickstart:

  REGION=REGION

  Replace REGION with a supported location for the bus—for example, us-central1.
- If you are the project creator, you are granted the basic Owner role (roles/owner). By default, this Identity and Access Management (IAM) role includes the permissions necessary for full access to most Google Cloud resources, and you can skip this step.

  If you are not the project creator, required permissions must be granted on the project to the appropriate principal. For example, a principal can be a Google Account (for end users) or a service account (for applications and compute workloads).
Required permissions

To get the permissions that you need to complete this quickstart, ask your administrator to grant you the following IAM roles on your project:

- BigQuery Data Editor (roles/bigquery.dataEditor)
- Eventarc Developer (roles/eventarc.developer)
- Eventarc Message Bus Admin (roles/eventarc.messageBusAdmin)
- Logs View Accessor (roles/logging.viewAccessor)
- Project IAM Admin (roles/resourcemanager.projectIamAdmin)
- Service Account Admin (roles/iam.serviceAccountAdmin)
- Service Account User (roles/iam.serviceAccountUser)
- Service Usage Admin (roles/serviceusage.serviceUsageAdmin)

For more information about granting roles, see Manage access to projects, folders, and organizations. You might also be able to get the required permissions through custom roles or other predefined roles.
- To give Eventarc Advanced the necessary permissions to update BigQuery table properties, ask your administrator to grant the BigQuery Data Editor (roles/bigquery.dataEditor) IAM role on your Google Cloud project to a service account:
  - Create a service account. For testing purposes, you will attach this service account to an Eventarc Advanced pipeline to represent the identity of the pipeline.

    gcloud iam service-accounts create SERVICE_ACCOUNT_NAME

    Replace SERVICE_ACCOUNT_NAME with a name for your service account.
  - Grant the roles/bigquery.dataEditor IAM role to the service account:

    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com" \
        --role=roles/bigquery.dataEditor
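Optionally, you can confirm that the role was granted by inspecting the project's IAM policy bindings for the service account. This check isn't required; it uses standard gcloud filtering flags:

  gcloud projects get-iam-policy PROJECT_ID \
      --flatten="bindings[].members" \
      --filter="bindings.members:serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com" \
      --format="table(bindings.role)"

The output should include roles/bigquery.dataEditor.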
Create a BigQuery table
Create a BigQuery table as your event destination. Other event destinations, such as a Pub/Sub topic, Workflows, or another HTTP endpoint, are also supported. For more information, see Event providers and destinations.

Before creating a BigQuery table, create a dataset, which acts as a top-level container for the table, and a table schema.

- To create a new dataset, use the bq mk command with the --dataset flag:

  bq --location=$REGION mk --dataset DATASET_ID

  Replace DATASET_ID with a unique name for the BigQuery dataset—for example, my_dataset.
- In your terminal, create a new file called my-schema.json.
- Copy and paste the following schema into the new file, and then save the file:

  [
    {
      "name": "name",
      "type": "STRING",
      "mode": "REQUIRED"
    },
    {
      "name": "age",
      "type": "INTEGER",
      "mode": "NULLABLE"
    }
  ]

- To create a table, use the bq mk command with the --table flag:

  bq mk --table PROJECT_ID:DATASET_ID.TABLE_ID my-schema.json

  Replace TABLE_ID with a unique name for the BigQuery table—for example, my-table.
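Optionally, you can confirm that the table was created with the expected schema; for example:

  bq show --schema --format=prettyjson PROJECT_ID:DATASET_ID.TABLE_ID

The output should match the contents of my-schema.json.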
Create an Eventarc Advanced bus
A bus receives event messages from a message source or published by a provider and acts as a message router.
For more information, see Create a bus to route messages.
Create an Eventarc Advanced bus in your project by using the gcloud eventarc message-buses create command:

  gcloud eventarc message-buses create BUS_NAME \
      --location=$REGION

Replace BUS_NAME with the ID of your bus or a fully qualified name—for example, my-bus.
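Optionally, to confirm that the bus was created, you can describe it; for example:

  gcloud eventarc message-buses describe BUS_NAME \
      --location=$REGION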
Create an Eventarc Advanced enrollment
An enrollment determines which messages are routed to a destination, and specifies the pipeline that is used to configure a destination for the event messages. In this case, the target destination is a BigQuery API endpoint.
For more information, see Create an enrollment to receive events.
When using the gcloud CLI, you first create a pipeline, and then create an enrollment:

- Create a pipeline by using the gcloud eventarc pipelines create command:

  gcloud eventarc pipelines create PIPELINE_NAME \
      --destinations=http_endpoint_uri='https://bigquery.googleapis.com/bigquery/v2/projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID/insertAll',http_endpoint_message_binding_template='{"headers": headers.merge({"content-type":"application/json"}), "body": {"rows":[{"json":message.data}]}}',oauth_token_authentication_service_account=SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com \
      --input-payload-format-json= \
      --location=$REGION

  Replace PIPELINE_NAME with the ID of the pipeline or a fully qualified name—for example, my-pipeline.

  Note the following:

  - The http_endpoint_message_binding_template key transforms the event into the format expected by the API; an example of the resulting request body follows these steps. When defining a message binding, you must configure an input format to access the payload.
  - The oauth_token_authentication_service_account key specifies a service account email. This email is used to generate an OAuth token, which should generally be used only when calling Google APIs hosted on *.googleapis.com.
  - The --input-payload-format-json flag specifies that the pipeline's input payload format is JSON; any messages not matching this format are treated as persistent errors.
- Create an enrollment by using the gcloud eventarc enrollments create command:

  gcloud eventarc enrollments create ENROLLMENT_NAME \
      --cel-match=MATCH_EXPRESSION \
      --destination-pipeline=PIPELINE_NAME \
      --message-bus=BUS_NAME \
      --message-bus-project=PROJECT_ID \
      --location=$REGION

  Replace the following:

  - ENROLLMENT_NAME: the ID of the enrollment or a fully qualified name—for example, my-enrollment.
  - MATCH_EXPRESSION: the matching expression for this enrollment using CEL—for example, "message.type == 'hello-world-type'".
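For reference, given the example event data published later in this quickstart ({"name": "my-name", "age": "20"}), the message binding defined for the pipeline should produce a request body similar to the following for the BigQuery insertAll call:

  {
    "rows": [
      {
        "json": {
          "name": "my-name",
          "age": "20"
        }
      }
    ]
  }

Each rows[].json object must conform to the schema of the destination table.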
Publish an event message to the bus
To directly publish a message to your bus, you can use the
gcloud eventarc message-buses publish
command or send a request to the
Eventarc Publishing REST API.
For more information, see
Publish events directly.
The message must be in the CloudEvents format, which is a specification for
describing event data in a common way. The data element is the payload of your
event, and it must ultimately match the schema of your BigQuery
table. Any well-formed JSON can go in this field. For more information about
CloudEvents context attributes, see
Event format.
The following are examples of directly publishing an event to an Eventarc Advanced bus:
Example 1
You can publish an event to a bus using the gcloud CLI with the --event-data
flag and other event attribute flags:
gcloud eventarc message-buses publish BUS_NAME \
--event-data='{"name": "my-name", "age": "20"}' \
--event-id=hello-world-id-1234 \
--event-source=hello-world-source \
--event-type=hello-world-type \
--event-attributes="datacontenttype=application/json" \
--location=$REGION
Example 2
You can publish an event to a bus as a JSON message using the gcloud CLI
with the --json-message flag:

gcloud eventarc message-buses publish BUS_NAME \
  --location=$REGION \
  --json-message='{"id": "hello-world-id-1234", "type": "hello-world-type", "source": "hello-world-source", "specversion": "1.0", "data": {"name": "my-name", "age": "20"}}'
After publishing an event, you should receive an "Event published successfully" message.
View the event data in the BigQuery table
After publishing an event to your Eventarc Advanced bus, you can use the bq query command to confirm that a new row was added to your BigQuery table.
bq query \
    --use_legacy_sql=false \
    'SELECT * FROM `PROJECT_ID.DATASET_ID.TABLE_ID` LIMIT 10;'
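If you published the example event from this quickstart, the result should include a row similar to the following (exact formatting can vary):

  +---------+-----+
  |  name   | age |
  +---------+-----+
  | my-name |  20 |
  +---------+-----+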
You have successfully created an Eventarc Advanced bus and enrollment, published an event message to the bus, and verified the expected outcome by querying the BigQuery table.
Clean up
When you finish the tasks that are described in this quickstart, you can avoid continued billing by deleting the resources that you created:
Delete Eventarc Advanced resources:
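For example, you can delete the enrollment, pipeline, and bus that you created, and remove the BigQuery dataset, with commands similar to the following (a sketch; adjust the names to match your setup):

  gcloud eventarc enrollments delete ENROLLMENT_NAME --location=$REGION
  gcloud eventarc pipelines delete PIPELINE_NAME --location=$REGION
  gcloud eventarc message-buses delete BUS_NAME --location=$REGION
  bq rm -r -f --dataset PROJECT_ID:DATASET_ID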
Alternatively, you can delete your Google Cloud project to avoid incurring charges. Deleting your Google Cloud project stops billing for all the resources used within that project.
Delete a Google Cloud project:
gcloud projects delete PROJECT_ID