Extend Cloud Run with event triggers using Cloud Run functions
With Cloud Run functions, you can deploy code that handles events triggered by changes in your Firestore database. This lets you add server-side functionality without running your own servers.
This guide describes how to create triggers for Cloud Run functions from Firestore events.
You can trigger your Cloud Run functions from events in a Firestore database. When triggered, your function reads and updates a Firestore database in response to these events through the Firestore APIs and client libraries.
The process of Firestore events triggering a Cloud Run function consists of the following steps:
The service waits for changes to a particular document.
When a change occurs, the service is triggered and performs its tasks.
The service receives a data object with a snapshot of the affected document. For `write` or `update` events, the data object contains snapshots representing the document state before and after the triggering event.
Before you begin
- Make sure you have set up a new project for Cloud Run as described in the setup page.
- Enable the Artifact Registry, Cloud Build, Cloud Run Admin API, Eventarc, Firestore, Cloud Logging, and Pub/Sub APIs:
Required roles
You or your administrator must grant the deployer account and the trigger identity the following IAM roles. Optionally, grant a role to the Pub/Sub service agent.
Required roles for the deployer account
To get the permissions that you need to trigger from Firestore events, ask your administrator to grant you the following IAM roles on your project:
- Cloud Build Editor (`roles/cloudbuild.builds.editor`)
- Cloud Run Admin (`roles/run.admin`)
- Datastore Owner (`roles/datastore.owner`)
- Eventarc Admin (`roles/eventarc.admin`)
- Logs View Accessor (`roles/logging.viewAccessor`)
- Project IAM Admin (`roles/resourcemanager.projectIamAdmin`)
- Service Account Admin (`roles/iam.serviceAccountAdmin`)
- Service Account User (`roles/iam.serviceAccountUser`)
- Service Usage Admin (`roles/serviceusage.serviceUsageAdmin`)
For more information about granting roles, see Manage access to projects, folders, and organizations.
You might also be able to get the required permissions through custom roles or other predefined roles.
Note that by default, Cloud Build permissions include permissions to upload and download Artifact Registry artifacts.
Required roles for the trigger identity
Make note of the Compute Engine default service account, as you will attach it to an Eventarc trigger to represent the identity of the trigger for testing purposes. This service account is created automatically after you enable or use a Google Cloud service that uses Compute Engine, and it has the following email format:
`PROJECT_NUMBER-compute@developer.gserviceaccount.com`
Replace `PROJECT_NUMBER` with your Google Cloud project number. You can find your project number on the Welcome page of the Google Cloud console or by running the following command:

```
gcloud projects describe PROJECT_ID --format='value(projectNumber)'
```
For production environments, we strongly recommend creating a new service account and granting it one or more IAM roles that contain the minimum permissions required and follow the principle of least privilege.
- By default, Cloud Run services are only callable by Project Owners, Project Editors, and Cloud Run Admins and Invokers. You can control access on a per-service basis; however, for testing purposes, grant the Cloud Run Invoker role (`roles/run.invoker`) on the Google Cloud project to the Compute Engine default service account. This grants the role on all Cloud Run services and jobs in the project.

```
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
    --role=roles/run.invoker
```
Note that if you create a trigger for an authenticated Cloud Run service without granting the Cloud Run Invoker role, the trigger is created successfully and is active. However, the trigger will not work as expected and a message similar to the following appears in the logs:
The request was not authenticated. Either allow unauthenticated invocations or set the proper Authorization header.
- Grant the Eventarc Event Receiver role (`roles/eventarc.eventReceiver`) on the project to the Compute Engine default service account so that the Eventarc trigger can receive events from event providers.

```
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
    --role=roles/eventarc.eventReceiver
```
Optional role for the Pub/Sub service agent
- If you enabled the Cloud Pub/Sub service agent on or before April 8, 2021, to support authenticated Pub/Sub push requests, grant the Service Account Token Creator role (`roles/iam.serviceAccountTokenCreator`) to the service agent. Otherwise, this role is granted by default:

```
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com \
    --role=roles/iam.serviceAccountTokenCreator
```
Set up your Firestore database
Before you deploy your service, you must create a Firestore database:
1. Go to the Firestore page.
2. Select Create a Firestore database.
3. In the Name your database field, enter a database ID, such as `firestore-db`.
4. In the Configuration options section, Firestore native is selected by default, along with the applicable security rules.
5. In Location type, select Region and choose the region where your database will reside. This choice is permanent.
6. Click Create database.
The Firestore data model consists of collections that contain documents. A document contains a set of key-value pairs.
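As a toy illustration of that data model, plain Python dictionaries can stand in for the real objects: collections contain documents, and each document is a set of key-value pairs.

```python
# A minimal in-memory stand-in for the Firestore data model.
database = {
    "users": {  # collection "users"
        "marie": {"username": "marie", "city": "Paris"},  # document "marie"
    }
}

# Reading a field mirrors addressing the document by path: users/marie
print(database["users"]["marie"]["username"])  # marie
```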
Write a Firestore-triggered function
To write a function that responds to Firestore events, prepare to specify the following during deployment:
- a trigger event type
- a trigger event filter to select the documents associated with the function
- the function code to run
Event types
Firestore supports `create`, `update`, `delete`, and `write` events. The `write` event encompasses all modifications to a document.
| Event type | Trigger |
|---|---|
| `google.cloud.firestore.document.v1.created` (default) | Triggered when a document is written to for the first time. |
| `google.cloud.firestore.document.v1.updated` | Triggered when a document already exists and has any value changed. |
| `google.cloud.firestore.document.v1.deleted` | Triggered when a document with data is deleted. |
| `google.cloud.firestore.document.v1.written` | Triggered when a document is created, updated, or deleted. |
| `google.cloud.firestore.document.v1.created.withAuthContext` | Same as `created`, but adds authentication information. |
| `google.cloud.firestore.document.v1.updated.withAuthContext` | Same as `updated`, but adds authentication information. |
| `google.cloud.firestore.document.v1.deleted.withAuthContext` | Same as `deleted`, but adds authentication information. |
| `google.cloud.firestore.document.v1.written.withAuthContext` | Same as `written`, but adds authentication information. |
Wildcards are written in triggers using curly braces, for example:

`projects/YOUR_PROJECT_ID/databases/(default)/documents/collection/{document_wildcard}`
Trigger event filters
To trigger your service, specify a document path to listen to. The document path must be in the same Google Cloud project as the service.
Here are a few examples of valid document paths:
- `users/marie`: Monitors a single document, `/users/marie`.
- `users/{username}`: Monitors all user documents. Wildcards are used to monitor all documents in the collection.
- `users/{username}/addresses/home`: Monitors the home address document for all users.
- `users/{username}/addresses/{addressId}`: Monitors all address documents.
- `users/{user=**}`: Monitors all user documents and any documents in subcollections under each user document, such as `/users/userID/address/home` or `/users/userID/phone/work`.
- `users/{username}/addresses`: Invalid document path. Refers to the subcollection `addresses`, not a document.
Wildcards and parameters
If you don't know the specific document you want to monitor, use a `{wildcard}` instead of the document ID:

- `users/{username}` listens for changes to all user documents.

In this example, when any field on any document in `users` is changed, it matches a wildcard called `{username}`.

If a document in `users` has subcollections, and a field in one of those subcollections' documents is changed, the `{username}` wildcard is not triggered. If your goal is to respond to events in subcollections also, use the multi-segment wildcard `{username=**}`.

Wildcard matches are extracted from document paths. You can define as many wildcards as you like to substitute explicit collection or document IDs. You can use up to one multi-segment wildcard, like `{username=**}`.
Function code
See examples for how to use Firestore in Native Mode events to trigger a Cloud Run function.
Include the proto dependencies in your source
You must include the Cloud Run `data.proto` file in the source directory for your function. This file imports other protos that you must also include in your source directory.

Use the same directory structure for the dependencies. For example, place `struct.proto` within `google/protobuf`.
These files are required to decode event data. If your function source does not include these files, it returns an error when it runs.
Event attributes
Each event includes data attributes with information about the event, such as the time the event triggered. Cloud Run adds additional data about the database and document involved in the event. You can access these attributes as follows:
Java
```java
logger.info("Function triggered by event on: " + event.getSource());
logger.info("Event type: " + event.getType());
logger.info("Event time " + event.getTime());
logger.info("Event project: " + event.getExtension("project"));
logger.info("Event location: " + event.getExtension("location"));
logger.info("Database name: " + event.getExtension("database"));
logger.info("Database document: " + event.getExtension("document"));
// For withAuthContext events
logger.info("Auth information: " + event.getExtension("authid"));
logger.info("Auth information: " + event.getExtension("authtype"));
```
Node.js
```javascript
console.log(`Function triggered by event on: ${cloudEvent.source}`);
console.log(`Event type: ${cloudEvent.type}`);
console.log(`Event time: ${cloudEvent.time}`);
console.log(`Event project: ${cloudEvent.project}`);
console.log(`Event location: ${cloudEvent.location}`);
console.log(`Database name: ${cloudEvent.database}`);
console.log(`Document name: ${cloudEvent.document}`);
// For withAuthContext events
console.log(`Auth information: ${cloudEvent.authid}`);
console.log(`Auth information: ${cloudEvent.authtype}`);
```
Python
```python
print(f"Function triggered by change to: {cloud_event['source']}")
print(f"Event type: {cloud_event['type']}")
print(f"Event time: {cloud_event['time']}")
print(f"Event project: {cloud_event['project']}")
print(f"Location: {cloud_event['location']}")
print(f"Database name: {cloud_event['database']}")
print(f"Document: {cloud_event['document']}")
# For withAuthContext events
print(f"Auth information: {cloud_event['authid']}")
print(f"Auth information: {cloud_event['authtype']}")
```
Event structures
This trigger invokes your service with an event similar to:
```
{
    "oldValue": { // Update and Delete operations only
        A Document object containing a pre-operation document snapshot
    },
    "updateMask": { // Update operations only
        A DocumentMask object that lists changed fields.
    },
    "value": {
        // A Document object containing a post-operation document snapshot
    }
}
```
Each `Document` object contains one or more `Value` objects. See the `Value` documentation for type references.
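As a sketch of working with this structure, the helper below unwraps a string field from a `Document` payload, where each field is encoded as a typed `Value` object (for example, `{"stringValue": "minka"}`). The payload dict here is illustrative, modeled on the event structure shown above:

```python
def get_string_field(document, field):
    """Extract a string field from a Firestore Document payload.

    Each field is a typed Value object; this unwraps the string case
    and returns None if the field is absent or not a string.
    """
    value = document.get("fields", {}).get(field, {})
    return value.get("stringValue")

# Illustrative post-operation snapshot, as in the "value" key above:
event_data = {
    "value": {
        "name": "projects/my-project/databases/(default)/documents/messages/abc",
        "fields": {"original": {"stringValue": "minka"}},
    }
}
print(get_string_field(event_data["value"], "original"))  # minka
```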
Create triggers for functions
Click the tab for instructions using the tool of your choice.
Console
When you use the Google Cloud console to create a function, you can also add a trigger to your function. Follow these steps to create a trigger for your function:
In the Google Cloud console, go to Cloud Run:
Click Write a function, and enter the function details. For more information about configuring functions during deployment, see Deploy functions.
In the Trigger section, click Add trigger.
Select Firestore trigger.
In the Eventarc trigger pane, modify the trigger details as follows:
Enter a name for the trigger in the Trigger name field, or use the default name.
Select a Trigger type from the list:
Google Sources to specify triggers for Pub/Sub, Cloud Storage, Firestore, and other Google event providers.
Third-party to integrate with non-Google providers that offer an Eventarc source. For more information, see Third-party events in Eventarc.
Select Firestore from the Event provider list, to select a product that provides the type of event for triggering your function. For the list of event providers, see Event providers and destinations.
Select `google.cloud.firestore.document.v1.created` from the Event type list. Your trigger configuration varies depending on the supported event type. For more information, see Event types.
In the Filters section, select a database, operation and attribute values, or use the default selections.
If the Region field is enabled, select a location for the Eventarc trigger. In general, the location of an Eventarc trigger should match the location of the Google Cloud resource that you want to monitor for events. In most scenarios, you should also deploy your function in the same region. See Understand Eventarc locations for more details about Eventarc trigger locations.
In the Service account field, select a service account. Eventarc triggers are linked to service accounts to use as an identity when invoking your function. Your Eventarc trigger's service account must have the permission to invoke your function. By default, Cloud Run uses the Compute Engine default service account.
Optionally, specify the Service URL path to send the incoming request to. This is the relative path on the destination service to which the events for the trigger should be sent. For example: `/`, `/route`, `route`, and `route/subroute`.
Once you've completed the required fields, click Save trigger.
gcloud
When you create a function using the gcloud CLI, you must first deploy your function, and then create a trigger. Follow these steps to create a trigger for your function:
Run the following command in the directory that contains the sample code to deploy your function:
```
gcloud run deploy FUNCTION \
    --source . \
    --function FUNCTION_ENTRYPOINT \
    --base-image BASE_IMAGE_ID \
    --region REGION
```
Replace:
FUNCTION with the name of the function you are deploying. If you omit this parameter, you are prompted for a name.
FUNCTION_ENTRYPOINT with the entry point to your function in your source code. This is the code Cloud Run executes when your function runs. The value of this flag must be a function name or fully-qualified class name that exists in your source code.
BASE_IMAGE_ID with the base image environment for your function. For more details about base images and the packages included in each image, see Runtimes base images.
REGION with the Google Cloud region where you want to deploy your function. For example, `europe-west1`.
Run the following command to create a trigger that filters events:
```
gcloud eventarc triggers create TRIGGER_NAME \
    --location=EVENTARC_TRIGGER_LOCATION \
    --destination-run-service=FUNCTION \
    --destination-run-region=REGION \
    --event-filters="type=google.cloud.firestore.document.v1.created" \
    --service-account=PROJECT_NUMBER-compute@developer.gserviceaccount.com
```
Replace:
TRIGGER_NAME with the name for your trigger.
EVENTARC_TRIGGER_LOCATION with the location for the Eventarc trigger. In general, the location of an Eventarc trigger should match the location of the Google Cloud resource that you want to monitor for events. In most scenarios, you should also deploy your function in the same region. For more information, see Eventarc locations.
FUNCTION with the name of the function you are deploying.
REGION with the Cloud Run region of the function.
PROJECT_NUMBER with your Google Cloud project number. Eventarc triggers are linked to service accounts to use as an identity when invoking your function. Your Eventarc trigger's service account must have the permission to invoke your function. By default, Cloud Run uses the Compute Engine default service account.
Each `--event-filters` flag specifies a type of event; the function triggers only when an event meets all of the criteria specified in its `--event-filters` flags. Each trigger must have an `--event-filters` flag specifying a supported event type, such as a new document written to Firestore or a file uploaded to Cloud Storage. You can't change the event filter type after creation; to change it, you must create a new trigger and delete the old one. Optionally, you can repeat the `--event-filters` flag with a supported filter in the form `ATTRIBUTE=VALUE` to add more filters.
Terraform
To create an Eventarc trigger for a Cloud Run function, see Create a trigger using Terraform.
Examples
The following examples describe how to use Firestore in Native Mode events to trigger a Cloud Run function.
Example 1: Hello Firestore function
The following sample prints the fields of a triggering Firestore event:
Node.js
Python
Go
Java
C#
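The per-language samples are not reproduced here; as a rough Python sketch of what the `Hello Firestore` function does, the handler below logs the pre- and post-operation snapshots from the event payload. The payload dict is hypothetical, shaped like the event structure shown earlier; real events arrive protobuf-encoded and must be decoded with the `data.proto` dependencies described above.

```python
def hello_firestore(event_data):
    """Print the snapshots carried by a triggering Firestore event."""
    old_value = event_data.get("oldValue", {})  # empty on create events
    value = event_data.get("value", {})
    print(f"Old value: {old_value}")
    print(f"New value: {value}")
    return old_value, value

# Illustrative payload for a document created in the "users" collection:
old, new = hello_firestore(
    {"value": {"fields": {"username": {"stringValue": "rowan"}}}}
)
```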
Deploy the function
To deploy the `Hello Firestore` function:
If you haven't already done so, set up your Firestore database.
To deploy the function, see Create triggers for functions.
Test the function
To test the `Hello Firestore` function, set up a collection called `users` in your Firestore database:
1. In the Google Cloud console, go to the Firestore databases page.
2. Click Start a collection.
3. Specify `users` as the collection ID.
4. To start adding the collection's first document, under Add its first document, accept the auto-generated Document ID.
5. Add at least one field for the document, specifying a name and value. For example, in Field name, enter `username`, and in Field value, enter `rowan`.
6. When you're done, click Save.
This action creates a new document, thereby triggering your function.
To confirm that your function was triggered, click the linked name of the function in the Google Cloud console Cloud Run Overview page to open the Service details page.
Select the Logs tab and look for this string:
```
Function triggered by change to: //firestore.googleapis.com/projects/your-project-id/databases/(default)
```
Example 2: Convert to Uppercase function
The following example retrieves the value added by the user, converts the string at that location to uppercase, and replaces the value with the uppercase string:
Node.js
Use protobufjs to decode the event data. Include the `google.events.cloud.firestore.v1` `data.proto` in your source.
Python
Go
Java
C#
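The per-language samples are not reproduced here; as a rough Python sketch of the core transformation (hypothetical payload shape, matching the event structure described earlier), the function reads the `original` field, and skips writing if the value is already uppercase, since writing back to the same document would re-trigger the function and loop forever:

```python
def to_uppercase(event_data):
    """Return the uppercase replacement for the 'original' field,
    or None if it is already uppercase (avoids an infinite loop of
    write -> trigger -> write)."""
    current = event_data["value"]["fields"]["original"]["stringValue"]
    if current.isupper():
        return None  # nothing to do; a write here would re-trigger us
    return current.upper()

event_data = {"value": {"fields": {"original": {"stringValue": "minka"}}}}
print(to_uppercase(event_data))  # MINKA
```

In the real samples, the returned value would be written back to the triggering document through the Firestore client library.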
Deploy the function
To deploy the `Convert to Uppercase` function:
If you haven't already done so, set up your Firestore database.
To deploy the function, see Create triggers for functions.
Test the function
To test the `Convert to Uppercase` function you just deployed, set up a collection called `messages` in your Firestore database:
1. In the Google Cloud console, go to the Firestore databases page.
2. Click Start a collection.
3. Specify `messages` as the collection ID.
4. To start adding the collection's first document, under Add its first document, accept the auto-generated Document ID.
5. To trigger your deployed function, add a document where the Field name is `original` and the Field value is `minka`.
6. When you save the document, you can see the lowercase word in the value field convert to uppercase.
If you subsequently edit the field value to contain lowercase letters, that triggers the function again, converting all lowercase letters to uppercase.
Limitations for functions
- Ordering is not guaranteed. Rapid changes can trigger function invocations in an unexpected order.
- Events are delivered at least once, but a single event might result in multiple function invocations. Avoid depending on exactly-once mechanics, and write idempotent functions.
- A trigger is associated with a single database. You can't create a trigger that matches multiple databases.
- Deleting a database doesn't automatically delete any triggers for that database. The trigger stops delivering events but continues to exist until you delete the trigger.
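Because delivery is at-least-once, handlers should tolerate duplicate events. A minimal sketch of deduplication by event ID follows; the in-memory set is only illustrative (and resets on each cold start), so a production function would record processed IDs in a durable store such as Firestore itself:

```python
processed_ids = set()

def handle_event(event_id, payload):
    """Process an event at most once per delivery of the same ID.

    Returns True if the event was processed, False if it was a
    duplicate delivery that we skipped.
    """
    if event_id in processed_ids:
        return False  # already handled; at-least-once delivery duplicate
    processed_ids.add(event_id)
    # ... do the real (idempotent) work with `payload` here ...
    return True

print(handle_event("evt-1", {"doc": "users/marie"}))  # True
print(handle_event("evt-1", {"doc": "users/marie"}))  # False
```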