Trigger functions from Cloud Storage using Eventarc


This tutorial shows you how to deploy an event-driven function in Cloud Run, and use Eventarc to trigger the function in response to Cloud Storage events using the Google Cloud CLI.

By specifying filters for an Eventarc trigger, you can configure the routing of events, including the event source and the event target. For the example in this tutorial, an update to a Cloud Storage bucket triggers the event, and a request is sent to your function in the form of an HTTP request.

Objectives

In this tutorial, you will:

  1. Create a Cloud Storage bucket to be the event source.
  2. Deploy an event-driven function.
  3. Create an Eventarc trigger.
  4. Trigger the function by uploading a file to Cloud Storage.

Costs

In this document, you use billable components of Google Cloud, including Cloud Run, Cloud Storage, and Eventarc.

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

Before you begin

Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. Install the Google Cloud CLI.
  3. To initialize the gcloud CLI, run the following command:

    gcloud init
  4. Create or select a Google Cloud project.

    • Create a Google Cloud project:

      gcloud projects create PROJECT_ID

      Replace PROJECT_ID with a name for the Google Cloud project you are creating.

    • Select the Google Cloud project that you created:

      gcloud config set project PROJECT_ID

      Replace PROJECT_ID with your Google Cloud project name.

  5. Make sure that billing is enabled for your Google Cloud project.

  6. If you are not using Cloud Shell, update the Google Cloud CLI components and log in using your account:
    gcloud components update
    gcloud auth login
  7. Enable the APIs (an optional check to confirm they're enabled is shown after this list):
    gcloud services enable artifactregistry.googleapis.com \
        cloudbuild.googleapis.com \
        eventarc.googleapis.com \
        run.googleapis.com \
        storage.googleapis.com
  8. Set the configuration variables to use in this tutorial:
    export REGION=us-central1
    gcloud config set run/region ${REGION}
    gcloud config set run/platform managed
    gcloud config set eventarc/location ${REGION}
  9. If your project is subject to a domain restriction organization policy that restricts unauthenticated invocations, you will need to access your deployed service as described in Testing private services.
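
Optionally, to confirm that the APIs from the earlier step are enabled, you can list the enabled services for your project and filter for them. This is only a quick spot-check, not part of the required setup:

    gcloud services list --enabled | grep -E 'artifactregistry|cloudbuild|eventarc|run\.googleapis|storage'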

Set required roles

  1. If you are the project creator, you are granted the basic Owner role (roles/owner). By default, this Identity and Access Management (IAM) role includes the permissions necessary for full access to most Google Cloud resources and you can skip this step.

    If you are not the project creator, required permissions must be granted on the project to the appropriate principal. For example, a principal can be a Google Account (for end users) or a service account (for applications and compute workloads). For more information, see the Roles and permissions page for your event destination.

    To get the permissions that you need to complete this tutorial, ask your administrator to grant you the required IAM roles on your project.

    For more information about granting roles, see Manage access to projects, folders, and organizations.

    You might also be able to get the required permissions through custom roles or other predefined roles.

  2. Make note of the Compute Engine default service account, as you will attach it to an Eventarc trigger to represent the identity of the trigger for testing purposes. This service account is created automatically after you enable or use a Google Cloud service that uses Compute Engine, and it has the following email format:

    PROJECT_NUMBER-compute@developer.gserviceaccount.com

    Replace PROJECT_NUMBER with your Google Cloud project number. You can find your project number on the Welcome page of the Google Cloud console or by running the following command:

    gcloud projects describe PROJECT_ID --format='value(projectNumber)'

    For production environments, we strongly recommend creating a new service account and granting it one or more IAM roles that contain the minimum permissions required, following the principle of least privilege.

  3. By default, Cloud Run services are only callable by Project Owners, Project Editors, and Cloud Run Admins and Invokers. You can control access on a per-service basis; however, for testing purposes, grant the Cloud Run Invoker role (run.invoker) on the Google Cloud project to the Compute Engine service account. This grants the role on all Cloud Run services and jobs in a project.
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
        --role=roles/run.invoker

    Note that if you create a trigger for an authenticated Cloud Run service without granting the Cloud Run Invoker role, the trigger is created successfully and is active. However, the trigger will not work as expected and a message similar to the following appears in the logs:

    The request was not authenticated. Either allow unauthenticated invocations or set the proper Authorization header.
  4. Grant the Eventarc Event Receiver role (roles/eventarc.eventReceiver) on the project to the Compute Engine default service account so that the Eventarc trigger can receive events from event providers.
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member=serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com \
        --role=roles/eventarc.eventReceiver
  5. Before creating a trigger for direct events from Cloud Storage, grant the Pub/Sub Publisher role (roles/pubsub.publisher) to the Cloud Storage service agent:

    SERVICE_ACCOUNT="$(gcloud storage service-agent --project=PROJECT_ID)"
    
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="serviceAccount:${SERVICE_ACCOUNT}" \
        --role='roles/pubsub.publisher'
  6. If you enabled the Cloud Pub/Sub service agent on or before April 8, 2021, to support authenticated Pub/Sub push requests, grant the Service Account Token Creator role (roles/iam.serviceAccountTokenCreator) to the service agent. Otherwise, this role is granted by default:
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member=serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com \
        --role=roles/iam.serviceAccountTokenCreator
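
After granting these roles, you can optionally confirm the bindings on the Compute Engine default service account. One way to do this (not part of the original steps) is to capture the project number in a shell variable and list the roles bound to that account:

    export PROJECT_NUMBER="$(gcloud projects describe PROJECT_ID --format='value(projectNumber)')"

    gcloud projects get-iam-policy PROJECT_ID \
        --flatten="bindings[].members" \
        --filter="bindings.members:${PROJECT_NUMBER}-compute@developer.gserviceaccount.com" \
        --format="table(bindings.role)"

The output should include roles/run.invoker and roles/eventarc.eventReceiver. Replace PROJECT_ID with your Google Cloud project ID.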

Create a Cloud Storage bucket

Create a Cloud Storage bucket to use as the event source:

gcloud storage buckets create -l us-central1 gs://PROJECT_ID-bucket/
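
To confirm that the bucket was created, you can optionally describe it:

gcloud storage buckets describe gs://PROJECT_ID-bucket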

Write an event-driven function

To write an event-driven function, follow these steps:

Node.js

  1. Create a new directory named helloGCS and change directory into it:

       mkdir helloGCS
       cd helloGCS
    

  2. Create a package.json file in the helloGCS directory to specify Node.js dependencies:

    {
      "name": "nodejs-docs-samples-functions-v2-storage",
      "version": "0.0.1",
      "private": true,
      "license": "Apache-2.0",
      "author": "Google LLC",
      "repository": {
        "type": "git",
        "url": "https://github.com/GoogleCloudPlatform/nodejs-docs-samples.git"
      },
      "engines": {
        "node": ">=16.0.0"
      },
      "scripts": {
        "test": "c8 mocha -p -j 2 test/*.test.js --timeout=60000"
      },
      "dependencies": {
        "@google-cloud/functions-framework": "^3.0.0"
      },
      "devDependencies": {
        "c8": "^10.0.0",
        "mocha": "^10.0.0",
        "sinon": "^18.0.0",
        "supertest": "^7.0.0"
      }
    }
    
  3. Create an index.js file in the helloGCS directory with the following Node.js sample:

    const functions = require('@google-cloud/functions-framework');
    
    // Register a CloudEvent callback with the Functions Framework that will
    // be triggered by Cloud Storage.
    functions.cloudEvent('helloGCS', cloudEvent => {
      console.log(`Event ID: ${cloudEvent.id}`);
      console.log(`Event Type: ${cloudEvent.type}`);
    
      const file = cloudEvent.data;
      console.log(`Bucket: ${file.bucket}`);
      console.log(`File: ${file.name}`);
      console.log(`Metageneration: ${file.metageneration}`);
      console.log(`Created: ${file.timeCreated}`);
      console.log(`Updated: ${file.updated}`);
    });
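
  4. Optionally, test the function locally before deploying. This is a minimal local smoke test, not part of the original tutorial: install the dependencies, start the Functions Framework, and send the function a hand-crafted CloudEvent over HTTP. The ce-* header values and the JSON body are illustrative placeholders, not a real Cloud Storage event.

       npm install
       npx @google-cloud/functions-framework --target=helloGCS --signature-type=cloudevent

    In a second terminal, send a test event in CloudEvents binary content mode:

       curl localhost:8080 -X POST \
           -H "Content-Type: application/json" \
           -H "ce-id: 1234567890" \
           -H "ce-specversion: 1.0" \
           -H "ce-type: google.cloud.storage.object.v1.finalized" \
           -H "ce-source: //storage.googleapis.com/projects/_/buckets/MY_BUCKET" \
           -d '{"bucket":"MY_BUCKET","name":"test.txt","metageneration":"1","timeCreated":"2024-01-01T00:00:00Z","updated":"2024-01-01T00:00:00Z"}'

    The function should log the placeholder file details in the terminal running the Functions Framework.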

Python

  1. Create a new directory named helloGCS and change directory into it:

       mkdir helloGCS
       cd helloGCS
    

  2. Create a requirements.txt file in the helloGCS directory to specify Python dependencies:

    functions-framework==3.5.0
    cloudevents==1.11.0

    This adds packages needed by the sample.

  3. Create a main.py file in the helloGCS directory with the following Python sample:

    from cloudevents.http import CloudEvent
    
    import functions_framework
    
    
    # Triggered by a change in a storage bucket
    @functions_framework.cloud_event
    def hello_gcs(cloud_event: CloudEvent) -> tuple:
        """This function is triggered by a change in a storage bucket.
    
        Args:
            cloud_event: The CloudEvent that triggered this function.
        Returns:
            The event ID, event type, bucket, name, metageneration, and timeCreated.
        """
        data = cloud_event.data
    
        event_id = cloud_event["id"]
        event_type = cloud_event["type"]
    
        bucket = data["bucket"]
        name = data["name"]
        metageneration = data["metageneration"]
        timeCreated = data["timeCreated"]
        updated = data["updated"]
    
        print(f"Event ID: {event_id}")
        print(f"Event type: {event_type}")
        print(f"Bucket: {bucket}")
        print(f"File: {name}")
        print(f"Metageneration: {metageneration}")
        print(f"Created: {timeCreated}")
        print(f"Updated: {updated}")
    
        return event_id, event_type, bucket, name, metageneration, timeCreated, updated
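
  4. Optionally, test the function locally before deploying, as with the Node.js sample. This is a minimal local smoke test, not part of the original tutorial: install the dependencies and start the Functions Framework, then send the function the same hand-crafted curl request shown in the Node.js section.

       pip install -r requirements.txt
       functions-framework --target=hello_gcs --signature-type=cloudevent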
    
    

Deploy an event-driven function

Deploy the function named helloworld-events by running the following command in the directory that contains the sample code:

Node.js

gcloud beta run deploy helloworld-events \
      --source . \
      --function helloGCS \
      --base-image nodejs20 \
      --region us-central1

Python

gcloud beta run deploy helloworld-events \
      --source . \
      --function hello_gcs \
      --base-image python312 \
      --region us-central1

When the deployment is complete, the Google Cloud CLI displays a URL where the helloworld-events service is running.
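
If you need the URL again later, you can retrieve it with the following command (this assumes the us-central1 region used in this tutorial):

gcloud run services describe helloworld-events \
    --region us-central1 \
    --format='value(status.url)'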

Create an Eventarc trigger

The Eventarc trigger sends events from the Cloud Storage bucket to your helloworld-events Cloud Run service.

  1. Create a trigger that filters Cloud Storage events:

    gcloud eventarc triggers create TRIGGER_NAME  \
        --location=${REGION} \
        --destination-run-service=helloworld-events  \
        --destination-run-region=${REGION} \
        --event-filters="type=google.cloud.storage.object.v1.finalized" \
        --event-filters="bucket=PROJECT_ID-bucket" \
        --service-account=PROJECT_NUMBER-compute@developer.gserviceaccount.com

    Replace:

    • TRIGGER_NAME with the name for your trigger.
    • PROJECT_ID with your Google Cloud project ID.
    • PROJECT_NUMBER with your Google Cloud project number.

    Note that when creating an Eventarc trigger for the first time in a Google Cloud project, there might be a delay in provisioning the Eventarc service agent. This issue can usually be resolved by attempting to create the trigger again. For more information, see Permission denied errors.

  2. Confirm that the trigger was successfully created. Note that although your trigger is created immediately, it can take up to two minutes for a trigger to be fully functional.

    gcloud eventarc triggers list --location=${REGION}

    The output should be similar to the following:

    NAME: helloworld-events
    TYPE: google.cloud.storage.object.v1.finalized
    DESTINATION: Cloud Run service: helloworld-events
    ACTIVE: Yes
    LOCATION: us-central1
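
    Optionally, for more detail about the trigger, including its event filters, destination, and service account, describe it (replace TRIGGER_NAME with the name of your trigger):

    gcloud eventarc triggers describe TRIGGER_NAME --location=${REGION}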
    

Generate and view an event

Upload a text file to the Cloud Storage bucket to generate an event which is routed to the function. The Cloud Run function logs the event in the service logs.

  1. Upload a text file to Cloud Storage to generate an event:

     echo "Hello World" > random.txt
     gcloud storage cp random.txt gs://PROJECT_ID-bucket/random.txt
    

    The upload generates an event and the Cloud Run function logs the event's message.

  2. To view the log entry:

    1. Filter the log entries and return the output in JSON format:

      gcloud logging read "resource.labels.service_name=helloworld-events AND textPayload:random.txt" --format=json
      
    2. Look for a log entry similar to:

      [
        {
         ....
          "resource": {
            "labels": {
              ....
              "location": "us-central1",
              .....
              "service_name": "helloworld-events"
            },
          },
          "textPayload": "File: random.txt",
           .....
        }
      ]
      

      Logs might take a few moments to appear. If you don't see them immediately, check again after a minute.
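
      If the filtered query doesn't return anything yet, you can also read the most recent log entries for the service without the text filter, for example:

      gcloud logging read "resource.type=cloud_run_revision AND resource.labels.service_name=helloworld-events" \
          --limit=20 \
          --freshness=10m \
          --format='value(textPayload)'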

The log entry confirms that you successfully deployed an event-driven function that was triggered when a text file was uploaded to Cloud Storage.

Clean up

If you created a new project for this tutorial, delete the project. If you used an existing project and want to keep it without the changes added in this tutorial, delete the resources created for the tutorial.

Delete the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete tutorial resources

  1. Delete the Cloud Run service you deployed in this tutorial:

    gcloud run services delete SERVICE_NAME

    Replace SERVICE_NAME with the name of your service (helloworld-events in this tutorial).

    You can also delete Cloud Run services from the Google Cloud console.

  2. Remove any gcloud CLI default configurations you added during the tutorial setup.

    For example:

    gcloud config unset run/region

    or

    gcloud config unset project

  3. Delete other Google Cloud resources created in this tutorial:

    • Delete the Eventarc trigger:
      gcloud eventarc triggers delete TRIGGER_NAME
      
      Replace TRIGGER_NAME with the name of your trigger.
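
    • Delete the Cloud Storage bucket created for this tutorial. Note that the following command removes the bucket and everything in it, including the random.txt file you uploaded:
      gcloud storage rm --recursive gs://PROJECT_ID-bucket/

      Replace PROJECT_ID with your Google Cloud project ID.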

What's next