This page lists known issues for Eventarc Standard.
You can also check for existing issues or open new issues in the
public issue trackers.
Newly created triggers can take up to two minutes to become operational.
If you update a trigger
before its generated event is delivered, the event is routed according to the previous filtering and delivered to the original
destination within three days of the event generation. The new filtering is applied to events
generated after your update.
There is known duplicate transmission of Cloud Audit Logs from some
Google Cloud event sources. When duplicate logs are published, duplicate
events are delivered to destinations. To avoid these duplicate events, you
should create triggers for fields that ensure the event is unique.
This applies to the following event types:
- Cloud Storage (serviceName: storage.googleapis.com), methodName: storage.buckets.list
- Compute Engine (serviceName: compute.googleapis.com), methodName: beta.compute.instances.insert
- BigQuery (serviceName: bigquery.googleapis.com)
Note that since Workflows handles event deduplication, you don't
have to ensure that the event is unique when you create a trigger for
Workflows.
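For other destinations, a handler can guard against duplicate deliveries itself by keying on the event's unique CloudEvents id attribute. The following is a minimal sketch assuming an in-memory store; a production handler would use something durable (for example Firestore or Redis), and the function and field names here are illustrative:

```python
# Sketch: deduplicate delivered events by their CloudEvents "id" attribute.
# The in-memory set below is only for illustration; real handlers need a
# durable store so duplicates are caught across process restarts.

processed_ids = set()

def handle_event(event: dict) -> bool:
    """Process an event once; return True if handled, False if a duplicate."""
    event_id = event["id"]  # every CloudEvent carries a unique id attribute
    if event_id in processed_ids:
        return False  # duplicate delivery: skip side effects
    processed_ids.add(event_id)
    # ... perform the actual work here ...
    return True
```

With this guard in place, a redelivered event with the same id is a no-op rather than a repeated side effect.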
Cross-project triggers are not yet supported. The service that receives
the events for the trigger must be in the same Google Cloud project
as the trigger. If requests to your service are triggered by messages published
to a Pub/Sub topic, the topic must also be in the same project as the
trigger. See
Route events across Google Cloud projects.
Regardless of where the virtual machine instance is actually located,
Cloud Audit Logs triggers for Compute Engine
result in events that originate from a single region: us-central1. When
creating your trigger,
ensure that the trigger location is set to either us-central1 or global.
Direct Pub/Sub events don't include a delivery_attempt field unless the
event destination is Cloud Run or Cloud Run functions. This might impact
your handling of message failures.
For some event providers, you can choose to encode the event payload as
application/json or application/protobuf. However,
an event payload formatted in JSON is larger than one formatted in Protobuf,
and this might impact reliability depending on your event destination and its
limits on event size. When this limit is reached, the event is retried
according to the retry characteristics of Eventarc's transport
layer, Pub/Sub.
Learn how to
handle Pub/Sub message failures
if the maximum number of retries is reached.
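One common way to cooperate with Pub/Sub's retry behavior is to signal success or failure through the HTTP status your push endpoint returns: a 2xx acks the message, anything else makes Pub/Sub redeliver with backoff, and after the subscription's configured maximum delivery attempts the message goes to its dead-letter topic, if one is configured. A minimal sketch, with stand-in names for the failure type and the processing step:

```python
class TransientError(Exception):
    """Stand-in for a retryable failure (e.g. a downstream timeout)."""

def process(message: dict) -> None:
    # Illustrative work; raises TransientError to request a retry.
    if message.get("attributes", {}).get("fail") == "yes":
        raise TransientError

def handle_push(body: dict) -> int:
    """Return the HTTP status for a Pub/Sub push delivery.

    2xx acks the message; a non-2xx status makes Pub/Sub retry, and after
    the subscription's max delivery attempts the message is routed to its
    dead-letter topic (if configured).
    """
    try:
        process(body["message"])
        return 204  # ack: Pub/Sub stops redelivering
    except TransientError:
        return 503  # nack: Pub/Sub retries with backoff
```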
When using Workflows as a destination for an
Eventarc trigger, events larger than the maximum
Workflows arguments size fail to trigger workflow
executions. For more information, see
Quotas and limits.
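One way to work around this limit is to check the encoded event size before invoking the workflow and, for oversized events, pass a lightweight reference instead of the full payload. The limit value and the store_payload helper below are illustrative assumptions, not the documented quota:

```python
import json

# Illustrative value only: look up the actual Workflows arguments limit in
# "Quotas and limits" for your environment.
WORKFLOWS_ARGS_LIMIT = 512 * 1024

def store_payload(data: bytes) -> str:
    """Stand-in for durable storage (e.g. writing the payload to
    Cloud Storage); returns a retrievable reference."""
    return f"gs://example-bucket/events/{hash(data)}"

def prepare_workflow_argument(event: dict) -> dict:
    """Return the event itself if it fits, otherwise a small reference."""
    encoded = json.dumps(event).encode("utf-8")
    if len(encoded) <= WORKFLOWS_ARGS_LIMIT:
        return event
    # Oversized: hand the workflow a pointer instead of the full payload.
    return {"payload_ref": store_payload(encoded)}
```

The workflow then fetches the full payload from storage only when it receives a payload_ref instead of the event body.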
The maximum nested depth limit on each structured log entry for triggers
that use Cloud Audit Logs is 64 levels. Log events that exceed this
limit are dropped and not delivered by Eventarc.
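If you control the code that writes these log entries, you can check their nesting depth before logging so that entries are never silently dropped. A minimal sketch of such a check:

```python
def nested_depth(value) -> int:
    """Maximum nesting depth of a JSON-like structure (dicts and lists)."""
    if isinstance(value, dict):
        return 1 + max((nested_depth(v) for v in value.values()), default=0)
    if isinstance(value, list):
        return 1 + max((nested_depth(v) for v in value), default=0)
    return 0  # scalars add no depth

MAX_AUDIT_LOG_DEPTH = 64  # entries deeper than this are dropped by Eventarc

def will_be_dropped(entry: dict) -> bool:
    return nested_depth(entry) > MAX_AUDIT_LOG_DEPTH
```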
When creating an Eventarc trigger for the first time in a
Google Cloud project, there might be a delay in provisioning the
Eventarc service agent. This issue can usually be resolved by
attempting to create the trigger again. For more information, see
Permission denied errors.
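Because the fix is simply to try again once the service agent has been provisioned, automation that creates triggers can wrap the call in a short retry loop. A sketch, where create_fn stands in for your actual creation call (for example via the Eventarc client library or gcloud) and PermissionError models the permission-denied response:

```python
import time

def create_trigger_with_retry(create_fn, attempts: int = 3, base_delay: float = 2.0):
    """Retry trigger creation to ride out service-agent provisioning delay.

    create_fn is a stand-in for the real creation call; PermissionError
    models the "permission denied" response seen while the Eventarc
    service agent is still being provisioned.
    """
    for attempt in range(attempts):
        try:
            return create_fn()
        except PermissionError:
            if attempt == attempts - 1:
                raise  # still failing after all attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))  # simple exponential backoff
```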
Last updated 2025-08-29 UTC.