The following table offers some guidance in choosing the appropriate delivery
mechanism for your application:
Features supported by Pub/Sub subscriptions

| Feature | Pull subscription | Push subscription | Export subscription |
|---|---|---|---|
| Use case | Large volume of messages (GBs per second). Efficiency and throughput of message processing are critical. Environments where a public HTTPS endpoint with a non-self-signed SSL certificate isn't feasible to set up. | Multiple topics that must be processed by the same webhook. App Engine standard environment and Cloud Run functions subscribers. Environments where Google Cloud dependencies (such as credentials and the client library) aren't feasible to set up. | Large volume of messages that can scale up to multiple millions of messages per second. Messages are sent directly to a Google Cloud resource without any additional processing. |
| Endpoints | Any device on the internet that has authorized credentials can call the Pub/Sub API. | An HTTPS server with a non-self-signed certificate that is accessible on the public web. The receiving endpoint can be decoupled from the Pub/Sub subscription, so that messages from multiple subscriptions are sent to a single endpoint. | A BigQuery dataset and table for a BigQuery subscription. A Cloud Storage bucket for a Cloud Storage subscription. |
| Load balancing | Multiple subscribers can make pull calls to the same "shared" subscription, and each subscriber receives a subset of the messages. | Push endpoints can be load balancers. | The Pub/Sub service automatically balances the load. |
| Configuration | No configuration is necessary. | No configuration is necessary for App Engine apps in the same project as the subscriber. Verification of push endpoints isn't required in the Google Cloud console. Endpoints must be reachable using DNS names and have SSL certificates installed. | A BigQuery dataset and table must exist for a BigQuery subscription, configured with the appropriate permissions. A Cloud Storage bucket must exist for a Cloud Storage subscription, configured with the appropriate permissions. |
| Flow control | The subscriber client controls the rate of delivery and can dynamically modify the acknowledgment deadline, allowing message processing to be arbitrarily long. | The Pub/Sub server automatically implements flow control, so there's no need to handle message flow on the client side. However, the client can indicate that it can't handle the current message load by returning an HTTP error. | The Pub/Sub server automatically implements flow control to optimize writing messages to the Google Cloud resource. |
| Efficiency and throughput | Achieves high throughput at low CPU and bandwidth cost by allowing batched delivery and acknowledgments and massively parallel consumption. Can be inefficient if aggressive polling is used to minimize message delivery time. | Delivers one message per request and limits the maximum number of outstanding messages. | Scalability is handled dynamically by the Pub/Sub servers. |
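As an illustration of the flow-control row for pull subscriptions, the following is a minimal sketch that uses the Python client library (google-cloud-pubsub). The project and subscription IDs are placeholder values, and the sketch assumes a pull subscription already exists:

```python
from concurrent import futures

from google.cloud import pubsub_v1

# Placeholder values; substitute your own project and subscription IDs.
project_id = "my-project"
subscription_id = "my-pull-subscription"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_id)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Process the message, then acknowledge it so that Pub/Sub
    # doesn't redeliver it after the acknowledgment deadline.
    print(f"Received {message.data!r}")
    message.ack()

# Client-side flow control: hold at most 100 unacknowledged messages.
# The client library pauses the underlying stream when the limit is hit.
flow_control = pubsub_v1.types.FlowControl(max_messages=100)

streaming_pull_future = subscriber.subscribe(
    subscription_path, callback=callback, flow_control=flow_control
)

with subscriber:
    try:
        # Block for 30 seconds, then shut down the stream.
        streaming_pull_future.result(timeout=30)
    except futures.TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```

Push and export subscriptions need no such client-side tuning, because the Pub/Sub server manages flow control itself.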
When to use an export subscription
Without an export subscription, you need a pull or push
subscription and a subscriber (such as a Dataflow pipeline) to
read messages and write them to a Google Cloud resource.
Running a Dataflow job adds overhead that isn't
necessary when messages don't require additional processing
before being stored.
Export subscriptions have the following advantages:
- **Simple deployment.** You can set up an export subscription through a
  single workflow in the console, Google Cloud CLI, client library,
  or Pub/Sub API (a minimal client-library sketch follows this list).
- **Low costs.** Reduces the additional cost and latency of similar
  Pub/Sub pipelines that include Dataflow jobs.
  This cost optimization is useful for messaging systems that don't require
  additional processing before storage.
- **Minimal monitoring.** Export subscriptions are part of the multi-tenant
  Pub/Sub service and don't require you to run separate
  monitoring jobs.
- **Flexibility.** A BigQuery subscription can use the
  schema of the topic to which it is attached, which is not available with
  the basic Dataflow template for writing from
  Pub/Sub to BigQuery. Similarly,
  a Cloud Storage subscription offers configurable file batching options
  based on file size and elapsed time, which are not configurable in the
  basic Dataflow template for writing from
  Pub/Sub to Cloud Storage.
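As a sketch of that single-workflow setup, the following example creates a BigQuery export subscription with the Python client library. The resource names are hypothetical, and, as the configuration row of the table above notes, the BigQuery table must already exist with the appropriate permissions; setting `use_topic_schema` assumes the topic has a schema attached.

```python
from google.cloud import pubsub_v1

# Hypothetical resource names; substitute your own.
project_id = "my-project"
topic_id = "my-topic"
subscription_id = "my-bigquery-subscription"
table_id = "my-project.my_dataset.my_table"  # Table must already exist.

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
topic_path = publisher.topic_path(project_id, topic_id)
subscription_path = subscriber.subscription_path(project_id, subscription_id)

# Write rows using the schema of the topic that the subscription is
# attached to (assumes the topic has a schema).
bigquery_config = pubsub_v1.types.BigQueryConfig(
    table=table_id,
    use_topic_schema=True,
)

with subscriber:
    subscription = subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "bigquery_config": bigquery_config,
        }
    )
    print(f"Created BigQuery subscription: {subscription.name}")
```

After this single step, Pub/Sub writes messages to the table directly; no separate subscriber process or Dataflow job is involved.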
However, a Dataflow pipeline is still
recommended for Pub/Sub systems where some data
transformation is required before the data is stored in a
Google Cloud resource such as a BigQuery table or
Cloud Storage bucket.
To learn how to stream data from Pub/Sub to
BigQuery with transformation by using Dataflow,
see Stream from Pub/Sub to BigQuery.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[],[],null,["# Choose a subscription type\n\nThis document helps you choose the appropriate type of Pub/Sub\nsubscription suited to your business requirements.\n\nBefore you begin\n----------------\n\n- Learn about [subscriptions](/pubsub/docs/subscription-overview).\n\nPub/Sub subscription comparison table\n-------------------------------------\n\nThe following table offers some guidance in choosing the appropriate delivery\nmechanism for your application:\n\nWhen to use an export subscription\n----------------------------------\n\nWithout an export subscription, you need a pull or push\nsubscription and a subscriber (such as Dataflow) to\nread messages and write them to a Google Cloud resource.\nThe overhead of running a Dataflow job is\nnot necessary when messages don't\nrequire additional processing before being stored.\n\nExport subscriptions have the following advantages:\n\n- **Simple deployment.** You can set up an export subscription through a\n single workflow in the console, Google Cloud CLI, client library,\n or Pub/Sub API.\n\n- **Low costs.** Reduces the additional cost and latency of similar\n Pub/Sub pipelines that include Dataflow jobs.\n This cost optimization is useful for messaging systems that don't require\n additional processing before storage.\n\n- **Minimal monitoring.** Export subscriptions are part of the multi-tenant\n Pub/Sub service and don't require you to run separate\n monitoring jobs.\n\n- **Flexibility**. A BigQuery subscription can use the\n schema of the topic to which it is attached, which is not available with\n the basic Dataflow template for writing from\n Pub/Sub to BigQuery. Similarly,\n a Cloud Storage subscription offers configurable file batching options\n based on file size and elapsed time, which are not configurable in the\n basic Dataflow template for writing from\n Pub/Sub to Cloud Storage.\n\nHowever, a Dataflow pipeline is still\nrecommended for Pub/Sub systems where some data\ntransformation is required before the data is stored in a\nGoogle Cloud resource such as a BigQuery table or\nCloud Storage bucket.\n\nTo learn how to stream data from Pub/Sub to\nBigQuery with transformation by using Dataflow,\nsee [Stream from Pub/Sub to BigQuery](/dataflow/docs/tutorials/dataflow-stream-to-bigquery).\n\nTo learn how to stream data from Pub/Sub to\nCloud Storage with transformation by using\nDataflow, see\n[Stream messages from Pub/Sub by using Dataflow](/pubsub/docs/stream-messages-dataflow).\n\nWhat's next\n-----------\n\nUnderstand the workflow for each subscription type:\n\n- [Pull](/pubsub/docs/pull)\n\n- [Push](/pubsub/docs/push)\n\n- [BigQuery](/pubsub/docs/bigquery)\n\n- [Cloud Storage](/pubsub/docs/cloudstorage)"]]