Last updated (UTC): 2025-08-17.

# Choose a subscription type

This document helps you choose the type of Pub/Sub subscription that best
suits your business requirements.

Before you begin
----------------

- Learn about [subscriptions](/pubsub/docs/subscription-overview).

Pub/Sub subscription comparison table
-------------------------------------

The following table offers some guidance in choosing the appropriate delivery
mechanism for your application:

When to use an export subscription
----------------------------------

Without an export subscription, you need a pull or push subscription and a
subscriber (such as Dataflow) to read messages and write them to a
Google Cloud resource. The overhead of running a Dataflow job is unnecessary
when messages don't require additional processing before storage.

Export subscriptions have the following advantages:

- **Simple deployment.** You can set up an export subscription through a
  single workflow in the console, Google Cloud CLI, client library, or
  Pub/Sub API.

- **Low costs.** An export subscription avoids the additional cost and
  latency of an equivalent Pub/Sub pipeline that includes a Dataflow job.
  This cost optimization is useful for messaging systems that don't require
  additional processing before storage.

- **Minimal monitoring.** Export subscriptions are part of the multi-tenant
  Pub/Sub service and don't require you to run separate monitoring jobs.

- **Flexibility.**
  A BigQuery subscription can use the schema of the topic to which it is
  attached, a capability that the basic Dataflow template for writing from
  Pub/Sub to BigQuery doesn't offer. Similarly, a Cloud Storage subscription
  offers configurable file batching based on file size and elapsed time,
  which is not configurable in the basic Dataflow template for writing from
  Pub/Sub to Cloud Storage.

However, a Dataflow pipeline is still recommended for Pub/Sub systems where
some data transformation is required before the data is stored in a
Google Cloud resource such as a BigQuery table or Cloud Storage bucket.

To learn how to stream data from Pub/Sub to BigQuery with transformation by
using Dataflow, see
[Stream from Pub/Sub to BigQuery](/dataflow/docs/tutorials/dataflow-stream-to-bigquery).

To learn how to stream data from Pub/Sub to Cloud Storage with transformation
by using Dataflow, see
[Stream messages from Pub/Sub by using Dataflow](/pubsub/docs/stream-messages-dataflow).

What's next
-----------

Understand the workflow for each subscription type:

- [Pull](/pubsub/docs/pull)

- [Push](/pubsub/docs/push)

- [BigQuery](/pubsub/docs/bigquery)

- [Cloud Storage](/pubsub/docs/cloudstorage)
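As an illustration of the single-workflow setup and batching options described earlier, the following is a minimal sketch using the Google Cloud CLI. The project, topic, subscription, table, and bucket names are hypothetical placeholders, and the exact flag spellings and accepted value formats can vary by `gcloud` version, so verify them against the CLI reference before use:

```shell
# Sketch: create a BigQuery export subscription that writes messages
# directly to a table, using the topic's schema for the column layout.
# my-topic, my-bq-sub, and the table path are illustrative names.
gcloud pubsub subscriptions create my-bq-sub \
    --topic=my-topic \
    --bigquery-table=my-project:my_dataset.my_table \
    --use-topic-schema

# Sketch: create a Cloud Storage export subscription with file batching
# controlled by maximum file size and elapsed time (names illustrative).
gcloud pubsub subscriptions create my-gcs-sub \
    --topic=my-topic \
    --cloud-storage-bucket=my-bucket \
    --cloud-storage-max-bytes=100MB \
    --cloud-storage-max-duration=5m
```

With either subscription in place, messages published to `my-topic` are delivered to the BigQuery table or Cloud Storage bucket without a separate subscriber application or Dataflow job.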