A pipeline connects a bus to a target destination and routes event messages to that destination. You can configure a pipeline to expect event data in a specific format or, before events are delivered to a destination, convert event data from one supported format to another. For example, you might need to route events to an endpoint that only accepts Avro data.
Supported formats
The following format conversions are supported:
Avro to JSON
Avro to Protobuf
JSON to Avro
JSON to Protobuf
Protobuf to Avro
Protobuf to JSON
Note the following:
When you convert the format of events, only the event payload is converted, not the entire event message.
If an inbound data format is specified for a pipeline, all events must match that format. Any events that don't match the expected format are treated as persistent errors.
If an inbound data format is not specified for a pipeline, an outbound format can't be set.
Before an event format is converted for a specific destination, any data transformation that is configured is applied first. Events are always delivered in a CloudEvents format using an HTTP request in binary content mode unless you specify a message binding.
JSON schemas are detected dynamically. For Protobuf schema definitions, you can define only one top-level type, and import statements that refer to other types are not supported. Schema definitions without a syntax identifier default to proto2. Note that there is a schema size limit.
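A minimal sketch of a Protobuf schema definition that satisfies these constraints (the message and field names here are hypothetical): one top-level type, an explicit syntax identifier, and no import statements.

```proto
// Exactly one top-level type; no import statements
// that refer to other types.
syntax = "proto3";

message OrderEvent {
  string order_id = 1;
  string severity = 2;
}
```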
Configure a pipeline to format events
You can configure a pipeline to expect event data in a specific format, or to convert event data from one format to another, in the Google Cloud console or by using the gcloud CLI.
Console
In the Google Cloud console, go to the Eventarc > Pipelines page.
You can create a pipeline or, if you are updating a pipeline, click the name of the pipeline.
On the Pipeline details page, click Edit.
In the Event mediation pane, do the following:
Select the Apply a transformation checkbox.
In the Inbound format list, select the applicable format.
Note that if an inbound data format is specified for a pipeline, all events must match that format. Any events that don't match the expected format are treated as persistent errors.
For Avro or Protobuf formats, you must specify an inbound schema. (Optionally, instead of specifying it directly, you can upload an inbound schema.)
In the CEL expression field, write a transformation expression using CEL.
Click Continue.
In the Destination pane, do the following:
If applicable, select a format in the Outbound format list.
Note that if an inbound data format is not specified for a pipeline, an outbound format can't be set.
Optional: Apply a Message binding. For more information, see Message binding.
Click Save.
It can take a couple of minutes to update a pipeline.
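gcloud
With the gcloud CLI, you can create a pipeline or update an existing one using the gcloud eventarc pipelines update command. A sketch of the invocation, with uppercase placeholders to be replaced as described below:

```
gcloud eventarc pipelines update PIPELINE_NAME \
    --location=REGION \
    --INPUT_PAYLOAD_FLAG \
    --destinations=OUTPUT_PAYLOAD_KEY
```

Replace PIPELINE_NAME with the ID of the pipeline or a fully qualified name, and REGION with a supported Eventarc Advanced location.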
Alternatively, you can set the gcloud CLI location property:

gcloud config set eventarc/location REGION
INPUT_PAYLOAD_FLAG: an input data format flag that can be one of the following:
--input-payload-format-avro-schema-definition
--input-payload-format-json
--input-payload-format-protobuf-schema-definition
Note that if an input data format is specified for a pipeline, all events must match that format. Any events that don't match the expected format are treated as persistent errors.
OUTPUT_PAYLOAD_KEY: an output data format key that can be one of the following:
output_payload_format_avro_schema_definition
output_payload_format_json
output_payload_format_protobuf_schema_definition
Note that if you set an output data format key, you must also specify an input data format flag.
It can take a couple of minutes to update a pipeline.
Examples:
The following example uses the --input-payload-format-protobuf-schema-definition flag to specify that the pipeline should expect events in a Protobuf data format with a specific schema:
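A sketch of that command (the pipeline name my-pipeline and the schema fields are placeholders):

```shell
gcloud eventarc pipelines update my-pipeline \
    --input-payload-format-protobuf-schema-definition \
'
syntax = "proto3";
message schema {
  string name = 1;
  string severity = 2;
}
'
```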
The following example uses the output_payload_format_avro_schema_definition key and the --input-payload-format-avro-schema-definition flag to create a pipeline that expects events in an Avro format and outputs them in the same format:
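A sketch of that command (my-pipeline, the endpoint URI, and the Avro record fields are placeholders):

```shell
gcloud eventarc pipelines create my-pipeline \
    --location=us-central1 \
    --destinations=http_endpoint_uri='https://example-endpoint.com',output_payload_format_avro_schema_definition='{"type": "record", "name": "my_record", "fields": [{"name": "my_field", "type": "string"}]}' \
    --input-payload-format-avro-schema-definition='{"type": "record", "name": "my_record", "fields": [{"name": "my_field", "type": "string"}]}'
```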
The following example uses the output_payload_format_protobuf_schema_definition key and the --input-payload-format-avro-schema-definition flag to update a pipeline and convert its event data from Avro to Protobuf using schema definitions:
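A sketch of that command (my-pipeline and the prop1/prop2 fields are placeholders; the Avro schema is passed directly as the flag's value):

```shell
gcloud eventarc pipelines update my-pipeline \
    --location=us-central1 \
    --destinations=output_payload_format_protobuf_schema_definition='message MessageProto {string prop1 = 1; string prop2 = 2;}' \
    --input-payload-format-avro-schema-definition='
{
  "type": "record",
  "name": "MessageProto",
  "fields": [
    { "name": "prop1", "type": "string" },
    { "name": "prop2", "type": "string" }
  ]
}
'
```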
Last updated 2025-08-18 UTC.