This guide explains how to export Google Cloud IoT logs to Google Security Operations using Cloud Storage. The parser extracts fields from JSON-formatted logs and then maps those fields to the corresponding fields in the Google SecOps UDM schema, ultimately transforming raw log data into a structured format suitable for security analysis.
Before You Begin
Ensure that you have the following prerequisites:
Google SecOps instance.
IoT is set up and active in your Google Cloud environment.
Privileged access to Google Cloud.
Create a Google Cloud Storage Bucket
Sign in to the Google Cloud console.
Go to the Cloud Storage Buckets page.
Click Create.
On the Create a bucket page, enter your bucket information. After each of the following steps, click Continue to proceed to the next step:
In the Get started section, do the following:
Enter a unique name that meets the bucket name requirements; for example, cloudiot-logs.
To enable hierarchical namespace, click the expander arrow to expand the Optimize for file oriented and data-intensive workloads section, and then select Enable Hierarchical namespace on this bucket. Note: You cannot enable hierarchical namespace in an existing bucket.
To add a bucket label, click the expander arrow to expand the Labels section.
Click Add label, and specify a key and a value for your label.
In the Choose where to store your data section, do the following:
Select a Location type.
Use the location type menu to select a Location where object data within your bucket will be permanently stored. Note: If you select the dual-region location type, you can also choose to enable turbo replication by using the relevant checkbox.
To set up cross-bucket replication, expand the Set up cross-bucket replication section.
In the Choose a storage class for your data section, either select a default storage class for the bucket, or select Autoclass for automatic storage class management of your bucket's data.
In the Choose how to control access to objects section, select not to enforce public access prevention, and select an access control model for your bucket's objects. Note: If public access prevention is already enforced by your project's organization policy, the Prevent public access checkbox is locked.
In the Choose how to protect object data section, do the following:
Select any of the options under Data protection that you want to set for your bucket.
To choose how your object data will be encrypted, click the expander arrow labeled Data encryption, and select a Data encryption method.
Click Create.
Note: Be sure to provide your Google SecOps Service Account with permissions to Read or Read & Write to the newly created bucket.
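If you prefer the command line, the console steps above can be sketched with the gcloud CLI. This is a minimal sketch only: the bucket name, location, and storage class are example values, not requirements, and it assumes the gcloud CLI is installed and authenticated.

```shell
# Create the log bucket from the command line (all names and values
# here are illustrative; adjust to match your environment).
gcloud storage buckets create gs://cloudiot-logs \
    --location=us-central1 \
    --default-storage-class=STANDARD \
    --uniform-bucket-level-access
```

The console flow exposes more options (hierarchical namespace, labels, replication, encryption) than this one command; use the console steps when you need those.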
Configure Log Export in Google Cloud IoT
Sign in to your Google Cloud account using your privileged account.
Search for and select Logging in the search bar.
In Log Explorer, filter the logs by choosing Cloud IoT Core and click Apply.
Click More Actions.
Click Create Sink.
Provide the following configurations:
Sink Details: enter a name and description.
Click Next.
Sink Destination: select Cloud Storage Bucket.
Cloud Storage Bucket: select the bucket created earlier or create a new bucket.
Click Next.
Choose Logs to include in Sink: a default log filter is populated when you select an option in Cloud Storage Bucket.
Click Next.
Optional: Choose Logs to filter out of Sink: select the logs that you would like not to sink.
Click Create Sink.
In the GCP console, go to Logging > Log Router.
Click Create Sink.
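As a rough command-line equivalent of the sink steps above, you can create the sink with the gcloud CLI. The sink name and destination bucket below are assumptions, and `resource.type="cloudiot_device"` is used as an example Cloud IoT Core log filter; adjust the filter to match the logs you selected in the console.

```shell
# Create a sink that routes Cloud IoT logs to the bucket created
# earlier (names and the log filter are illustrative assumptions).
gcloud logging sinks create cloudiot-logs-sink \
    storage.googleapis.com/cloudiot-logs \
    --log-filter='resource.type="cloudiot_device"'

# The command prints the sink's writer service account; grant that
# account write access (for example, roles/storage.objectCreator)
# on the destination bucket so exported logs can be delivered.
```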
Set up feeds
To configure a feed, follow these steps:
Go to SIEM Settings > Feeds.
Click Add New Feed.
On the next page, click Configure a single feed.
In the Feed name field, enter a name for the feed; for example, GCP Cloud IoT Logs.
Select Google Cloud Storage V2 as the Source type.
Select GCP Cloud IoT as the Log type.
Click Get Service Account to retrieve the Chronicle Service Account.
Click Next.
Specify values for the following input parameters:
Storage Bucket URI: the Google Cloud Storage bucket URL, in gs://my-bucket/<value> format.
Source deletion options: select a deletion option according to your preference. Note: If you select the Delete transferred files or Delete transferred files and empty directories option, make sure that you granted appropriate permissions to the service account.
Maximum File Age: includes files modified in the last number of days. The default is 180 days.
Click Next.
Review your new feed configuration in the Finalize screen, and then click Submit.
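The feed's service account needs read access to the bucket (and delete permissions if you chose a source deletion option). This access can be granted from the command line; the service account email below is a placeholder for the one shown by Get Service Account, and the bucket name is an example.

```shell
# Grant the Google SecOps service account read access to the bucket
# (placeholder email; use roles/storage.objectAdmin instead of
# objectViewer if a source deletion option is selected).
gcloud storage buckets add-iam-policy-binding gs://cloudiot-logs \
    --member="serviceAccount:chronicle-sa@example.iam.gserviceaccount.com" \
    --role="roles/storage.objectViewer"
```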
UDM Mapping Table
| Log Field | UDM Mapping | Logic |
| --- | --- | --- |
| insertId | metadata.product_log_id | Directly mapped from the insertId field. |
| jsonPayload.eventType | metadata.product_event_type | Directly mapped from the jsonPayload.eventType field. |
| jsonPayload.protocol | network.application_protocol | Directly mapped from the jsonPayload.protocol field. |
| jsonPayload.serviceName | target.application | Directly mapped from the jsonPayload.serviceName field. |
| jsonPayload.status.description | metadata.description | Directly mapped from the jsonPayload.status.description field. |
| jsonPayload.status.message | security_result.description | Directly mapped from the jsonPayload.status.message field. |
| labels.device_id | principal.asset_id | Value is set to "Device ID: " concatenated with the value of the labels.device_id field. |
| receiveTimestamp | metadata.event_timestamp | Parsed from the receiveTimestamp field and used to populate both events.timestamp and metadata.event_timestamp. |
| resource.labels.device_num_id | target.resource.product_object_id | Directly mapped from the resource.labels.device_num_id field. |
| resource.labels.location | target.location.name | Directly mapped from the resource.labels.location field. |
| resource.labels.project_id | target.resource.name | Directly mapped from the resource.labels.project_id field. |
| resource.type | target.resource.resource_subtype | Directly mapped from the resource.type field. |
| severity | security_result.severity | If severity is DEFAULT, DEBUG, INFO, or NOTICE, set to INFORMATIONAL; if WARNING or ERROR, set to MEDIUM; if CRITICAL, ALERT, or EMERGENCY, set to HIGH. |
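The severity logic above can be sketched as a small shell function. This is illustrative only; the actual mapping is performed by the Google SecOps parser.

```shell
# Map a Cloud Logging severity to the UDM security_result.severity
# value, following the logic described in the mapping table.
map_severity() {
  case "$1" in
    DEFAULT|DEBUG|INFO|NOTICE) echo "INFORMATIONAL" ;;
    WARNING|ERROR)             echo "MEDIUM" ;;
    CRITICAL|ALERT|EMERGENCY)  echo "HIGH" ;;
    *)                         echo "UNKNOWN_SEVERITY" ;;
  esac
}

map_severity WARNING   # prints MEDIUM
```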
Supported in: Google SecOps SIEM

Note: This feature is covered by the Pre-GA Offerings Terms of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the Google SecOps Technical Support Service guidelines and the Google SecOps Service Specific Terms.

Last updated 2025-08-29 UTC.