Collect Cloudflare WAF logs
===========================

Supported in: Google SecOps [SIEM](/chronicle/docs/secops/google-secops-siem-toc)

**Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).

This parser extracts fields from Cloudflare Web Application Firewall (WAF) JSON logs, then transforms and maps them to the Unified Data Model (UDM). It handles the various Cloudflare actions, enriching the data with metadata and network information before structuring the output in the UDM format.

Before you begin
----------------

Ensure that you have the following prerequisites:

- Google SecOps instance.
- Privileged access to Google Cloud.
- Cloudflare Enterprise plan.
- Privileged access to Cloudflare.

Create a Google Cloud Storage Bucket
------------------------------------

1. Sign in to the Google Cloud console.
2. Go to the **Cloud Storage Buckets** page ([Go to Buckets](https://console.cloud.google.com/storage/browser)).
3. Click **Create**.
4. Configure the bucket:
   - **Name**: enter a unique name that meets the bucket name requirements (for example, **cloudflare-waf**).
   - **Choose where to store your data**: select a location.
   - **Choose a storage class for your data**: either select a **default storage class** for the bucket, or select **Autoclass** for automatic storage class management.
   - **Choose how to control access to objects**: select **not** to enforce **public access prevention**, and select an **access control model** for your bucket's objects. If public access prevention is already enforced by your project's organization policy, the **Prevent public access** checkbox is locked.
   - **Storage class**: choose based on your needs (for example, **Standard**).
5. Click **Create**. Do not set a retention policy, because the last data entry might need to be overwritten in case of a timeout.
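If you prefer the command line, the same bucket can be created with the gcloud CLI. This is a sketch, not part of the official procedure; the bucket name, location, and storage class below are example values to substitute with your own.

```shell
# Create an example bucket for Cloudflare WAF logs.
# Name, location, and storage class are placeholders; adjust to your needs.
gcloud storage buckets create gs://cloudflare-waf \
    --location=us-central1 \
    --default-storage-class=STANDARD
```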
Grant bucket permissions to Cloudflare IAM user
-----------------------------------------------

1. In Google Cloud, go to **Storage** > **Browser** > *your bucket* > **Permissions**.
2. Add the account `logpush@cloudflare-data.iam.gserviceaccount.com` with the **Storage Object Admin** permission.
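The same grant can be made from the gcloud CLI; `roles/storage.objectAdmin` is the role that appears as **Storage Object Admin** in the console. The bucket name is an example.

```shell
# Grant Cloudflare's Logpush service account write access to the bucket.
# gs://cloudflare-waf is a placeholder; use your bucket name.
gcloud storage buckets add-iam-policy-binding gs://cloudflare-waf \
    --member=serviceAccount:logpush@cloudflare-data.iam.gserviceaccount.com \
    --role=roles/storage.objectAdmin
```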
Create a Logpush Job for WAF Logs using Cloudflare UI
-----------------------------------------------------

1. Sign in to Cloudflare.
2. Go to **Analytics & Logs** > **Logpush**.
3. Select **Create a Logpush job**.
4. In **Select a destination**, choose **Google Cloud Storage**.
5. Enter the following destination details:
   - **Bucket**: the Google Cloud Storage bucket name
   - **Path**: the bucket location within the storage container
   - Select **Organize logs into daily subfolders**
6. Click **Continue**. To prove ownership, Cloudflare sends a file to your designated destination. To find the token, select the **Open** button in the **Overview** tab of the ownership challenge file, then paste it into the Cloudflare dashboard to verify your access to the bucket. Enter the ownership token and select **Continue**.
7. Select the **Security (WAF)** dataset to push to the storage.
8. Configure the Logpush job:
   - Enter the **Job name**.
   - Under **If logs match**, you can select the events to include or remove from your logs. Refer to **Filters** for more information. Not all datasets have this option available.
   - In **Send the following fields**, you can choose to either push all logs to your storage destination or selectively choose which logs you want to push.
9. Click **Submit**.
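The same job can also be created through the Cloudflare API instead of the dashboard. The sketch below is an assumption-laden example, not part of the official procedure: it assumes an API token with Logpush edit permission, uses `firewall_events` (the API dataset name corresponding to Security (WAF)), and the zone ID, bucket path, and field list are placeholders to adapt.

```shell
# Create a Logpush job for WAF (firewall) events via the Cloudflare API.
# ZONE_ID and CF_API_TOKEN are assumed to be set in the environment;
# the destination bucket/path and field list are illustrative.
curl -X POST "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/logpush/jobs" \
  -H "Authorization: Bearer $CF_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "cloudflare-waf-to-gcs",
    "dataset": "firewall_events",
    "destination_conf": "gs://cloudflare-waf/waf-logs/{DATE}",
    "logpull_options": "fields=Action,ClientIP,ClientRequestHost,RayID,Datetime&timestamps=rfc3339",
    "enabled": true
  }'
```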
Set up feeds
------------

To configure a feed, follow these steps:

1. Go to **SIEM Settings** > **Feeds**.
2. Click **Add New Feed**.
3. On the next page, click **Configure a single feed**.
4. In the **Feed name** field, enter a name for the feed (for example, **Cloudflare WAF Logs**).
5. Select **Google Cloud Storage V2** as the **Source type**.
6. Select **Cloudflare WAF** as the **Log type**.
7. Click **Get Service Account**.
8. Click **Next**.
9. Specify values for the following input parameters:
   - **Storage Bucket URI**: the Cloud Storage URL.
   - **Source deletion options**: select the deletion option according to your preference. If you select the `Delete transferred files` or `Delete transferred files and empty directories` option, make sure that you granted the appropriate permissions to the service account.
   - **Maximum File Age**: includes files modified in the last number of days. Default is 180 days.
10. Click **Next**.
11. Review your new feed configuration in the **Finalize** screen, and then click **Submit**.
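The service account shown by **Get Service Account** needs read access to the bucket, and delete access as well if you selected a deletion option. A sketch of the grant, with a hypothetical service account name standing in for the one your instance displays:

```shell
# Grant the Google SecOps feed service account read access to the bucket.
# The --member value below is a placeholder; use the account shown by
# "Get Service Account". Use roles/storage.objectAdmin instead of
# objectViewer if you selected a "Delete transferred files" option.
gcloud storage buckets add-iam-policy-binding gs://cloudflare-waf \
    --member=serviceAccount:chronicle-feed@example-project.iam.gserviceaccount.com \
    --role=roles/storage.objectViewer
```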
UDM Mapping Table
-----------------

| Log Field | UDM Mapping | Logic |
| --- | --- | --- |
| `Action` | `security_result.action_details` | The value of `Action` from the raw log is directly assigned to this UDM field. |
| `Action` | `security_result.action` | Derived from the `Action` field in the raw log. If `Action` is `allow`, the UDM field is set to `ALLOW`. If `Action` is `challengeSolved`, `jschallengeSolved`, `managedchallengenoninteractivesolved`, or `managedchallengeinteractivesolved`, it is set to `ALLOW_WITH_MODIFICATION`. If `Action` is `drop`, `block`, or `connectionclose`, it is set to `BLOCK`. If `Action` is `challengefailed` or `jschallengefailed`, it is set to `FAIL`. Otherwise, it is set to `UNKNOWN_ACTION`. |
| `ClientASN` | `network.asn` | The value of `ClientASN` from the raw log is converted to a string and assigned to this UDM field. |
| `ClientASNDescription` | `additional.fields.key` | The key is statically set to `ClientASNDescription`. |
| `ClientASNDescription` | `additional.fields.value.string_value` | The value of `ClientASNDescription` from the raw log is directly assigned to this UDM field. |
| `ClientCountry` | `principal.location.country_or_region` | The value of `ClientCountry` from the raw log is directly assigned to this UDM field. |
| `ClientIP` | `principal.ip` | The value of `ClientIP` from the raw log is directly assigned to this UDM field. |
| `ClientRefererHost` | `intermediary.hostname` | The value of `ClientRefererHost` from the raw log is directly assigned to this UDM field. |
| `ClientRefererPath` | `network.http.referral_url` | The value of `ClientRefererPath` from the raw log is directly assigned to this UDM field. |
| `ClientRequestHost` | `target.hostname` | The value of `ClientRequestHost` from the raw log is directly assigned to this UDM field. |
| `ClientRequestMethod` | `network.http.method` | The value of `ClientRequestMethod` from the raw log is directly assigned to this UDM field. |
| `ClientRequestPath` | `target.file.full_path` | The value of `ClientRequestPath` from the raw log is directly assigned to this UDM field. |
| `ClientRequestProtocol` | `network.application_protocol` | The protocol part of `ClientRequestProtocol` (for example, `HTTP` from `HTTP/1.1`) is extracted using grok, converted to uppercase, and assigned to this UDM field. |
| `ClientRequestUserAgent` | `network.http.user_agent` | The value of `ClientRequestUserAgent` from the raw log is directly assigned to this UDM field. |
| `Datetime` | `metadata.event_timestamp` | The value of `Datetime` from the raw log is parsed as an RFC 3339 timestamp and assigned to this UDM field. |
| `EdgeColoCode` | `additional.fields.key` | The key is statically set to `EdgeColoCode`. |
| `EdgeColoCode` | `additional.fields.value.string_value` | The value of `EdgeColoCode` from the raw log is directly assigned to this UDM field. |
| `EdgeResponseStatus` | `network.http.response_code` | The value of `EdgeResponseStatus` from the raw log is converted to an integer and assigned to this UDM field. |
| `Kind` | `metadata.product_event_type` | The value of `Kind` from the raw log is directly assigned to this UDM field. |
| `Metadata.filter` | `target.resource.attribute.labels.value` | The value of `Metadata.filter` from the raw log is assigned to the value field of a label within `target.resource.attribute.labels`. The key for this label is statically set to `Metadata filter`. |
| `Metadata.type` | `target.resource.attribute.labels.value` | The value of `Metadata.type` from the raw log is assigned to the value field of a label within `target.resource.attribute.labels`. The key for this label is statically set to `Metadata type`. |
| N/A | `metadata.event_type` | Derived based on the presence and values of `ClientIP`, `ClientRequestHost`, and `app_protocol`. See the parser code for the specific logic. |
| N/A | `metadata.vendor_name` | Statically set to `Cloudflare`. |
| N/A | `metadata.product_name` | Statically set to `Cloudflare log Aggregator`. |
| N/A | `metadata.log_type` | Statically set to `CLOUDFLARE_WAF`. |
| `RayID` | `metadata.product_log_id` | The value of `RayID` from the raw log is directly assigned to this UDM field. |
| `RuleID` | `security_result.rule_id` | The value of `RuleID` from the raw log is directly assigned to this UDM field. |
| `Source` | `security_result.rule_name` | The value of `Source` from the raw log is directly assigned to this UDM field. |
| `timestamp` | `metadata.event_timestamp`, `events.timestamp` | The value of `timestamp` from the raw log is directly assigned to these UDM fields. |
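The `Action` to `security_result.action` logic described above can be sketched as a small lookup. This is an illustrative reimplementation, not the parser's actual code; it assumes the action strings are compared exactly as listed in the table.

```shell
# Illustrative mapping of Cloudflare's Action field to UDM security_result.action,
# following the logic in the mapping table (not the parser's actual code).
map_action() {
  case "$1" in
    allow)
      echo "ALLOW" ;;
    challengeSolved|jschallengeSolved|managedchallengenoninteractivesolved|managedchallengeinteractivesolved)
      echo "ALLOW_WITH_MODIFICATION" ;;
    drop|block|connectionclose)
      echo "BLOCK" ;;
    challengefailed|jschallengefailed)
      echo "FAIL" ;;
    *)
      echo "UNKNOWN_ACTION" ;;
  esac
}
```

For example, `map_action connectionclose` prints `BLOCK`, while any unrecognized action falls through to `UNKNOWN_ACTION`.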
Last updated 2025-08-29 UTC.

**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)