This parser extracts key information such as timestamps, user IDs, source IPs, actions, and object IDs from JSON and SYSLOG formatted logs. It uses grok patterns to match the various log message formats, handles variations in structure, and populates the Unified Data Model (UDM) with the extracted fields. The parser also categorizes events based on the presence of user or IP information.
Before you begin
Ensure that you have the following prerequisites:
Google SecOps instance.
Privileged access to Google Cloud IAM.
Privileged access to Google Cloud Storage.
Privileged access to Jenkins.
Create a Google Cloud Storage Bucket
Go to Cloud Storage.
Create a new bucket. Choose a unique name and an appropriate region.
Ensure the bucket has proper access controls (for example, only authorized service accounts can write to it).
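If you prefer the command line, you can create the same bucket with the gcloud CLI. This is a minimal sketch; the bucket name and region are placeholders, so substitute your own values.

```
# Hypothetical bucket name and region; choose your own unique values.
gcloud storage buckets create gs://my-jenkins-logs-bucket \
    --location=us-central1 \
    --uniform-bucket-level-access
```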
Create a Google Cloud Service account
Go to IAM & Admin > Service Accounts.
Create a new service account. Give it a descriptive name (for example, jenkins-logs).
Grant the service account the Storage Object Creator role on the Cloud Storage bucket you created in the previous step.
Create a key for your service account. For details, see Create and delete service account keys.
Download the JSON key file for the service account and keep it secure; you will need it to create credentials with the Google OAuth Credentials plugin.
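The same setup can be scripted with the gcloud CLI. This is a sketch under assumed names for the project, service account, and bucket; adjust them to your environment.

```
# Hypothetical project, service account, and bucket names; adjust as needed.
gcloud iam service-accounts create jenkins-logs \
    --project=my-project \
    --display-name="Jenkins build log uploads"

# Let the service account write objects to the log bucket.
gcloud storage buckets add-iam-policy-binding gs://my-jenkins-logs-bucket \
    --member="serviceAccount:jenkins-logs@my-project.iam.gserviceaccount.com" \
    --role="roles/storage.objectCreator"

# Create and download a JSON key for the service account.
gcloud iam service-accounts keys create jenkins-logs-key.json \
    --iam-account=jenkins-logs@my-project.iam.gserviceaccount.com
```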
Install Google Cloud Storage plugin in Jenkins
Go to Manage Jenkins > Plugins.
Select Available plugins.
Search for the Google Cloud Storage plugin.
Install the plugin and restart Jenkins if required.
Install Google OAuth Credentials Plugin in Jenkins
Go to Manage Jenkins > Plugins.
Select Available plugins.
Search for the Google OAuth Credentials plugin.
Install the plugin and restart Jenkins if required.
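If you manage the Jenkins controller from the command line (for example, when building a container image), both plugins can also be installed with jenkins-plugin-cli. The plugin IDs below are assumptions based on the plugin names; confirm them in the Jenkins plugin index before use.

```
# Assumed plugin IDs for the Google Cloud Storage and Google OAuth Credentials plugins.
jenkins-plugin-cli --plugins google-storage-plugin google-oauth-plugin
```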
Configure Jenkins to authenticate with Google Cloud
Go to Manage Jenkins > Credentials > System.
Click Add Credentials.
Kind: select Google Service Account from private key.
Project name: set a name for the credentials.
Upload the JSON key file you obtained during the Google Cloud Service account creation.
Click Create.
Configure Jenkins logs to upload to Google SecOps
In the Jenkins job configuration, add Google Storage Build Log Upload as a post-build action, with the following parameters:
Google Credentials: The name of the Google credentials you created in the previous step.
Log Name: The name of the file that stores the Jenkins build log under the specified storage path.
Storage Location: The name of the bucket where you want to upload your logs. The bucket must be accessible to the service account you created.
Test the log upload.
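To test bucket access independently of Jenkins, you can upload a small object with the gcloud CLI using the same service account key. The key file, bucket, and object names below are illustrative.

```
# Hypothetical key file and bucket names; adjust to your environment.
gcloud auth activate-service-account --key-file=jenkins-logs-key.json
echo "upload test" > build-log-test.txt
gcloud storage cp build-log-test.txt gs://my-jenkins-logs-bucket/test/build-log-test.txt
```

If the copy succeeds, the service account and bucket permissions are configured correctly.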
Set up feeds
To configure a feed, follow these steps:
Go to SIEM Settings > Feeds.
Click Add New Feed.
On the next page, click Configure a single feed.
In the Feed name field, enter a name for the feed; for example, Jenkins Logs.
Select Google Cloud Storage V2 as the Source type.
Select Jenkins as the Log type.
Click Get Service Account as the Chronicle Service Account.
Specify values for the following input parameters:
Storage Bucket URI: Google Cloud Storage bucket URI, in gs://my-bucket/<value> format.
Source deletion options: select a deletion option according to your preference. If you select an option that deletes transferred files, make sure the service account has the appropriate permissions.
Maximum File Age: includes files modified in the last number of days. Default is 180 days.
Click Create Feed.
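The service account returned by Get Service Account needs read access to the bucket, and delete permission if you selected a deletion option. The following is a minimal sketch with gcloud, assuming read-only ingestion and placeholder names for the bucket and the service account email.

```
# Placeholder bucket and service account email; use the values from your feed setup.
# Pick a role that also allows object deletion if you chose a delete option.
gcloud storage buckets add-iam-policy-binding gs://my-jenkins-logs-bucket \
    --member="serviceAccount:chronicle-feed@example-project.iam.gserviceaccount.com" \
    --role="roles/storage.objectViewer"
```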
UDM Mapping Table
| Log Field | UDM Mapping | Logic |
| --- | --- | --- |
| act | security_result.action_details | Extracted from the msg1 or msg2 fields. Represents the action performed. Leading whitespace is removed. |
| data | principal.user.userid OR principal.ip OR metadata.description | If data matches an IP address pattern, it maps to principal.ip. If it matches a username pattern, it maps to principal.user.userid. Otherwise, it maps to metadata.description. |
| msg1 | target.asset.product_object_id OR security_result.action_details | Used to extract object and act. If a / is present, it is split into object and act. If » is present, it is split into object and act. Otherwise, it is treated as act and potentially parsed further. |
| msg2 | metadata.description OR security_result.action_details | If present, initially mapped to metadata.description. If it contains "completed:", the value after it is extracted and mapped to security_result.action_details. |
| object | target.asset.product_object_id | Extracted from msg1. Represents the object acted upon. |
| object_id | target.resource.attribute.labels.value | Extracted from object if a / is present. Represents a more specific object identifier. The label key is hardcoded as "Plugin Name". |
| src_ip | principal.ip | Extracted from message or data. Represents the source IP address. |
| user | principal.user.userid | Extracted from message or data. Represents the user associated with the event. |
|  | metadata.event_timestamp | Copied from the calculated @timestamp field. |
|  | metadata.event_type | Determined by parser logic. Set to USER_UNCATEGORIZED if user is present, STATUS_UNCATEGORIZED if src_ip is present, and GENERIC_EVENT otherwise. |
|  | metadata.product_name | Hardcoded as Jenkins. |
|  | metadata.product_version | Hardcoded as Jenkins. |
|  | metadata.vendor_name | Hardcoded as JENKINS. |
|  | metadata.event_timestamp | Constructed from the year, month, day, time, and ampm fields. |
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-29 UTC."],[[["\u003cp\u003eThis guide explains how to collect Jenkins logs and send them to Google SecOps for analysis, using a parser to extract key data from JSON and SYSLOG formats.\u003c/p\u003e\n"],["\u003cp\u003eThe process involves creating a Google Cloud Storage bucket and service account, installing the Google Cloud Storage and OAuth Credentials plugins in Jenkins, and configuring Jenkins to authenticate with Google Cloud.\u003c/p\u003e\n"],["\u003cp\u003eJenkins logs are uploaded to a specified storage location through the configuration of post-build actions and setting Google Cloud as a destination.\u003c/p\u003e\n"],["\u003cp\u003eA feed in Google SecOps is configured to ingest the uploaded Jenkins logs, specifying the source type, log type, and storage bucket URI, along with other parameters for data handling.\u003c/p\u003e\n"],["\u003cp\u003eThe parser will map the Jenkins logs data into the unified data model (UDM), specifying the mapping between the fields in Jenkins logs and UDM fields.\u003c/p\u003e\n"]]],[],null,["# Collect Jenkins logs\n====================\n\nSupported in: \nGoogle secops [SIEM](/chronicle/docs/secops/google-secops-siem-toc)\n| **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).\n\nOverview\n--------\n\nThis parser extracts key information such as timestamps, user IDs, source IPs, actions, and object IDs from JSON and SYSLOG formatted logs. It uses grok patterns to match various log message formats, handling variations in structure, and populates a unified data model (UDM) with the extracted fields. The parser also categorizes events based on the presence of user or IP information.\n\nBefore you begin\n----------------\n\nEnsure that you have the following prerequisites:\n\n- Google SecOps instance.\n- Privileged access to Google Cloud IAM.\n- Privileged access to Google Cloud Storage.\n- Privileged access to Jenkins.\n\nCreate a Google Cloud Storage Bucket\n------------------------------------\n\n1. Go to **Cloud Storage**.\n2. Create a new bucket. Choose a unique name and appropriate region.\n3. Ensure the bucket has proper access controls (for example, only authorized service accounts can write to it).\n\nCreate a Google Cloud Service account\n-------------------------------------\n\n1. Go to **IAM \\& Admin** \\\u003e **Service Accounts**.\n2. Create a new service account. Give it a descriptive name (for example, **jenkins-logs**).\n3. Grant the service account the **Storage Object Creator** role on the GCS bucket you created in the previous step.\n4. 
Create an SSH key for your service account: [Create and delete service account keys](/iam/docs/keys-create-delete).\n5. Download a JSON key file for the service account.\n\n | **Note:** Keep this file secure. You will need it for the **Google OAuth Credentials** plugin to create credentials.\n\nInstall Google Cloud Storage plugin in Jenkins\n----------------------------------------------\n\n1. Go to **Manage Jenkins** \\\u003e **Plugins**.\n2. Select **Available plugins**.\n3. Search for the **Google Cloud Storage** plugin.\n4. Install the plugin and restart Jenkins if required.\n\nInstall Google OAuth Credentials Plugin in Jenkins\n--------------------------------------------------\n\n1. Go to **Manage Jenkins** \\\u003e **Plugins**.\n2. Select **Available plugins**\n3. Search for the **Google OAuth Credentials** plugin.\n4. Install the plugin and restart Jenkins if required.\n\nConfigure Jenkins to authenticate with Google Cloud\n---------------------------------------------------\n\n1. Go to **Manage Jenkins** \\\u003e **Credentials** \\\u003e **System**.\n\n | **Note:** You can use **Global Credentials** or add a new domain (recommended).\n2. Click add **Add Credentials**.\n\n3. **Kind** : select **Google Service Account from private key**.\n\n4. **Project name**: set a name for the credentials.\n\n5. Upload the JSON key file you obtained during the Google Cloud Service account creation.\n\n6. Click **Create**.\n\nConfigure Jenkins logs to upload Google SecOps\n----------------------------------------------\n\n1. In the Jenkins job configuration, add **Google Storage Build Log Upload** in post-build actions, with the following parameters:\n - **Google Credentials**: The name of your Google credentials you created in the previous step.\n - **Log Name**: The name of the file to store the Jenkins build log, under the specified storage path.\n - **Storage Location**: The name of the bucket where you want to upload your logs. The bucket must be accessible to the service account you created.\n2. Test the log upload.\n\nSet up feeds\n------------\n\nTo configure a feed, follow these steps:\n\n1. Go to **SIEM Settings** \\\u003e **Feeds**.\n2. Click **Add New Feed**.\n3. On the next page, click **Configure a single feed**.\n4. In the **Feed name** field, enter a name for the feed; for example, **Jenkins Logs**.\n5. Select **Google Cloud Storage V2** as the **Source type**.\n6. Select **Jenkins** as the **Log type**.\n7. Click **Get Service Account** as the **Chronicle Service Account**.\n8. Specify values for the following input parameters:\n\n - **Storage Bucket URI** : Google Cloud storage bucket URL in **`gs://my-bucket/\u003cvalue\u003e`** format.\n - **Source deletion options**: select deletion option according to your preference.\n\n | **Note:** If you select the `Delete transferred files` or `Delete transferred files and empty directories` option, make sure that you granted appropriate permissions to the service account. \\* **Maximum File Age**: Includes files modified in the last number of days. Default is 180 days.\n9. Click **Create Feed**.\n\nUDM Mapping Table\n-----------------\n\n**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)"]]