Last updated (UTC): 2025-09-04.

This page covers the following key points:

- This document provides instructions on how to collect Carbon Black EDR logs from both Cloud and on-premises environments, utilizing AWS S3 for log storage and Google SecOps for ingestion and analysis.
- The process includes configuring Carbon Black EDR to forward logs to an Amazon S3 bucket, which involves setting up an S3 bucket and granting the necessary permissions for the Carbon Black EDR server to write events.
- Google SecOps is then configured to ingest these Carbon Black EDR logs from the S3 bucket, with options for log type, region, access credentials, and data deletion upon transfer.
- The parser normalizes Carbon Black EDR logs from JSON, CSV, or syslog formats, extracting key fields such as network connections, process events, and file modifications, and maps them to the Unified Data Model (UDM).
- The document details how to handle various Carbon Black event types and offers a UDM Mapping Table to cross-reference the log fields with their corresponding UDM mappings.

# Collect Carbon Black EDR logs

Supported in: Google SecOps [SIEM](/chronicle/docs/secops/google-secops-siem-toc)

| **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms.
Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).

This document explains how to collect Carbon Black EDR logs from Cloud and on-premises environments using AWS S3. The parser extracts fields from JSON-, CSV-, or syslog-formatted messages, normalizes them, and maps them to the UDM. It handles various Carbon Black event types, including network connections, process events, file modifications, registry changes, and IOC hits, enriching the data with threat intelligence and device information where available.

Before you begin
----------------

Ensure that you have the following prerequisites:

- A Google SecOps instance.
- Privileged access to AWS IAM and S3.
- Privileged access to Cloud or on-premises Carbon Black EDR.

Configure Carbon Black EDR on-premises
--------------------------------------

### Configure the Amazon S3 bucket for on-premises

1. Create an **Amazon S3 bucket** following this user guide: [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket.html).
2. Save the bucket **Name** and **Region** for later use.
3. Create a user following this user guide: [Creating an IAM user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html#id_users_create_console).
4. Select the created **User**.
5. Select the **Security credentials** tab.
6. Click **Create Access Key** in the **Access Keys** section.
7. Select **Third-party service** as the **Use case**.
8. Click **Next**.
9. Optional: add a description tag.
10. Click **Create access key**.
11. Click **Download CSV file** to save the **Access Key** and **Secret Access Key** for later use.
12. Click **Done**.
13. 
Select the **Permissions** tab.
14. Click **Add permissions** in the **Permissions policies** section.
15. Select **Add permissions**.
16. Select **Attach policies directly**.
17. Search for and select the **AmazonS3FullAccess** policy.
18. Click **Next**.
19. Click **Add permissions**.

### Install cb-event-forwarder on the on-premises EDR server

1. Install the **CbOpenSource** repository if it isn't already present:

        cd /etc/yum.repos.d
        curl -O https://opensource.carbonblack.com/release/x86_64/CbOpenSource.repo

2. Install the **RPM** using **YUM**:

        yum install cb-event-forwarder

3. If you're using EDR 7.1.0 or later, run the following script to set the appropriate permissions needed by EDR:

        /usr/share/cb/integrations/event-forwarder/cb-edr-fix-permissions.sh

### Configure cb-event-forwarder to output JSON logs

1. Open the configuration file:

        sudo nano /etc/cb/integrations/event-forwarder/cb-event-forwarder.conf

2. Modify the following parameters:

        [event_forwarder]
        output_format=json # Enable JSON format
        output_type=s3 # Send logs to AWS S3
        s3_bucket_name=YOUR-S3-BUCKET-NAME
        s3_region=YOUR-S3-BUCKET-REGION
        s3_access_key_id=YOUR_AWS_ACCESS_KEY
        s3_secret_access_key=YOUR_AWS_SECRET_KEY
        s3_prefix=carbonblack/edr/logs

3. Save and exit: press Ctrl+X, then Y, and then Enter.
4. Enable and start cb-event-forwarder:

        sudo systemctl enable cb-event-forwarder
        sudo systemctl restart cb-event-forwarder
        sudo systemctl status cb-event-forwarder

Configure Carbon Black Cloud Event Forwarder for S3
---------------------------------------------------

### Create an AWS S3 bucket

1. Sign in to the AWS Management Console.
2. Ensure that the AWS region matches the region of the Event Forwarder:
    1. On the **AWS Console** page, locate the region.
    2. Use the drop-down to select the region of your Event Forwarder.
    3. 
The following list gives the applicable AWS Region for each Carbon Black EDR URL:
        - "instance-alias".my.carbonblack.io - Region: **US East (N. Virginia)** (us-east-1)
        - "instance-alias".my.cbcloud.de - Region: **Europe (Frankfurt)** (eu-central-1)
        - "instance-alias".my.cbcloud.sg - Region: **Asia Pacific (Singapore)** (ap-southeast-1)
3. Select **Services**.
4. Go to the **S3** console.
5. Click **Create bucket** to open the **Create bucket** wizard.
    1. In **Bucket name**, enter a unique name for your bucket (for example, **CB-EDR**).
    2. Ensure that **Region** defaults to the one you selected earlier.
    3. Update the **Block Public Access** defaults to allow public access (this is required for ingesting the logs into Google SecOps).
    4. Select **Create bucket**.

### Configure the S3 bucket to allow the Event Forwarder to write events

1. Create a **User** following this user guide: [Creating an IAM user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html#id_users_create_console).
2. Select the created **User**.
3. Select the **Security credentials** tab.
4. Click **Create Access Key** in the **Access Keys** section.
5. Select **Third-party service** as the **Use case**.
6. Click **Next**.
7. Optional: add a description tag.
8. Click **Create access key**.
9. Click **Download CSV file** to save the **Access Key** and **Secret Access Key** for later use.
10. Click **Done**.
11. Select the **Permissions** tab.
12. Click **Add permissions** in the **Permissions policies** section.
13. Select **Add permissions**.
14. Select **Attach policies directly**.
15. Search for and select the **AmazonS3FullAccess** policy.
16. Click **Next**.
17. Click **Add permissions**.

### Configure event forwarding in the EDR console

1. Sign in to VMware Carbon Black Cloud.
2. Go to the **Event Forwarder** tab.
3. Enable the events you would like to upload to S3.
4. 
Go to **Output and Type** and set the type to **S3**.
5. Provide the S3 bucket name in the following format: `<region>:<bucket-name>` (for example, `us-east-1:cb-edr`).
6. Select **upload AWS credentials** and provide the credentials file in INI format.
7. The following is an example of a profile:

        AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
        AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
        Default region name [None]: us-east-1

8. Click **Save** and restart the service for the changes to take effect.

Set up feeds
------------

To configure a feed, follow these steps:

1. Go to **SIEM Settings** > **Feeds**.
2. Click **Add New Feed**.
3. On the next page, click **Configure a single feed**.
4. In the **Feed name** field, enter a name for the feed (for example, **Carbon Black EDR Logs**).
5. Select **Amazon S3 V2** as the **Source type**.
6. Select **Carbon Black EDR** as the **Log type**.
7. Click **Next**.
8. Specify values for the following input parameters:

    - **S3 URI**: the bucket URI in the form `s3://BUCKET_NAME`. Replace `BUCKET_NAME` with the actual name of the bucket.
    - **Source deletion options**: select the deletion option according to your preference.

      | **Note:** If you select the `Delete transferred files` or `Delete transferred files and empty directories` option, make sure that you granted appropriate permissions to the service account.

    - **Maximum File Age**: includes files modified in the last number of days. The default is 180 days.
    - **Access Key ID**: the user access key with access to the S3 bucket.
    - **Secret Access Key**: the user secret key with access to the S3 bucket.

9. Click **Next**.
10. Review your new feed configuration in the **Finalize** screen, and then click **Submit**.

UDM Mapping Table
-----------------

**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)
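As a rough illustration of the normalization described earlier, the sketch below shows how a JSON network-connection (`netconn`) event emitted by cb-event-forwarder might be mapped to UDM-style fields. The event field names and the mapping choices here are assumptions for illustration only; they are not the authoritative schema or the actual Google SecOps parser logic.

```python
import json

# Illustrative cb-event-forwarder style netconn event. Field names are
# assumptions for this sketch, not an authoritative schema.
raw = """{
  "type": "ingress.event.netconn",
  "timestamp": 1725400000,
  "computer_name": "WIN-HOST-01",
  "process_guid": "00000001-0000-1a2b-01d8-000000000001",
  "local_ip": "10.0.0.5",
  "local_port": 52344,
  "remote_ip": "203.0.113.10",
  "remote_port": 443,
  "direction": "outbound"
}"""

def to_udm(event: dict) -> dict:
    """Map a netconn-style event to a UDM-like NETWORK_CONNECTION record.

    This mirrors the kind of normalization the parser performs; the exact
    UDM field choices below are illustrative, not the real mapping table.
    """
    return {
        "metadata": {
            "event_type": "NETWORK_CONNECTION",
            "product_name": "Carbon Black EDR",
        },
        "principal": {
            "hostname": event.get("computer_name"),
            "ip": event.get("local_ip"),
            "port": event.get("local_port"),
        },
        "target": {
            "ip": event.get("remote_ip"),
            "port": event.get("remote_port"),
        },
    }

udm = to_udm(json.loads(raw))
print(udm["metadata"]["event_type"])  # NETWORK_CONNECTION
print(udm["target"]["ip"])            # 203.0.113.10
```

A transformation in this shape (source field, destination UDM path) is what the UDM Mapping Table for this log type enumerates field by field.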