[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-09-04。"],[[["\u003cp\u003eThis guide outlines how to ingest AWS Security Hub logs into Google Security Operations (SecOps) to enhance security monitoring and threat detection.\u003c/p\u003e\n"],["\u003cp\u003eThe process involves configuring an Amazon S3 bucket, creating an IAM user with S3 access, and enabling EventBridge in AWS Security Hub to forward findings to the S3 bucket.\u003c/p\u003e\n"],["\u003cp\u003eGoogle SecOps users must create a new feed specifying the S3 bucket details, access keys, and log type (AWS Security Hub) to begin ingesting logs.\u003c/p\u003e\n"],["\u003cp\u003eThe document details the UDM (Unified Data Model) mapping table, showing how AWS Security Hub log fields are translated into the Google SecOps UDM format.\u003c/p\u003e\n"],["\u003cp\u003eThis process is a pre-GA offering and is subject to change, with users urged to consult Google's legal and technical support guidelines.\u003c/p\u003e\n"]]],[],null,["# Collect AWS Security Hub logs\n=============================\n\nSupported in: \nGoogle secops [SIEM](/chronicle/docs/secops/google-secops-siem-toc)\n| **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).\n\nThis document explains how to ingest AWS Security Hub logs to Google Security Operations. AWS Security Hub provides a comprehensive view of security alerts and findings across AWS accounts. By sending these findings to Google SecOps, you can use Google SecOps capabilities to enhance monitoring and threat detection.\n\nBefore you begin\n----------------\n\nEnsure you have the following prerequisites:\n\n- Google SecOps instance\n- Privileged access to AWS\n\nConfigure AWS IAM and S3\n------------------------\n\n1. Create an **Amazon S3 bucket** following this user guide: [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket.html).\n2. Save the bucket **Name** and **Region** for later use.\n3. Create a user following this user guide: [Creating an IAM user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html#id_users_create_console).\n4. Select the created **User**.\n5. Select the **Security credentials** tab.\n6. Click **Create Access Key** in the **Access Keys** section.\n7. Select **Third-party service** as the **Use case**.\n8. Click **Next**.\n9. Optional: add a description tag.\n10. Click **Create access key**.\n11. Click **Download CSV file** to save the **Access Key** and **Secret Access Key** for later use.\n12. Click **Done**.\n13. Select the **Permissions** tab.\n14. Click **Add permissions** in the **Permissions policies** section.\n15. Select **Add permissions**.\n16. Select **Attach policies directly**.\n17. 
Create a Lambda function
------------------------

1. Sign in to the [AWS Management Console](https://aws.amazon.com/console/).
2. Go to **Lambda**.
3. Click **Create Function** and select **Author from Scratch**.
4. Provide a name for your function; for example, `SecurityHubToS3`.
5. Choose **Python 3.x** for the runtime.
6. Enter the Lambda code that takes the findings from EventBridge and writes them to your S3 bucket:

       import json
       import boto3
       from datetime import datetime

       # Initialize the S3 client
       s3_client = boto3.client('s3')

       # S3 bucket where findings will be stored
       bucket_name = 'aws-security-hub-findings-stream'

       def lambda_handler(event, context):
           # Extract Security Hub findings from the event
           findings = event['detail']['findings']

           # Generate a timestamp for the file name to avoid overwriting;
           # the Lambda request ID keeps keys unique even when several
           # events arrive in the same second
           timestamp = datetime.now().strftime('%Y-%m-%dT%H-%M-%S')
           object_key = f"security_hub_findings_{timestamp}_{context.aws_request_id}.json"

           # Convert findings to JSON format
           findings_json = json.dumps(findings)

           # Upload the findings to S3
           try:
               response = s3_client.put_object(
                   Bucket=bucket_name,
                   Key=object_key,
                   Body=findings_json,
                   ContentType='application/json'
               )
               print(f"Successfully uploaded findings to S3: {response}")
           except Exception as e:
               print(f"Error uploading findings to S3: {e}")
               raise

           return {
               'statusCode': 200,
               'body': json.dumps('Successfully processed findings')
           }

   | **Note:** This code stores the findings as timestamped JSON files in your S3 bucket.

7. Set permissions for Lambda by adding an IAM role to the Lambda function with the following policy:

       {
           "Version": "2012-10-17",
           "Statement": [
               {
                   "Effect": "Allow",
                   "Action": [
                       "s3:PutObject"
                   ],
                   "Resource": "arn:aws:s3:::aws-security-hub-findings-stream/*"
               }
           ]
       }

Configure AWS Security Hub to forward findings with EventBridge
---------------------------------------------------------------

1. Sign in to the [AWS Management Console](https://aws.amazon.com/console/).
2. In the search bar, type and select **Security Hub** from the services list.
3. Click **Settings**.
4. Under the **Integrations** section, find **EventBridge** and click **Enable**.
5. In the search bar, type and select **EventBridge** from the services list.
6. In the EventBridge console, click **Rules > Create rule**.
7. Provide the following rule configuration:
   1. **Rule Name**: Provide a descriptive name for the rule; for example, **SendSecurityHubFindingsToS3**.
   2. **Event Source**: Select **AWS services**.
   3. **Service Name**: Choose **Security Hub**.
   4. **Event Type**: Select **Security Hub Findings**.
   5. **Set the Target**: Choose **Lambda function**.
   6. Select the Lambda function you created earlier (`SecurityHubToS3`).
8. Click **Create**.
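If you prefer to create the rule programmatically instead of through the console, a sketch along the following lines should work. The rule name and Lambda ARN are placeholder assumptions to replace with your own values; `Security Hub Findings - Imported` is the EventBridge detail type that corresponds to the **Security Hub Findings** event type in the console.

    import json
    import boto3

    events_client = boto3.client('events')
    lambda_client = boto3.client('lambda')

    # Placeholder values - replace with your rule name and function ARN.
    RULE_NAME = 'SendSecurityHubFindingsToS3'
    LAMBDA_ARN = 'arn:aws:lambda:us-east-1:123456789012:function:SecurityHubToS3'

    # Match findings that Security Hub publishes to EventBridge.
    event_pattern = {
        'source': ['aws.securityhub'],
        'detail-type': ['Security Hub Findings - Imported'],
    }

    # Create (or update) the rule.
    rule = events_client.put_rule(
        Name=RULE_NAME,
        EventPattern=json.dumps(event_pattern),
        State='ENABLED',
    )

    # Allow EventBridge to invoke the Lambda function.
    lambda_client.add_permission(
        FunctionName='SecurityHubToS3',
        StatementId='AllowEventBridgeInvoke',
        Action='lambda:InvokeFunction',
        Principal='events.amazonaws.com',
        SourceArn=rule['RuleArn'],
    )

    # Point the rule at the Lambda function.
    events_client.put_targets(
        Rule=RULE_NAME,
        Targets=[{'Id': 'SecurityHubToS3Target', 'Arn': LAMBDA_ARN}],
    )

`put_rule` and `put_targets` are idempotent, so re-running the script updates the rule in place; `add_permission` raises a `ResourceConflictException` if the statement ID already exists.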
Set up feeds
------------

There are two different entry points to set up feeds in the Google SecOps platform:

- **SIEM Settings > Feeds > Add New**
- **Content Hub > Content Packs > Get Started**

Set up the AWS Security Hub feed
--------------------------------

1. Click the **Amazon Cloud Platform** pack.
2. Locate the **AWS Security Hub** log type.
3. Specify the values in the following fields:

   - **Source Type**: Amazon SQS V2.
   - **Queue Name**: The SQS queue name to read from.

     | **Note:** The Amazon SQS V2 source type reads S3 object notifications from an SQS queue; make sure a queue exists and that your S3 bucket sends object-creation event notifications to it before you create the feed.
   - **S3 URI**: The bucket URI:
     - `s3://your-log-bucket-name/`
     - Replace `your-log-bucket-name` with the actual name of your S3 bucket.
   - **Source deletion options**: Select the deletion option according to your ingestion preferences.

     | **Note:** If you select the **Delete transferred files** or **Delete transferred files and empty directories** option, make sure that you granted appropriate permissions to the service account.
   - **Maximum File Age**: Include files modified in the last number of days. The default is 180 days.
   - **SQS Queue Access Key ID**: An account access key that is a 20-character alphanumeric string.
   - **SQS Queue Secret Access Key**: An account access key that is a 40-character alphanumeric string.

   **Advanced options**

   - **Feed Name**: A prepopulated value that identifies the feed.
   - **Asset Namespace**: The namespace associated with the feed.
   - **Ingestion Labels**: Labels applied to all events from this feed.
4. Click **Create feed**.

| **Note:** The Content Hub is not available on the standalone SIEM platform. To upgrade, contact your Google SecOps representative.

For more information about configuring multiple feeds for different log types within this product family, see [Configure feeds by product](/chronicle/docs/ingestion/ingestion-entities/configure-multiple-feeds).

UDM Mapping Table
-----------------

**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)