# Collect Akamai DataStream 2 logs

Supported in: Google SecOps [SIEM](/chronicle/docs/secops/google-secops-siem-toc)

Last updated 2025-09-07 UTC.

> **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).

This document explains how to ingest Akamai DataStream 2 logs to Google Security Operations using Amazon S3.

## Before you begin

Make sure you have the following prerequisites:

- A Google SecOps instance
- Privileged access to **Akamai Control Center** (DataStream 2 configuration access)
- Privileged access to **AWS** (S3, IAM)

## Configure the AWS S3 bucket and IAM for Google SecOps

1. Create an **Amazon S3 bucket** following this user guide: [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket-overview.html)
2. Save the bucket **Name** and **Region** for future reference (for example, `akamai-datastream-2-logs`).
3. Create a user following this user guide: [Creating an IAM user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html#id_users_create_console).
4. Select the created **User**.
5. Select the **Security credentials** tab.
6. In the **Access keys** section, click **Create access key**.
7. Select **Third-party service** as the **Use case**.
8. Click **Next**.
9. Optional: Add a description tag.
10. Click **Create access key**.
11. Click **Download CSV file** to save the **Access key** and **Secret access key** for later use.
12. Click **Done**.
13. Select the **Permissions** tab.
14. In the **Permissions policies** section, click **Add permissions**.
15. Select **Attach policies directly**.
16. Search for and select the **AmazonS3FullAccess** policy.
17. Click **Next**.
18. Click **Add permissions**.

## Configure the IAM policy and user for S3 uploads

1. In the **AWS console**, go to **IAM > Policies > Create policy > JSON tab**.
2. Enter the following policy:

       {
         "Version": "2012-10-17",
         "Statement": [
           {
             "Sid": "AllowAkamaiWriteToS3",
             "Effect": "Allow",
             "Action": ["s3:PutObject"],
             "Resource": "arn:aws:s3:::akamai-datastream-2-logs/akamai/datastream2/json/*"
           }
         ]
       }

   - Replace `akamai-datastream-2-logs` if you entered a different bucket name.
3. Click **Next > Create policy**.
4. Go to **IAM > Users > Create user**.
5. Name the user `akamai-datastream-writer`.
6. Attach the newly created policy.
7. Create access keys for this user to use in the Akamai DataStream 2 configuration.

## Configure Akamai DataStream 2 to deliver logs to Amazon S3

1. In **Akamai Control Center**, go to **DataStream 2**.
2. Click **Create a stream**.
3. Select the **log type** appropriate for your property (for example, **Delivery**, **Edge DNS**, or **GTM**).
4. In **Data sets**, select the fields you require. Keep the defaults unless you have a specific need.
5. Go to **Delivery > Destination** and select **Amazon S3**.
6. Fill in the S3 destination details using the newly created bucket:
   - **Bucket**: `akamai-datastream-2-logs`
   - **Folder path**: `akamai/datastream2/json/`
   - **Region**: Your bucket region
   - **Access key ID**: The access key created earlier
   - **Secret access key**: The secret access key created earlier
7. Set **Log format** to **JSON**.
8. Optional: In **Delivery options**, set **Push frequency** to **30 seconds**.
9. Click **Validate & Save > Next > Activate**.

## Optional: Create a read-only IAM user and keys for Google SecOps

1. Go to **AWS Console > IAM > Users > Add users**.
2. Click **Add users**.
3. Provide the following configuration details:
   - **User**: Enter `secops-reader`.
   - **Access type**: Select **Access key - Programmatic access**.
4. Click **Create user**.
5. Attach a minimal custom read policy: go to **Users > secops-reader > Permissions > Add permissions > Attach policies directly > Create policy**.
6. In the JSON editor, enter the following policy:

       {
         "Version": "2012-10-17",
         "Statement": [
           {
             "Effect": "Allow",
             "Action": ["s3:GetObject"],
             "Resource": "arn:aws:s3:::akamai-datastream-2-logs/*"
           },
           {
             "Effect": "Allow",
             "Action": ["s3:ListBucket"],
             "Resource": "arn:aws:s3:::akamai-datastream-2-logs"
           }
         ]
       }

7. Set the name to `secops-reader-policy`.
8. Go to **Create policy > search/select > Next > Add permissions**.
9. Go to **Security credentials > Access keys > Create access key**.
10. Download the **CSV** (these values are entered into the feed).

## Configure a feed in Google SecOps to ingest Akamai DataStream 2 logs

1. Go to **SIEM Settings > Feeds**.
2. Click **+ Add New Feed**.
3. In the **Feed name** field, enter a name for the feed (for example, `Akamai DataStream 2 logs`).
4. Select **Amazon S3 V2** as the **Source type**.
5. Select **Akamai DataStream 2** as the **Log type**.
6. Click **Next**.
7. Specify values for the following input parameters:
   - **S3 URI**: `s3://akamai-datastream-2-logs/akamai/datastream2/json/`
   - **Source deletion options**: Select the deletion option according to your preference.
   - **Maximum File Age**: Include files modified in the last number of days. The default is 180 days.
   - **Access Key ID**: The user access key with access to the S3 bucket.
   - **Secret Access Key**: The user secret key with access to the S3 bucket.
   - **Asset namespace**: The [asset namespace](/chronicle/docs/investigation/asset-namespaces).
   - **Ingestion labels**: The label applied to the events from this feed.
8. Click **Next**.
9. Review your new feed configuration in the **Finalize** screen, and then click **Submit**.

**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)
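If you manage AWS with boto3 rather than the console, the write-only policy from the steps above can be generated and attached programmatically. This is a minimal sketch under stated assumptions, not part of the official procedure: the bucket, prefix, and user names follow the examples in this document, the attach step uses an inline policy (the console steps create a standalone managed policy instead), and `attach_writer_policy` requires boto3 plus administrator credentials, so it is defined but not called here.

```python
import json

def writer_policy(bucket: str, prefix: str) -> dict:
    """Build the S3 write-only policy shown in the console steps above."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowAkamaiWriteToS3",
                "Effect": "Allow",
                "Action": ["s3:PutObject"],
                # Restricts writes to the DataStream 2 delivery prefix only.
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}*",
            }
        ],
    }

def attach_writer_policy(user: str, bucket: str, prefix: str) -> None:
    """Attach the policy inline to the writer user.

    Assumption: run by an administrator with IAM write access, not by
    the Akamai or Google SecOps principals.
    """
    import boto3  # requires boto3 and configured admin credentials

    iam = boto3.client("iam")
    iam.put_user_policy(
        UserName=user,
        PolicyName="akamai-datastream-writer-policy",
        PolicyDocument=json.dumps(writer_policy(bucket, prefix)),
    )
```

A call such as `attach_writer_policy("akamai-datastream-writer", "akamai-datastream-2-logs", "akamai/datastream2/json/")` would mirror steps 4-6 of the IAM policy section.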
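Before submitting the feed, you may want to confirm that the `secops-reader` keys can actually list and read objects under the configured S3 URI. The sketch below is an illustrative helper, not part of the official procedure: `split_s3_uri` and `check_reader_access` are hypothetical names, and the access check requires boto3 with the read-only credentials in the environment, so it is defined but not called here.

```python
from urllib.parse import urlparse

def split_s3_uri(uri: str) -> tuple[str, str]:
    """Split an s3:// URI, as entered in the feed, into (bucket, key prefix)."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")

def check_reader_access(uri: str) -> int:
    """Count objects under the feed prefix using the secops-reader keys.

    Exercises the same s3:ListBucket permission the feed relies on.
    Assumption: boto3 is installed and the read-only keys are configured.
    """
    import boto3

    bucket, prefix = split_s3_uri(uri)
    s3 = boto3.client("s3")
    total = 0
    for page in s3.get_paginator("list_objects_v2").paginate(
        Bucket=bucket, Prefix=prefix
    ):
        total += page.get("KeyCount", 0)
    return total
```

A non-zero count from `check_reader_access("s3://akamai-datastream-2-logs/akamai/datastream2/json/")` indicates the stream is delivering and the reader keys work; zero after the stream is active usually points at a wrong prefix or bucket name.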
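When troubleshooting, it can help to download an object from the bucket and inspect it locally. This sketch assumes, per the **JSON** log format selected above, that each delivered object contains one JSON record per line; the exact field names depend on the data sets you selected in the stream, so the parser below stays generic and the sample fields in the usage note are illustrative only.

```python
import json
from typing import IO, Iterator

def parse_datastream_lines(stream: IO[bytes]) -> Iterator[dict]:
    """Yield one record per non-empty line of a DataStream 2 JSON log object.

    Pass a file opened in binary mode; for gzip-compressed objects,
    open the file with gzip.open() first.
    """
    for raw in stream:
        line = raw.decode("utf-8").strip()
        if line:
            yield json.loads(line)
```

For example, `list(parse_datastream_lines(open(path, "rb")))` returns the records of a downloaded object as dictionaries, which you can then spot-check against the data sets configured in the stream.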