Collect Akamai DNS logs

Supported in: Google SecOps SIEM

Note: This feature is covered by Pre-GA Offerings Terms of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the Google SecOps Technical Support Service guidelines and the Google SecOps Service Specific Terms.

This parser processes Akamai DNS logs. It extracts fields such as timestamps, source IP and port, query name, DNS record type, and response details, then maps them to the Unified Data Model (UDM), handling various DNS record types and potential SPF records. The parser classifies each event as either NETWORK_DNS or GENERIC_EVENT based on the presence of principal information.
Before you begin
Ensure that you have the following prerequisites:
Google SecOps instance.
Privileged access to AWS IAM and S3.
Your Akamai account has access to the Log Delivery Service.
Configure an Amazon S3 bucket
Create an Amazon S3 bucket following this user guide: Creating a bucket
Save the bucket Name and Region for future reference.
Create a User following this user guide: Creating an IAM user.
Select the created User.
Select the Security credentials tab.
Click Create Access Key in the Access Keys section.
Select Third-party service as the Use case.
Click Next.
Optional: Add a description tag.
Click Create access key.
Click Download .csv file and save the Access Key and Secret Access Key for future reference.
Click Done.
Select the Permissions tab.
Click Add permissions in the Permissions policies section.
Select Add permissions.
Select Attach policies directly.
Search for and select the AmazonS3FullAccess policy.
Click Next.
Click Add permissions.
Configure Log Delivery Service in Akamai
Sign in to the Akamai Control Center.
Go to Log Delivery Service under Data Services.
Click Add New Configuration.
In the Configuration Name field, provide a name for your configuration (for example, Edge DNS Logs to S3).
Select Edge DNS as the Log Source.
Select AWS S3 as the Delivery Target.
Provide the following details:
  Bucket Name: the name of your S3 bucket.
  Region: the AWS region where your bucket is hosted.
  Access Key ID: the IAM user Access Key ID.
  Secret Access Key: the IAM user Secret Access Key.
  Optional: specify the Directory Structure (for example, logs/akamai-dns/YYYY/MM/DD/HH/).
  Optional: set the File Naming Convention (for example, edge-dns-logs-{timestamp}.log).
Select the Log Formats you want to include: DNS Queries and DNS Responses.
Choose the Delivery Frequency: hourly, daily, or upon reaching a certain file size (for example, 100 MB).
Optional: Click Add Filters to include or exclude specific logs based on specific criteria (for example, hostname or record type).
Review the configuration details and click Save and Activate.
Set up feeds
To configure a feed, follow these steps:
Go to SIEM Settings > Feeds.
Click Add New Feed.
On the next page, click Configure a single feed.
In the Feed name field, enter a name for the feed (for example, Akamai DNS Logs).
Select Amazon S3 as the Source type.
Select Akamai DNS as the Log type.
Click Next.
Specify values for the following input parameters:
  Region: the region where the Amazon S3 bucket is located.
  S3 URI: the bucket URI, in the form s3://BUCKET_NAME. Replace BUCKET_NAME with the name of the bucket.
  URI is a: select the URI type according to the log stream configuration (Single file, Directory, or Directory which includes subdirectories).
  Source deletion option: select a deletion option according to your preference. Note: if you select Delete transferred files or Delete transferred files and empty directories, make sure that you have granted appropriate permissions to the service account.
  Access Key ID: the user access key with access to the S3 bucket.
  Secret Access Key: the user secret key with access to the S3 bucket.
  Asset namespace: the asset namespace.
  Ingestion labels: the label to be applied to the events from this feed.
Click Next.
Review your new feed configuration in the Finalize screen, and then click Submit.
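Before submitting the feed, it can help to sanity-check that the value you enter for S3 URI matches the `s3://BUCKET_NAME` form the feed expects. A minimal sketch, assuming standard S3 bucket-naming constraints (3-63 characters, lowercase letters, digits, hyphens, and dots); the `parse_s3_uri` helper is illustrative, not a SecOps or AWS API:

```python
import re

def parse_s3_uri(uri: str) -> str:
    """Validate an 's3://BUCKET_NAME' or 's3://BUCKET_NAME/prefix/' URI
    and return the bucket name. Raises ValueError on malformed input."""
    m = re.fullmatch(r"s3://([a-z0-9][a-z0-9.-]{1,61}[a-z0-9])(/.*)?", uri)
    if not m:
        raise ValueError(f"not a valid S3 URI: {uri!r}")
    return m.group(1)
```

For example, `parse_s3_uri("s3://my-akamai-dns-logs/2024/")` returns the bucket name `my-akamai-dns-logs`, while a bare bucket name without the `s3://` scheme raises an error.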
UDM Mapping Table

| Log Field | UDM Mapping | Logic |
| --- | --- | --- |
| class | read_only_udm.network.dns.questions.class | If class is "IN", set to 1. Otherwise, attempt conversion to unsigned integer. |
| column11 | read_only_udm.target.hostname | Mapped if it contains a hostname and doesn't contain specific patterns such as "ip4", "=", ".net", or "10 mx0". Also used for extracting IP addresses, email addresses, and DNS authority data based on various patterns. |
| column11 | read_only_udm.target.ip | Extracted from column11 if it matches the pattern for IP addresses within SPF records. |
| column11 | read_only_udm.target.user.email_addresses | Extracted from column11 if it matches the pattern for email addresses within DMARC records. |
| column11 | read_only_udm.network.dns.authority.data | Extracted from column11 if it matches patterns for domain names within various record types. |
| column11 | read_only_udm.network.dns.response_code | Set to 3 if column11 contains "NXDOMAIN". |
| column2 | read_only_udm.principal.ip | Mapped if it is a valid IP address. |
| column3 | read_only_udm.principal.port | Mapped if it is a valid integer. |
| column4 | read_only_udm.network.dns.questions.name | Directly mapped. |
| column6 | read_only_udm.network.dns.questions.type | Mapped based on the value of type, using conditional logic to assign the corresponding numerical value. |
| column8 | read_only_udm.network.sent_bytes | Converted to an unsigned integer and mapped. |
|  | read_only_udm.metadata.event_timestamp | Constructed from the date and time fields extracted from column1. |
|  | read_only_udm.event_type | Set to NETWORK_DNS if principal.ip is present; otherwise set to GENERIC_EVENT. |
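The conditional logic in the table can be summarized in a short Python sketch. This is an illustration only: the actual parser runs inside Google SecOps and the helper names below are invented. The numeric values, however, are the standard DNS ones (RFC 1035 resource record type codes; RCODE 3 is NXDOMAIN), and the SPF pattern is an assumption about what the parser matches:

```python
import re

# Standard IANA DNS record type codes (RFC 1035 and successors); the
# parser maps the textual type in column6 to values like these.
DNS_TYPE_CODES = {
    "A": 1, "NS": 2, "CNAME": 5, "SOA": 6, "PTR": 12,
    "MX": 15, "TXT": 16, "AAAA": 28, "SRV": 33,
}

def map_question_class(value: str):
    """'IN' maps to 1; anything else is converted to an unsigned integer."""
    if value == "IN":
        return 1
    try:
        return int(value)
    except ValueError:
        return None  # conversion failed; the field is left unset

def map_question_type(value: str):
    """Look up the numeric code for a textual record type such as 'AAAA'."""
    return DNS_TYPE_CODES.get(value.upper())

def map_response_code(column11: str):
    """RCODE 3 (NXDOMAIN) when the response data mentions NXDOMAIN."""
    return 3 if "NXDOMAIN" in column11 else None

def extract_spf_ips(column11: str):
    """Pull ip4:/ip6: values out of SPF data such as
    'v=spf1 ip4:192.0.2.0/24 -all' (the pattern is an assumption)."""
    return re.findall(r"ip[46]:([^\s]+)", column11)

def classify_event(principal_ip):
    """NETWORK_DNS requires principal information; otherwise GENERIC_EVENT."""
    return "NETWORK_DNS" if principal_ip else "GENERIC_EVENT"
```

For example, `map_question_type("AAAA")` yields 28 and `classify_event(None)` yields "GENERIC_EVENT", matching the last row of the table.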
Need more help? Get answers from Community members and Google SecOps professionals.

Last updated 2025-08-29 UTC.