Collect Qualys asset context logs

This parser extracts asset context information from Qualys JSON logs and transforms it into the UDM format. It parses various fields such as ID, IP, hostname, cloud resource details, OS, and tags, mapping them to corresponding UDM fields and creating relationships between assets and resources. The parser also handles specific logic for cloud providers and operating systems, ensuring accurate representation in the UDM.

Before you begin

  • Ensure that you have a Google Security Operations instance.
  • Ensure that you have privileged access to Google Cloud.
  • Ensure that you have privileged access to Qualys.

Enable Required APIs:

  1. Sign in to the Google Cloud console.
  2. Go to APIs & Services > Library.
  3. Search for the following APIs and enable them:
    • Cloud Functions API
    • Cloud Scheduler API
    • Cloud Pub/Sub API (required for Cloud Scheduler to invoke functions)

Create a Google Cloud Storage Bucket

  1. Sign in to the Google Cloud console.
  2. Go to the Cloud Storage Buckets page.

    Go to Buckets

  3. Click Create.

  4. Configure the bucket:

    • Name: enter a unique name that meets the bucket name requirements (for example, qualys-asset-bucket).
    • Choose where to store your data: select a location.
    • Choose a storage class for your data: either select a default storage class for the bucket, or select Autoclass for automatic storage class management.
    • Choose how to control access to objects: select whether to enforce public access prevention, and select an access control model for your bucket's objects.
  5. Click Create.
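As an alternative to the console steps above, the bucket can be created programmatically. The sketch below is a minimal illustration, not the only way to do it: the name `qualys-asset-bucket` is a placeholder, the validation helper checks only a subset of the full bucket-naming rules, and `create_bucket` assumes Application Default Credentials with permission to create buckets.

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check a subset of the GCS bucket naming rules: 3-63 characters,
    lowercase letters, digits, hyphens, and underscores only,
    starting and ending with a letter or digit."""
    if not 3 <= len(name) <= 63:
        return False
    return re.fullmatch(r"[a-z0-9][a-z0-9_-]*[a-z0-9]", name) is not None

def create_bucket(name: str, location: str = "US") -> None:
    """Create the bucket with the google-cloud-storage client.

    Assumes Application Default Credentials are configured and have
    the storage.buckets.create permission."""
    from google.cloud import storage  # requires google-cloud-storage

    client = storage.Client()
    client.create_bucket(name, location=location)
```

For example, `is_valid_bucket_name("qualys-asset-bucket")` returns `True`, while a name with uppercase letters is rejected.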

Create a Google Cloud Service Account

  1. Go to IAM & Admin > Service Accounts.
  2. Create a new service account.
  3. Give it a descriptive name (for example, qualys-user).
  4. Grant the service account the Storage Object Admin role on the Cloud Storage bucket you created in the previous step.
  5. Grant the service account the Cloud Functions Invoker role.
  6. Create a JSON key for the service account.
  7. Download the JSON key file. Keep this file secure.

Optional: Create a dedicated API User in Qualys

  1. Sign in to the Qualys console.
  2. Go to Users.
  3. Click New > User.
  4. Enter the General Information required for the user.
  5. Select the User Role tab.
  6. Make sure the role has the API Access checkbox selected.
  7. Click Save.

Identify your specific Qualys API URL

Option 1

Identify your API URL as described in the Qualys platform identification documentation.

Option 2

  1. Sign in to the Qualys console.
  2. Go to Help > About.
  3. Scroll to see this information under Security Operations Center (SOC).
  4. Copy the Qualys API URL.
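The API base URL differs per Qualys platform. The mapping below lists a few commonly documented examples for illustration only; treat them as assumptions and verify the URL for your own subscription under Help > About.

```python
# Example Qualys API base URLs per platform (illustrative; verify in
# Help > About, since your subscription may be on a different platform).
QUALYS_API_URLS = {
    "US Platform 1": "https://qualysapi.qualys.com",
    "US Platform 2": "https://qualysapi.qg2.apps.qualys.com",
    "US Platform 3": "https://qualysapi.qg3.apps.qualys.com",
    "EU Platform 1": "https://qualysapi.qualys.eu",
}

def api_url_for(platform: str) -> str:
    """Look up the API base URL for a named platform."""
    return QUALYS_API_URLS[platform]
```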

Configure the Cloud Function

  1. Go to Cloud Functions in the Google Cloud console.
  2. Click Create Function.
  3. Configure the Function:

    • Name: enter a name for your function (for example, fetch-qualys-assets).
    • Region: select a region close to your Bucket.
    • Trigger: choose HTTP trigger if needed or Cloud Pub/Sub for scheduled execution.
    • Authentication: require authentication so that only authorized callers (such as Cloud Scheduler) can invoke the function.
    • Write the code in the inline editor:
    ```python
    from google.cloud import storage
    import requests
    import base64
    import json
    
    # Cloud Storage configuration
    BUCKET_NAME = "<bucket-name>"
    FILE_NAME = "qualys_assets.json"
    
    # Qualys API credentials
    QUALYS_USERNAME = "<qualys-username>"
    QUALYS_PASSWORD = "<qualys-password>"
    QUALYS_BASE_URL = "https://<qualys_base_url>"
    
    def fetch_qualys_assets():
        auth = base64.b64encode(f"{QUALYS_USERNAME}:{QUALYS_PASSWORD}".encode()).decode()
        headers = {
            "Authorization": f"Basic {auth}",
            "Content-Type": "application/xml",
            # Ask Qualys for a JSON response; the QPS API returns XML by default.
            "Accept": "application/json"
        }
        payload = """
        <ServiceRequest>
            <filters>
                <Criteria field="asset.name" operator="LIKE">%</Criteria>
            </filters>
        </ServiceRequest>
        """
        response = requests.post(
            f"{QUALYS_BASE_URL}/qps/rest/2.0/search/am/asset",
            headers=headers,
            data=payload,
            timeout=120,
        )
        # Fail fast on HTTP errors instead of parsing an error body as JSON.
        response.raise_for_status()
        return response.json()
    
    def upload_to_gcs(data):
        client = storage.Client()
        bucket = client.get_bucket(BUCKET_NAME)
        blob = bucket.blob(FILE_NAME)
        blob.upload_from_string(json.dumps(data), content_type="application/json")
    
    def main(request):
        assets = fetch_qualys_assets()
        upload_to_gcs(assets)
        return "Data uploaded to Cloud Storage successfully!"
    
    ```
    
  4. Click Deploy after completing the configuration.
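Before deploying, you can sanity-check the Basic authentication header that the function builds. The standalone sketch below mirrors the header logic from the function and runs locally with no Google Cloud dependencies:

```python
import base64

def build_auth_header(username: str, password: str) -> str:
    """Build the HTTP Basic auth value the same way the function does."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

# Round-trip check: decoding the token should recover "user:pass".
header = build_auth_header("user", "pass")
decoded = base64.b64decode(header.split(" ", 1)[1]).decode()
```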

Configure Cloud Scheduler

  1. Go to Cloud Scheduler in the Google Cloud console.
  2. Click Create Job.
  3. Configure the Job:

    • Name: enter a name for your job (for example, trigger-fetch-qualys-assets).
    • Frequency: use cron syntax to specify the schedule (for example, 0 0 * * * for daily at midnight).
    • Time Zone: set your preferred time zone.
    • Trigger Type: select HTTP.
    • Trigger URL: enter the Cloud Function's URL (found in the function details after deployment).
    • Method: select POST.
    • Auth header: select Add OIDC token and specify the service account that has the Cloud Functions Invoker role, so that Cloud Scheduler can invoke the authenticated function.
  4. Create the job.

Configure a feed in Google SecOps to ingest Qualys Asset Context logs

  1. Go to SIEM Settings > Feeds.
  2. Click Add new.
  3. In the Feed name field, enter a name for the feed (for example, Qualys Asset Context Logs).
  4. Select Google Cloud Storage as the Source type.
  5. Select Qualys Asset Context as the Log type.
  6. Click Next.
  7. Specify values for the following input parameters:

    • GCS URI: the Cloud Storage URI.
    • URI is a: select Single file.
    • Source deletion options: select the deletion option according to your preference.
    • Asset namespace: the asset namespace.
    • Ingestion labels: the label to be applied to the events from this feed.
  8. Click Next.

  9. Review your new feed configuration in the Finalize screen, and then click Submit.
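The GCS URI for the feed follows the `gs://<bucket>/<object>` form. With the example names used earlier in this guide (`qualys-asset-bucket` and `qualys_assets.json`, both placeholders), it can be derived as:

```python
BUCKET_NAME = "qualys-asset-bucket"  # placeholder; use your bucket name
FILE_NAME = "qualys_assets.json"     # matches the Cloud Function's FILE_NAME

# Cloud Storage URI to paste into the feed's GCS URI field.
gcs_uri = f"gs://{BUCKET_NAME}/{FILE_NAME}"
```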

UDM Mapping Table

| Log Field | UDM Mapping | Logic |
| --- | --- | --- |
| ASSET_ID | entity.entity.asset.asset_id | Directly mapped from the ASSET_ID field. |
| CLOUD_PROVIDER | entity.relations.entity.resource.resource_subtype | Directly mapped from the CLOUD_PROVIDER field. |
| CLOUD_PROVIDER_TAGS.CLOUD_TAG[].NAME | entity.relations.entity.resource.attribute.labels.key | Directly mapped from the CLOUD_PROVIDER_TAGS.CLOUD_TAG[].NAME field. |
| CLOUD_PROVIDER_TAGS.CLOUD_TAG[].VALUE | entity.relations.entity.resource.attribute.labels.value | Directly mapped from the CLOUD_PROVIDER_TAGS.CLOUD_TAG[].VALUE field. |
| CLOUD_RESOURCE_ID | entity.relations.entity.resource.id | Directly mapped from the CLOUD_RESOURCE_ID field. |
| CLOUD_SERVICE | entity.relations.entity.resource.resource_type | If CLOUD_SERVICE is "VM", the value is set to "VIRTUAL_MACHINE". |
| DNS_DATA.HOSTNAME | entity.entity.asset.hostname | Directly mapped from the DNS_DATA.HOSTNAME field. |
| EC2_INSTANCE_ID | entity.relations.entity.resource.product_object_id | Directly mapped from the EC2_INSTANCE_ID field. |
| ID | entity.entity.asset.product_object_id | Directly mapped from the ID field. |
| IP | entity.entity.asset.ip | Directly mapped from the IP field. |
| METADATA.AZURE.ATTRIBUTE[].NAME | entity.relations.entity.resource.attribute.labels.key | Directly mapped from the METADATA.AZURE.ATTRIBUTE[].NAME field. |
| METADATA.AZURE.ATTRIBUTE[].VALUE | entity.relations.entity.resource.attribute.labels.value | Directly mapped from the METADATA.AZURE.ATTRIBUTE[].VALUE field. |
| OS | entity.entity.asset.platform_software.platform | If OS contains "windows" (case-insensitive), the value is set to "WINDOWS". |
| TAGS.TAG[].NAME | entity.relations.entity.resource.attribute.labels.key | Directly mapped from the TAGS.TAG[].NAME field. |
| TAGS.TAG[].TAG_ID | entity.relations.entity.resource.attribute.labels.value | Concatenated string "TAG_ID: " with the value of TAGS.TAG[].TAG_ID. |
| | entity.metadata.collected_timestamp | Copied from the create_time field of the raw log. |
| | entity.metadata.entity_type | Hardcoded to "ASSET". |
| | entity.metadata.product_name | Hardcoded to "QUALYS ASSET CONTEXT". |
| | entity.metadata.vendor_name | Hardcoded to "QUALYS ASSET CONTEXT". |
| | entity.relations.entity_type | Hardcoded to "RESOURCE". |
| | entity.relations.relationship | Hardcoded to "MEMBER". |
| | entity.timestamp | Copied from the create_time field of the raw log. |
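The conditional rows in the mapping table (CLOUD_SERVICE, OS, and the TAG_ID concatenation) can be illustrated in Python. This is a simplified sketch of the documented behavior, not the parser itself; the helper names are hypothetical.

```python
from typing import Optional

def map_resource_type(cloud_service: str) -> Optional[str]:
    """A CLOUD_SERVICE of "VM" becomes the UDM resource_type "VIRTUAL_MACHINE"."""
    return "VIRTUAL_MACHINE" if cloud_service == "VM" else None

def map_platform(os_name: str) -> Optional[str]:
    """An OS string containing "windows" (case-insensitive) maps to "WINDOWS"."""
    return "WINDOWS" if "windows" in os_name.lower() else None

def tag_label_value(tag_id: str) -> str:
    """TAGS.TAG[].TAG_ID is prefixed with "TAG_ID: " in the label value."""
    return f"TAG_ID: {tag_id}"
```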

Changes

2023-08-01

  • Mapped "DNS_DATA.HOSTNAME" to "entity.entity.asset.hostname".

2023-07-18

  • Newly created parser.