Collect Qualys Continuous Monitoring logs

This Logstash parser code first extracts fields such as source IP, user, method, and application protocol from raw log messages using grok patterns. It then maps specific fields from the raw log data to their corresponding fields in the Unified Data Model (UDM), performs data type conversions, and enriches the data with additional labels and metadata before structuring the output in the desired UDM format.
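The grok step can be illustrated with a plain Python regex. The log line and pattern below are hypothetical stand-ins for illustration only, not the parser's actual grok patterns:

```python
import re

# Hypothetical raw log line; the actual Qualys CM format may differ.
raw = '203.0.113.10 - alice "GET /user HTTP/1.1"'

# A Python re equivalent of a grok pattern along the lines of
# %{IP:src_ip} - %{USER:user} "%{WORD:method} %{URIPATH:path} %{WORD:protocol}/%{NUMBER}"
pattern = re.compile(
    r'(?P<src_ip>\d{1,3}(?:\.\d{1,3}){3}) - (?P<user>\S+) '
    r'"(?P<method>[A-Z]+) (?P<path>\S+) (?P<protocol>[A-Z]+)/[\d.]+"'
)

# Named groups play the role of grok field captures.
fields = pattern.match(raw).groupdict()
print(fields)
# {'src_ip': '203.0.113.10', 'user': 'alice', 'method': 'GET', 'path': '/user', 'protocol': 'HTTP'}
```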
Before you begin

Ensure that you have the following prerequisites:

Google Security Operations instance.
Privileged access to Google Cloud.
Privileged access to Qualys.
Enable Required APIs:

Sign in to the Google Cloud console.
Go to APIs & Services > Library.
Search for the following APIs and enable them:
Cloud Functions API
Cloud Scheduler API
Cloud Pub/Sub API (required for Cloud Scheduler to invoke functions)
Create a Google Cloud Storage Bucket

Sign in to the Google Cloud console, go to the Cloud Storage Buckets page, and click Create.
Configure the bucket:
Name: enter a unique name that meets the bucket name requirements (for example, qualys-asset-bucket).
Choose where to store your data: select a location.
Choose a storage class for your data: either select a default storage class for the bucket, or select Autoclass for automatic storage class management.
Choose how to control access to objects: select whether to enforce public access prevention, and select an access control model for your bucket's objects.
Storage class: choose based on your needs (for example, Standard).
Click Create.
Create a Google Cloud Service Account

Sign in to the Google Cloud console.
Go to IAM & Admin > Service Accounts.
Create a new service account.
Give it a descriptive name (for example, qualys-user).
Grant the service account the Storage Object Admin role on the GCS bucket you created in the previous step.
Grant the service account the Cloud Functions Invoker role.
Download a JSON key file for the service account and keep it secure.
Identify your specific Qualys API URL

Sign in to the Qualys console and go to Help > About.
Scroll to see this information under Security Operations Center (SOC).
Copy the Qualys API URL.
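The Cloud Function in the next section posts search requests to the `/qps/rest/2.0/search/cm/alert` path on this base URL. As a quick illustration (the base URL shown is a placeholder; substitute the API URL you copied):

```python
# Placeholder base URL; substitute the Qualys API URL you copied above.
QUALYS_BASE_URL = "https://qualysapi.qg1.apps.qualys.com"

# Endpoint used to search Continuous Monitoring alerts.
endpoint = f"{QUALYS_BASE_URL}/qps/rest/2.0/search/cm/alert"
print(endpoint)
# https://qualysapi.qg1.apps.qualys.com/qps/rest/2.0/search/cm/alert
```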
Configure the Cloud Function

Go to Cloud Functions in the Google Cloud console.
Click Create Function.
Configure the Function:
Name: enter a name for your function (for example, fetch-qualys-cm-alerts).
Region: select a region close to your Bucket.
Runtime: Python 3.10 (or your preferred runtime).
Trigger: choose HTTP trigger if needed, or Cloud Pub/Sub for scheduled execution.
Authentication: secure with authentication.
Write the Code with the inline editor:
Note: Make sure to replace the following with your data: `<bucket-name>`, `<qualys-username>`, `<qualys-password>`, `<qualys_base_url>`.

```python
from google.cloud import storage
import requests
import base64
import json

# Google Cloud Storage Configuration
BUCKET_NAME = "<bucket-name>"
FILE_NAME = "qualys_cm_alerts.json"

# Qualys API Credentials
QUALYS_USERNAME = "<qualys-username>"
QUALYS_PASSWORD = "<qualys-password>"
QUALYS_BASE_URL = "https://<qualys_base_url>"

def fetch_cm_alerts():
    """Fetch alerts from Qualys Continuous Monitoring."""
    auth = base64.b64encode(f"{QUALYS_USERNAME}:{QUALYS_PASSWORD}".encode()).decode()
    headers = {
        "Authorization": f"Basic {auth}",
        "Content-Type": "application/xml"
    }
    payload = """
    <ServiceRequest>
        <filters>
            <Criteria field="alert.date" operator="GREATER">2024-01-01</Criteria>
        </filters>
    </ServiceRequest>
    """
    response = requests.post(f"{QUALYS_BASE_URL}/qps/rest/2.0/search/cm/alert", headers=headers, data=payload)
    response.raise_for_status()
    return response.json()

def upload_to_gcs(data):
    """Upload data to Google Cloud Storage."""
    client = storage.Client()
    bucket = client.get_bucket(BUCKET_NAME)
    blob = bucket.blob(FILE_NAME)
    blob.upload_from_string(json.dumps(data, indent=2), content_type="application/json")

def main(request):
    """Cloud Function entry point."""
    try:
        alerts = fetch_cm_alerts()
        upload_to_gcs(alerts)
        return "Qualys CM alerts uploaded to Cloud Storage successfully!"
    except Exception as e:
        return f"An error occurred: {e}", 500
```
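Before deploying, you can sanity-check the Basic auth header the function builds. This standalone sketch uses throwaway example credentials, not real ones:

```python
import base64

# Throwaway example credentials for illustration only.
username, password = "api_user", "s3cret"

# Same encoding the Cloud Function applies to QUALYS_USERNAME/QUALYS_PASSWORD.
auth = base64.b64encode(f"{username}:{password}".encode()).decode()
header = f"Basic {auth}"
print(header)
# Basic YXBpX3VzZXI6czNjcmV0
```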
Click Deploy after completing the configuration.
Configure Cloud Scheduler

Go to Cloud Scheduler in the Google Cloud console.
Click Create Job.
Configure the Job:
Name: enter a name for your job (for example, trigger-fetch-qualys-cm-alerts).
Frequency: use cron syntax to specify the schedule (for example, 0 * * * * to run every hour).
Time Zone: set your preferred time zone.
Trigger Type: choose HTTP.
Trigger URL: enter the Cloud Function's URL (found in the function details after deployment).
Method: choose POST.
Create the job.
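The `0 * * * *` schedule fires at minute zero of every hour. A small sketch of the next firing time, using plain datetime arithmetic rather than any Cloud Scheduler API:

```python
from datetime import datetime, timedelta

def next_hourly_run(now: datetime) -> datetime:
    """Next firing time for the cron schedule '0 * * * *' (top of the next hour)."""
    return now.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)

print(next_hourly_run(datetime(2024, 1, 1, 10, 15)))
# 2024-01-01 11:00:00
```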
Set up feeds

To configure a feed, follow these steps:

Go to SIEM Settings > Feeds.
Click Add New Feed.
On the next page, click Configure a single feed.
In the Feed name field, enter a name for the feed; for example, Qualys Continuous Monitoring Logs.
Select Google Cloud Storage V2 as the Source type.
Select Qualys Continuous Monitoring as the Log type.
Click Next.
Specify values for the following input parameters:
Storage Bucket URI: the Google Cloud storage bucket source URI.
Source deletion options: select the deletion option according to your preference.
Click Next.
Review your new feed configuration on the Finalize screen, and then click Submit.
UDM Mapping Table

| Log field | UDM mapping | Logic |
| --- | --- | --- |
| Alert.alertInfo.appVersion | metadata.product_version | Directly mapped from Alert.alertInfo.appVersion |
| Alert.alertInfo.operatingSystem | principal.platform_version | Directly mapped from Alert.alertInfo.operatingSystem |
| Alert.alertInfo.port | additional.fields.value.string_value | Directly mapped from Alert.alertInfo.port and added as a key-value pair in additional.fields with the key "Alert port" |
| Alert.alertInfo.protocol | network.ip_protocol | Directly mapped from Alert.alertInfo.protocol |
| Alert.alertInfo.sslIssuer | network.tls.client.certificate.issuer | Directly mapped from Alert.alertInfo.sslIssuer |
| Alert.alertInfo.sslName | additional.fields.value.string_value | Directly mapped from Alert.alertInfo.sslName and added as a key-value pair in additional.fields with the key "SSL Name" |
| Alert.alertInfo.sslOrg | additional.fields.value.string_value | Directly mapped from Alert.alertInfo.sslOrg and added as a key-value pair in additional.fields with the key "SSL Org" |
| Alert.alertInfo.ticketId | additional.fields.value.string_value | Directly mapped from Alert.alertInfo.ticketId and added as a key-value pair in additional.fields with the key "Ticket Id" |
| Alert.alertInfo.vpeConfidence | additional.fields.value.string_value | Directly mapped from Alert.alertInfo.vpeConfidence and added as a key-value pair in additional.fields with the key "VPE Confidence" |
| Alert.alertInfo.vpeStatus | additional.fields.value.string_value | Directly mapped from Alert.alertInfo.vpeStatus and added as a key-value pair in additional.fields with the key "VPE Status" |
| Alert.eventType | additional.fields.value.string_value | Directly mapped from Alert.eventType and added as a key-value pair in additional.fields with the key "Event Type" |
| Alert.hostname | principal.hostname | Directly mapped from Alert.hostname |
| Alert.id | security_result.threat_id | Directly mapped from Alert.id |
| Alert.ipAddress | principal.ip | Directly mapped from Alert.ipAddress |
| Alert.profile.id | additional.fields.value.string_value | Directly mapped from Alert.profile.id and added as a key-value pair in additional.fields with the key "Profile Id" |
| Alert.profile.title | additional.fields.value.string_value | Directly mapped from Alert.profile.title and added as a key-value pair in additional.fields with the key "Profile Title" |
| Alert.qid | vulnerability.name | Mapped as "QID: " followed by the value of Alert.qid |
| Alert.source | additional.fields.value.string_value | Directly mapped from Alert.source and added as a key-value pair in additional.fields with the key "Alert Source" |
| Alert.triggerUuid | metadata.product_log_id | Directly mapped from Alert.triggerUuid |
| Alert.vulnCategory | additional.fields.value.string_value | Directly mapped from Alert.vulnCategory and added as a key-value pair in additional.fields with the key "Vulnerability Category" |
| Alert.vulnSeverity | vulnerability.severity | Mapped based on the value of Alert.vulnSeverity: 1-3: LOW, 4-6: MEDIUM, 7-8: HIGH |
| Alert.vulnTitle | vulnerability.description | Directly mapped from Alert.vulnTitle |
| Alert.vulnType | additional.fields.value.string_value | Directly mapped from Alert.vulnType and added as a key-value pair in additional.fields with the key "Vulnerability Type" |
| Host | principal.ip | Parsed from the "Host: " log line |
| | edr.client.ip_addresses | Copied from principal.ip |
| | edr.client.hostname | Copied from principal.hostname |
| | edr.raw_event_name | Set to "STATUS_UPDATE" if Alert.ipAddress, Alert.hostname, or src_ip is present; otherwise set to "GENERIC_EVENT" |
| | metadata.event_timestamp | Extracted from the Alert.eventDate or timestamp field. Alert.eventDate takes priority if present; otherwise timestamp is used. The timestamp is converted to UTC. |
| | metadata.event_type | Same logic as edr.raw_event_name |
| | metadata.log_type | Set to "QUALYS_CONTINUOUS_MONITORING" |
| | metadata.product_name | Set to "QUALYS_CONTINUOUS_MONITORING" |
| | metadata.vendor_name | Set to "QUALYS_CONTINUOUS_MONITORING" |
| | network.application_protocol | Parsed from the " /user HTTP" log line |
| | network.http.method | Parsed from the " /user HTTP" log line |
| timestamp | event.timestamp | Extracted from the Alert.eventDate or timestamp field. Alert.eventDate takes priority if present; otherwise timestamp is used. The timestamp is converted to UTC. |
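The severity and event-type logic from the table can be sketched in Python. These are hypothetical helpers written to mirror the table's description, not code from the parser itself:

```python
def map_severity(vuln_severity: int) -> str:
    """Mirror the table's Alert.vulnSeverity mapping: 1-3 LOW, 4-6 MEDIUM, 7-8 HIGH."""
    if 1 <= vuln_severity <= 3:
        return "LOW"
    if 4 <= vuln_severity <= 6:
        return "MEDIUM"
    if 7 <= vuln_severity <= 8:
        return "HIGH"
    return "UNKNOWN_SEVERITY"  # values outside 1-8 are not covered by the table

def map_event_type(alert: dict) -> str:
    """STATUS_UPDATE when a principal identifier is present, else GENERIC_EVENT."""
    if alert.get("ipAddress") or alert.get("hostname") or alert.get("src_ip"):
        return "STATUS_UPDATE"
    return "GENERIC_EVENT"

print(map_severity(5), map_event_type({"hostname": "web-01"}))
# MEDIUM STATUS_UPDATE
```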
Last updated 2025-09-04 UTC.