# Collect Qualys asset context logs

Supported in: Google SecOps [SIEM](/chronicle/docs/secops/google-secops-siem-toc)

> **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions.

This parser extracts asset context information from Qualys JSON logs and transforms it into the Unified Data Model (UDM) format. It parses fields such as ID, IP, hostname, cloud resource details, operating system, and tags, maps them to the corresponding UDM fields, and creates relationships between assets and resources. The parser also handles provider- and OS-specific logic to ensure an accurate representation in the UDM.
## Before you begin

Ensure that you have the following prerequisites:

- Google Security Operations instance.
- Privileged access to Google Cloud.
- Privileged access to Qualys.
## Enable required APIs

1. Sign in to the Google Cloud console.
2. Go to **APIs & Services** > **Library**.
3. Search for the following APIs and enable them:
   - Cloud Functions API
   - Cloud Scheduler API
   - Cloud Pub/Sub (required for Cloud Scheduler to invoke functions)
## Create a Google Cloud Storage bucket

1. Sign in to the Google Cloud console.
2. Go to the [**Cloud Storage Buckets**](https://console.cloud.google.com/storage/browser) page.
3. Click **Create**.
4. Configure the bucket:
   - **Name**: enter a unique name that meets the bucket name requirements (for example, **qualys-asset-bucket**).
   - **Choose where to store your data**: select a location.
   - **Choose a storage class for your data**: either select a **default storage class** for the bucket, or select **Autoclass** for automatic storage class management.
   - **Choose how to control access to objects**: select whether to enforce **public access prevention**, and select an **access control model** for your bucket's objects.
   - **Storage class**: choose based on your needs (for example, **Standard**).
5. Click **Create**.

> **Note:** Do not set a retention policy, because the last data entry may need to be overwritten in case of a timeout.
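The bucket name requirements mentioned above can be checked programmatically before you create the bucket. The sketch below is a simplified approximation of the Cloud Storage naming rules, not the full specification:

```python
import re

# Simplified approximation of Cloud Storage bucket naming rules:
# 3-63 characters; lowercase letters, digits, hyphens, underscores,
# and dots; must start and end with a letter or a number.
# (The full rules also restrict dotted names, "goog" prefixes, and
# names that look like IP addresses.)
_BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]$")

def looks_like_valid_bucket_name(name: str) -> bool:
    return bool(_BUCKET_NAME_RE.match(name))
```

For example, `looks_like_valid_bucket_name("qualys-asset-bucket")` passes, while a name containing uppercase letters does not.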
## Create a Google Cloud service account

1. Go to **IAM & Admin** > **Service Accounts**.
2. Create a new service account.
3. Give it a descriptive name (for example, **qualys-user**).
4. Grant the service account the **Storage Object Admin** role on the Cloud Storage bucket you created in the previous step.
5. Grant the service account the **Cloud Functions Invoker** role.
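The role grants above can also be scripted rather than done in the console. The following sketch assembles the equivalent `gcloud` commands as strings; the project ID, service account name, and bucket name are placeholder assumptions, so substitute your own values before running the commands:

```python
PROJECT_ID = "my-project"        # placeholder project ID
SA_NAME = "qualys-user"          # service account name from the steps above
BUCKET = "qualys-asset-bucket"   # bucket created earlier
SA_EMAIL = f"{SA_NAME}@{PROJECT_ID}.iam.gserviceaccount.com"

def iam_binding_commands() -> list[str]:
    # Mirrors the console steps: create the account, grant Storage
    # Object Admin on the bucket, and Cloud Functions Invoker on
    # the project.
    return [
        f"gcloud iam service-accounts create {SA_NAME} --project={PROJECT_ID}",
        f"gcloud storage buckets add-iam-policy-binding gs://{BUCKET} "
        f"--member=serviceAccount:{SA_EMAIL} --role=roles/storage.objectAdmin",
        f"gcloud projects add-iam-policy-binding {PROJECT_ID} "
        f"--member=serviceAccount:{SA_EMAIL} --role=roles/cloudfunctions.invoker",
    ]
```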
6. Create an [**SSH key**](/iam/docs/keys-create-delete) for the service account.
7. Download a JSON key file for the service account. Keep this file secure.

## Optional: Create a dedicated API user in Qualys

1. Sign in to the Qualys console.
2. Go to **Users**.
3. Click **New** > **User**.
4. Enter the **General Information** required for the user.
5. Select the **User Role** tab.
6. Make sure the role has the **API Access** checkbox selected.
7. Click **Save**.

## Identify your specific Qualys API URL

### Option 1

Identify your URLs as described in the [platform identification](https://www.qualys.com/platform-identification) documentation.

### Option 2

1. Sign in to the Qualys console.
2. Go to **Help** > **About**.
3. Scroll to the **Security Operations Center (SOC)** section.
4. Copy the Qualys API URL.

## Configure the Cloud Function

1. Go to **Cloud Functions** in the Google Cloud console.
2. Click **Create Function**.
3. Configure the function:

   - **Name**: enter a name for your function (for example, **fetch-qualys-assets**).
   - **Region**: select a region close to your bucket.
   - **Trigger**: choose an HTTP trigger if needed, or Cloud Pub/Sub for scheduled execution.
   - **Authentication**: secure with authentication.
   - **Write the code** in the inline editor. Replace the following placeholders with your values: `<bucket-name>`, `<qualys-username>`, `<qualys-password>`, `<qualys_base_url>`.

   ```python
   from google.cloud import storage
   import requests
   import base64
   import json

   # Cloud Storage configuration
   BUCKET_NAME = "<bucket-name>"
   FILE_NAME = "qualys_assets.json"

   # Qualys API credentials
   QUALYS_USERNAME = "<qualys-username>"
   QUALYS_PASSWORD = "<qualys-password>"
   QUALYS_BASE_URL = "https://<qualys_base_url>"

   def fetch_qualys_assets():
       # Build the HTTP Basic Authentication header.
       auth = base64.b64encode(f"{QUALYS_USERNAME}:{QUALYS_PASSWORD}".encode()).decode()
       headers = {
           "Authorization": f"Basic {auth}",
           "Content-Type": "application/xml",
           # Request a JSON response so response.json() can parse it.
           "Accept": "application/json",
       }
       payload = """
       <ServiceRequest>
           <filters>
               <Criteria field="asset.name" operator="LIKE">%</Criteria>
           </filters>
       </ServiceRequest>
       """
       response = requests.post(
           f"{QUALYS_BASE_URL}/qps/rest/2.0/search/am/asset",
           headers=headers,
           data=payload,
       )
       response.raise_for_status()  # Fail fast on HTTP errors.
       return response.json()

   def upload_to_gcs(data):
       client = storage.Client()
       bucket = client.get_bucket(BUCKET_NAME)
       blob = bucket.blob(FILE_NAME)
       blob.upload_from_string(json.dumps(data), content_type="application/json")

   def main(request):
       assets = fetch_qualys_assets()
       upload_to_gcs(assets)
       return "Data uploaded to Cloud Storage successfully!"
   ```

4. Click **Deploy** after completing the configuration.

## Configure Cloud Scheduler

1. Go to **Cloud Scheduler** in the Google Cloud console.
2. Click **Create Job**.
3. Configure the job:

   - **Name**: enter a name for your job (for example, **trigger-fetch-qualys-assets**).
   - **Frequency**: use cron syntax to specify the schedule (for example, `0 0 * * *` for daily at midnight).
   - **Time zone**: set your preferred time zone.
   - **Trigger type**: choose **HTTP**.
   - **Trigger URL**: enter the Cloud Function's URL (found in the function details after deployment).
   - **Method**: choose **POST**.

   > **Note:** If authentication is enabled for the function, select **Service Account** and ensure the account has the **Cloud Functions Invoker** role.

4. Create the job.

## Set up feeds

To configure a feed, follow these steps:

1. Go to **SIEM Settings** > **Feeds**.
2. Click **Add New Feed**.
3. On the next page, click **Configure a single feed**.
4. In the **Feed name** field, enter a name for the feed; for example, **Qualys Asset Context Logs**.
5. Select **Google Cloud Storage V2** as the **Source type**.
6. Select **Qualys Asset Context** as the **Log type**.
7. Click **Next**.
8. Specify values for the following input parameters:

   - **GCS URI**: the Cloud Storage URI.
   - **Source deletion options**: select the deletion option according to your preference.
   - **Maximum File Age**: includes files modified in the last number of days. Default is 180 days.

   > **Note:** If you select the `Delete transferred files` or `Delete transferred files and empty directories` option, make sure that you granted the appropriate permissions to the service account.

9. Click **Next**.
10. Review your new feed configuration on the **Finalize** screen, and then click **Submit**.

## UDM mapping table

The parser applies the following mapping logic:

- Tag identifiers are written as the string `TAG_ID: ` concatenated with the value of `TAGS.TAG[].TAG_ID`.
- Timestamps are copied from the `create_time` field of the raw log.
- The entity type is hardcoded to `ASSET`.
- The vendor and product names are hardcoded to `QUALYS ASSET CONTEXT`.
- Related resource entities are hardcoded to the type `RESOURCE` with the relationship `MEMBER`.

**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)

Last updated 2025-08-21 UTC.
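The Cloud Function shown earlier authenticates to the Qualys API with HTTP Basic authentication. The header it builds can be reproduced and sanity-checked locally with standard-library Python; the credentials below are placeholders, not real values:

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    # Same construction the Cloud Function uses: base64-encode
    # "username:password" and prefix the result with "Basic ".
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

header = basic_auth_header("api_user", "s3cret")
# Decoding the token recovers the original "username:password" pair.
decoded = base64.b64decode(header.removeprefix("Basic ")).decode()
```

Because the credentials travel base64-encoded rather than encrypted, always send this header over HTTPS, as the Qualys base URL in the function does.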