Collect Atlassian Cloud Admin Audit logs
========================================

Supported in: Google SecOps [SIEM](/chronicle/docs/secops/google-secops-siem-toc)

| **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).

This document explains how to ingest Atlassian Cloud Admin Audit logs to Google Security Operations using AWS S3. The parser first attempts to process the incoming message as a JSON object. If that fails, it uses regular expressions (Grok patterns) to extract fields from various Atlassian Jira log formats, ultimately mapping the extracted data to the Unified Data Model (UDM).
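The JSON-first, regex-fallback strategy described above can be sketched in a few lines of Python. This is an illustrative approximation, not the actual Google SecOps parser; the fallback pattern below is a hypothetical example, not a real Grok pattern from the parser.

```python
import json
import re

# Hypothetical fallback pattern for a syslog-style Atlassian line:
# "<timestamp> <user> <action> <free-form detail>".
FALLBACK_PATTERN = re.compile(
    r"(?P<timestamp>\S+)\s+(?P<user>\S+)\s+(?P<action>\S+)\s+(?P<detail>.*)"
)

def parse_log_line(raw):
    """Try JSON first; fall back to regex (Grok-style) field extraction."""
    try:
        return json.loads(raw)  # structured JSON event
    except json.JSONDecodeError:
        match = FALLBACK_PATTERN.match(raw)
        # Keep the raw message if neither format matches.
        return match.groupdict() if match else {"message": raw}

print(parse_log_line('{"action": "user_login", "actor": "alice"}'))
print(parse_log_line("2025-09-04T10:00:00Z alice user_login ok"))
```

In a real pipeline the extracted fields would then be mapped onto UDM attributes; here the function simply returns a flat dictionary.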
Before you begin
----------------

Make sure you have the following prerequisites:

- Google SecOps instance
- Privileged access to AWS
- Privileged access to Atlassian
Configure AWS IAM and S3 Bucket
-------------------------------

1. Create an **Amazon S3 bucket** following this user guide: [Creating a bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket.html)
2. Sign in to the **AWS Console**.
3. Go to **S3 > Create bucket**.
4. Provide a name for the bucket (for example, `atlassian-admin-audit-logs`).
5. Leave the other defaults (or configure encryption and versioning if required).
6. Click **Create**.
7. Save the bucket **Name** and **Region** for future reference.
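The console steps above can also be scripted. The following sketch builds the `create_bucket` arguments with boto3's region special-casing (in the S3 API, `us-east-1` must omit `LocationConstraint`); the bucket name matches the example above, the region is an assumption, and the actual call is left commented out because it requires valid AWS credentials.

```python
def create_bucket_kwargs(bucket, region):
    """Build boto3 create_bucket arguments; us-east-1 must omit LocationConstraint."""
    kwargs = {"Bucket": bucket}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# With boto3 installed and credentials configured, the bucket can be created with:
#   import boto3
#   boto3.client("s3").create_bucket(**create_bucket_kwargs("atlassian-admin-audit-logs", "eu-west-1"))
print(create_bucket_kwargs("atlassian-admin-audit-logs", "us-east-1"))
```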
*Last updated 2025-09-04 UTC.*
8. Create a **User** following this user guide: [Creating an IAM user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html#id_users_create_console).
9. Select the created **User**.
10. Select the **Security credentials** tab.
11. Click **Create Access Key** in the **Access Keys** section.
12. Select **Third-party service** as **Use case**.
13. Click **Next**.
14. Optional: Add a description tag.
15. Click **Create access key**.
16. Click **Download CSV file** and store the **Access ID** and **Secret Access Key** for future reference.
17. Click **Done**.
18. In the **Permissions** tab under **Permissions policies**, click **Add permissions**.
19. Select **Attach policies directly**.
20. Search for the **AmazonS3FullAccess** policy.
21. Select the policy.
22. Click **Next**.
23. Click **Add permissions**.

Configure API Key in Atlassian
------------------------------

1. Sign in to [Atlassian](https://admin.atlassian.com).
2. Go to **Settings > API keys**.
3. Click **Create API key** in the top right.
4. Provide a unique and descriptive **name** for the key.
5. Pick a new expiration date under **Expires on**.

| **Note:** The maximum you can extend your expiration date is up to one year.

6. Click **Create** to save.
7. Copy and save your **API Key** and **Organization ID**.
8. Click **Done**.
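Before wiring the key into the collection script, it can help to confirm the credentials work. The sketch below builds the request pieces for the organization events endpoint used later in this document; the environment variable names match the script below, and the actual HTTP call is shown only as a comment because it needs a valid key.

```python
import os

def events_request(org_id, token, cursor=None):
    """Build the URL, headers, and query params for the Atlassian org events endpoint."""
    url = f"https://api.atlassian.com/admin/v1/orgs/{org_id}/events"
    headers = {"Authorization": f"Bearer {token}"}
    params = {"limit": 1}
    if cursor:
        params["cursor"] = cursor
    return url, headers, params

url, headers, params = events_request(
    os.getenv("ATL_ORG_ID", "your_org_id"),
    os.getenv("ATL_TOKEN", "your_api_key"),
)
# Smoke test where the key is set (requires the requests library):
#   requests.get(url, headers=headers, params=params).raise_for_status()
print(url)
```

A `200` response from that endpoint confirms both the API key and the Organization ID are valid.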
Configure the required packages
-------------------------------

1. Sign in to your log collection host (for example, an **AWS VM**) and run the following commands to install the packages and configure AWS credentials:

        pip install boto3 requests
        aws configure

Create Atlassian Log Puller script
----------------------------------

1. Create the script file by entering `sudo vi /opt/scripts/export_atlassian_audit.py`, copy the following code, and adjust the settings (for example, `BUCKET`) as needed:

        #!/usr/bin/env python3
        import os, json, requests, boto3, datetime

        # Settings
        TOKEN = os.environ["ATL_TOKEN"]
        ORG_ID = os.environ["ATL_ORG_ID"]
        AWS_PROFILE = os.getenv("AWS_PROFILE")
        BUCKET = "atlassian-admin-audit-logs"

        def fetch_events(cursor=None):
            url = f"https://api.atlassian.com/admin/v1/orgs/{ORG_ID}/events"
            headers = {"Authorization": f"Bearer {TOKEN}"}
            params = {"limit": 100, "cursor": cursor} if cursor else {"limit": 100}
            resp = requests.get(url, headers=headers, params=params)
            resp.raise_for_status()
            return resp.json()

        def upload_json(data, filename):
            session = boto3.Session(profile_name=AWS_PROFILE) if AWS_PROFILE else boto3.Session()
            # Serialize the event list first; put_object expects bytes or a string.
            session.client("s3").put_object(Bucket=BUCKET, Key=filename, Body=json.dumps(data), ContentType="application/json")
            print(f"Uploaded {filename}")

        def main():
            today = datetime.datetime.utcnow().strftime("%Y-%m-%d")
            cursor = None
            count = 0
            while True:
                resp = fetch_events(cursor)
                key = f"audits/{today}/events_{count}.json"
                upload_json(resp["data"], key)
                count += 1
                cursor = resp.get("links", {}).get("next")
                if not cursor:
                    break

        if __name__ == "__main__":
            main()

2. Save and exit `vi`: press **Esc**, then type `:wq` and press **Enter**.

Store environment variables
---------------------------

1. Create a secure file to store environment variables in `/etc/atlassian_audit.env`:

        export ATL_TOKEN="your_atlassian_key"
        export ATL_ORG_ID="your_org_id"
        export AWS_PROFILE="atlassian-logs"
2. Make sure the file is secure:

        chmod 600 /etc/atlassian_audit.env

Automate with Cron
------------------

1. Create a wrapper script for cron by running `sudo vi /usr/local/bin/run_atlassian_audit.sh`, then copy the following code:

        #!/usr/bin/env bash
        source /etc/atlassian_audit.env
        python3 /opt/scripts/export_atlassian_audit.py

2. Make the file executable:

        chmod +x /usr/local/bin/run_atlassian_audit.sh

3. Configure the job to run daily at 02:00 UTC:

        crontab -e
        0 2 * * * /usr/local/bin/run_atlassian_audit.sh >> /var/log/atl_audit.log 2>&1

**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)
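As an optional sanity check after the first scheduled run, you can confirm that batches are landing under today's prefix in the bucket. The helper below derives the `audits/YYYY-MM-DD/` prefix the puller script writes to; the listing call is commented out because it assumes the example bucket name above and configured boto3 credentials.

```python
import datetime

def todays_prefix():
    """Prefix under which the puller writes today's batches (audits/YYYY-MM-DD/)."""
    return "audits/" + datetime.datetime.utcnow().strftime("%Y-%m-%d") + "/"

# With boto3 credentials configured, list today's uploads:
#   import boto3
#   page = boto3.client("s3").list_objects_v2(
#       Bucket="atlassian-admin-audit-logs", Prefix=todays_prefix())
#   print([obj["Key"] for obj in page.get("Contents", [])])
print(todays_prefix())
```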