Collect MuleSoft Anypoint logs

This document explains how to ingest audit-trail events from the MuleSoft Anypoint Platform to Google Security Operations using AWS S3.

Before you begin

Make sure you have the following prerequisites:

  • Google SecOps instance
  • Privileged access to MuleSoft
  • Privileged access to AWS

Get the MuleSoft Organization ID

  1. Sign in to the Anypoint Platform.
  2. Go to Menu > Access Management.
  3. In the Business Groups table, click your organization's name.
  4. Copy the Organization ID (for example, 0a12b3c4-d5e6-789f-1021-1a2b34cd5e6f).

Configure AWS S3 bucket and IAM for Google SecOps

  1. Create an Amazon S3 bucket following this user guide: Creating a bucket.
  2. Save the bucket Name and Region for future reference (for example, mulesoft-audit-logs).
  3. Create a user following this user guide: Creating an IAM user.
  4. Select the created User.
  5. Select the Security credentials tab.
  6. Click Create access key in the Access keys section.
  7. Select Third-party service as the Use case.
  8. Click Next.
  9. Optional: Add description tag.
  10. Click Create access key.
  11. Click Download CSV file to save the Access Key and Secret Access Key for future reference.
  12. Click Done.
  13. Select the Permissions tab.
  14. Click Add permissions in the Permissions policies section.
  15. Select Add permissions.
  16. Select Attach policies directly.
  17. Search for and select the AmazonS3FullAccess policy.
  18. Click Next.
  19. Click Add permissions.
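
Optionally, you can confirm the bucket and the new access key work before continuing. The following is a minimal boto3 sketch (not part of the console procedure above); the bucket name, region, and connectivity-check.txt key are placeholders, and it assumes the access key you just downloaded is configured in your local AWS credentials.

    #!/usr/bin/env python3
    """Optional sketch: create the bucket and confirm write access with boto3."""
    import boto3

    BUCKET = "mulesoft-audit-logs"  # replace with your bucket name
    REGION = "eu-west-1"            # replace with your region

    s3 = boto3.client("s3", region_name=REGION)

    # For regions other than us-east-1, a LocationConstraint is required;
    # omit CreateBucketConfiguration if you use us-east-1.
    s3.create_bucket(
        Bucket=BUCKET,
        CreateBucketConfiguration={"LocationConstraint": REGION},
    )

    # Write and read back a small object to confirm the credentials can upload.
    s3.put_object(Bucket=BUCKET, Key="connectivity-check.txt", Body=b"ok")
    print(s3.get_object(Bucket=BUCKET, Key="connectivity-check.txt")["Body"].read())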

Create the MuleSoft Connected App

  1. Sign in to the Anypoint Platform.
  2. Go to Access Management > Connected Apps > Create App.
  3. Provide the following configuration details:
    • App name: Enter a unique name (for example, Google SecOps export).
    • Select App acts on its own behalf (client credentials).
    • Click Add scopes > Audit Log Viewer > Next.
    • Select every Business Group whose logs you need.
    • Click Next > Add scopes.
  4. Click Save and copy the Client ID and Client Secret.
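
You can optionally verify the Client ID and Client Secret before wiring them into the Lambda function. The following is a minimal sketch that requests an OAuth token from the same endpoint the Lambda code uses; the credential values are placeholders.

    #!/usr/bin/env python3
    """Optional sketch: verify the Connected App credentials by requesting a token."""
    import json
    import urllib.parse
    import urllib.request

    CLIENT_ID = "your_client_id"          # placeholder: value copied from the Connected App
    CLIENT_SECRET = "your_client_secret"  # placeholder: value copied from the Connected App

    TOKEN_URL = "https://anypoint.mulesoft.com/accounts/api/v2/oauth2/token"

    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }).encode()

    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data), timeout=30) as resp:
        token = json.loads(resp.read())["access_token"]

    # Receiving a token confirms the client credentials are valid.
    print("Access token received, length:", len(token))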

Configure IAM policy & role for S3 uploads

  1. Policy JSON (replace mulesoft-audit-logs with your bucket name):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "AllowPutAuditObjects",
          "Effect": "Allow",
          "Action": ["s3:PutObject"],
          "Resource": "arn:aws:s3:::mulesoft-audit-logs/*"
        }
      ]
    }
    
  2. Go to AWS console > IAM > Policies > Create policy > JSON tab.

  3. Copy and paste the policy.

  4. Click Next > Create policy.

  5. Go to IAM > Roles > Create role > AWS service > Lambda.

  6. Attach the newly created policy.

  7. Name the role WriteMulesoftToS3Role and click Create role.
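
If you prefer to script these IAM steps instead of using the console, the following boto3 sketch creates the same policy and role. The policy name PutMulesoftAuditObjects is an assumed example, and the trust policy simply allows the Lambda service to assume the role.

    #!/usr/bin/env python3
    """Optional sketch: create the S3 write policy and Lambda execution role with boto3."""
    import json
    import boto3

    iam = boto3.client("iam")

    BUCKET = "mulesoft-audit-logs"           # replace with your bucket name
    POLICY_NAME = "PutMulesoftAuditObjects"  # assumed policy name
    ROLE_NAME = "WriteMulesoftToS3Role"

    # Same permissions policy as the JSON shown above.
    write_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowPutAuditObjects",
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }],
    }

    # Trust policy that lets AWS Lambda assume the role.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    policy_arn = iam.create_policy(
        PolicyName=POLICY_NAME,
        PolicyDocument=json.dumps(write_policy),
    )["Policy"]["Arn"]

    iam.create_role(RoleName=ROLE_NAME, AssumeRolePolicyDocument=json.dumps(trust_policy))
    iam.attach_role_policy(RoleName=ROLE_NAME, PolicyArn=policy_arn)
    print("Created role", ROLE_NAME, "with policy", policy_arn)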

Create the Lambda function

In the AWS console, go to Lambda > Create function and provide the following configuration details:

  • Name: mulesoft_audit_to_s3
  • Runtime: Python 3.13
  • Architecture: x86_64
  • Execution role: Use existing > WriteMulesoftToS3Role

  1. After the function is created, open the Code tab, delete the stub and enter the following code (mulesoft_audit_to_s3.py).

    #!/usr/bin/env python3
    
    import os, json, gzip, io, uuid, datetime as dt, urllib.request, urllib.error, urllib.parse
    import boto3
    
    ORG_ID        = os.environ["MULE_ORG_ID"]
    CLIENT_ID     = os.environ["CLIENT_ID"]
    CLIENT_SECRET = os.environ["CLIENT_SECRET"]
    S3_BUCKET     = os.environ["S3_BUCKET_NAME"]
    
    TOKEN_URL = "https://anypoint.mulesoft.com/accounts/api/v2/oauth2/token"
    QUERY_URL = f"https://anypoint.mulesoft.com/audit/v2/organizations/{ORG_ID}/query"
    
    def http_post(url, data, headers=None):
        raw = json.dumps(data).encode() if headers else urllib.parse.urlencode(data).encode()
        req = urllib.request.Request(url, raw, headers or {})
        try:
            with urllib.request.urlopen(req, timeout=30) as r:
                return json.loads(r.read())
        except urllib.error.HTTPError as e:
            print("MuleSoft error body →", e.read().decode())
            raise
    
    def get_token():
        return http_post(TOKEN_URL, {
            "grant_type": "client_credentials",
            "client_id":  CLIENT_ID,
            "client_secret": CLIENT_SECRET
        })["access_token"]
    
    def fetch_audit(token, start, end):
        headers = {
            "Authorization": f"Bearer {token}",
            "Content-Type":  "application/json"
        }
        body = {
            "startDate": f"{start.isoformat(timespec='milliseconds')}Z",
            "endDate":   f"{end.isoformat(timespec='milliseconds')}Z",
            "limit": 200,
            "offset": 0,
            "ascending": False
        }
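        # Page through the audit query results until the API returns an empty batch.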
        while True:
            data = http_post(QUERY_URL, body, headers)
            if not data.get("data"):
                break
            yield from data["data"]
            body["offset"] += body["limit"]
    
    def upload(events, ts):
        key = f"{ts:%Y/%m/%d}/mulesoft-audit-{uuid.uuid4()}.json.gz"
        buf = io.BytesIO()
        with gzip.GzipFile(fileobj=buf, mode="w") as gz:
            for ev in events:
                gz.write((json.dumps(ev) + "\n").encode())
        buf.seek(0)
        boto3.client("s3").upload_fileobj(buf, S3_BUCKET, key)
    
    def lambda_handler(event=None, context=None):
        now   = dt.datetime.utcnow().replace(microsecond=0)
        start = now - dt.timedelta(days=1)
    
        token  = get_token()
        events = list(fetch_audit(token, start, now))
    
        if events:
            upload(events, start)
            print(f"Uploaded {len(events)} events")
        else:
            print("No events in the last 24 h")
    
    # For local testing
    if __name__ == "__main__":
        lambda_handler()
    
  2. Go to Configuration > Environment variables > Edit > Add new environment variable.

  3. Enter the following environment variables, replacing the example values with your own:

    Key             Example value
    MULE_ORG_ID     your_org_id
    CLIENT_ID       your_client_id
    CLIENT_SECRET   your_client_secret
    S3_BUCKET_NAME  mulesoft-audit-logs
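
To exercise the function locally before deploying (the script ends with an `if __name__ == "__main__"` block for this purpose), you can set the same variables in a small driver script and invoke the handler directly. A minimal sketch, assuming the code is saved as mulesoft_audit_to_s3.py and your local AWS credentials can write to the bucket:

    # Local test sketch: set the environment variables, then invoke the handler.
    import os

    # The variables must be set before the module is imported, because the
    # module reads os.environ at import time.
    os.environ.update({
        "MULE_ORG_ID": "your_org_id",
        "CLIENT_ID": "your_client_id",
        "CLIENT_SECRET": "your_client_secret",
        "S3_BUCKET_NAME": "mulesoft-audit-logs",
    })

    import mulesoft_audit_to_s3  # the file saved from the Code tab

    mulesoft_audit_to_s3.lambda_handler()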

Schedule the Lambda function (EventBridge Scheduler)

  1. Go to Configuration > Triggers > Add trigger > EventBridge Scheduler > Create rule.
  2. Provide the following configuration details:
    • Name: daily-mulesoft-audit-export.
    • Schedule pattern: Cron expression.
    • Expression: 0 2 * * * (runs daily at 02:00 UTC).
  3. Leave the rest as default and click Create.
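
If you create the schedule through the EventBridge Scheduler API instead of the console, note that the API expects the six-field cron() form. A minimal boto3 sketch, using placeholder ARNs for the Lambda function and the invocation role:

    #!/usr/bin/env python3
    """Optional sketch: create the daily schedule with the EventBridge Scheduler API."""
    import boto3

    scheduler = boto3.client("scheduler")

    scheduler.create_schedule(
        Name="daily-mulesoft-audit-export",
        # Six-field cron expression: 02:00 UTC every day.
        ScheduleExpression="cron(0 2 * * ? *)",
        FlexibleTimeWindow={"Mode": "OFF"},
        Target={
            # Placeholder ARNs: your Lambda function and a role that
            # EventBridge Scheduler can assume to invoke it.
            "Arn": "arn:aws:lambda:us-east-1:123456789012:function:mulesoft_audit_to_s3",
            "RoleArn": "arn:aws:iam::123456789012:role/SchedulerInvokeLambdaRole",
        },
    )
    print("Schedule created")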

Configure a feed in Google SecOps to ingest the MuleSoft logs

  1. Go to SIEM Settings > Feeds.
  2. Click Add new.
  3. In the Feed name field, enter a name for the feed (for example, MuleSoft Logs).
  4. Select Amazon S3 V2 as the Source type.
  5. Select Mulesoft as the Log type.
  6. Click Next.
  7. Specify values for the following input parameters:

    • S3 URI: The bucket URI
      • s3://mulesoft-audit-logs/
        • Replace mulesoft-audit-logs with the actual name of the bucket.
    • Source deletion options: select the deletion option according to your preference.

    • Maximum File Age: Include files modified in the last number of days. Default is 180 days.

    • Access Key ID: The user access key with access to the S3 bucket.

    • Secret Access Key: The user secret key with access to the S3 bucket.

    • Asset namespace: The asset namespace.

    • Ingestion labels: The label to be applied to the events from this feed.

  8. Click Next.

  9. Review your new feed configuration in the Finalize screen, and then click Submit.
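
Before or after submitting the feed, you can optionally confirm that the Lambda function is writing objects the feed will pick up. A minimal listing sketch with boto3, using the placeholder bucket name:

    #!/usr/bin/env python3
    """Optional sketch: list recent objects to confirm the export is landing in S3."""
    import boto3

    BUCKET = "mulesoft-audit-logs"  # replace with your bucket name

    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=BUCKET, MaxKeys=20)

    # Keys follow the YYYY/MM/DD/mulesoft-audit-<uuid>.json.gz pattern written by the Lambda.
    for obj in resp.get("Contents", []):
        print(obj["LastModified"], obj["Size"], obj["Key"])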

Need more help? Get answers from Community members and Google SecOps professionals.