Data Export API

The Google Security Operations Data Export API enables customers to export raw log data from their Google Security Operations account to their Google Cloud Storage buckets and manage existing export requests.

The API is designed for exporting data that exists in a Google Security Operations instance at a point in time, such as for an audit, as opposed to providing an ongoing export of newly ingested data.

You can export a maximum of 10 TB per CreateDataExport request. The 10 TB limit applies to the data as stored, which is compressed; the data is exported uncompressed, so the size of the exported files can exceed 10 TB. Exports covering more than 10 TB of stored data must be split across multiple requests.
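
Because oversized exports must be split by time range, a helper such as the following can carve a window into per-request sub-ranges. This is a minimal sketch; the split_time_range name and the chunk count are illustrative, not part of the API.

from datetime import datetime

# Minimal sketch: split [start, end) into equal sub-ranges, one per
# CreateDataExport request. Name and chunk count are illustrative.
def split_time_range(start: datetime, end: datetime, chunks: int):
    step = (end - start) / chunks
    for i in range(chunks):
        chunk_end = end if i == chunks - 1 else start + step * (i + 1)
        yield start + step * i, chunk_end

# Example: split a two-week window into two requests.
for s, e in split_time_range(datetime(2020, 3, 1), datetime(2020, 3, 15), 2):
    print(s.isoformat() + 'Z', e.isoformat() + 'Z')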

Before exporting data from Google Security Operations, customers must create their own Google Cloud Storage bucket (make sure that the bucket is not publicly accessible) and grant malachite-data-export-batch@prod.google.com the following roles for that Google Cloud Storage bucket:

  • Storage Object Admin (roles/storage.objectAdmin)

  • Storage Legacy Bucket Reader (roles/storage.legacyBucketReader)

Use the Google Cloud console or the gcloud command-line tool to issue the following commands:

gcloud storage buckets add-iam-policy-binding gs://<your-bucket-name> --member=user:malachite-data-export-batch@prod.google.com --role=roles/storage.objectAdmin
gcloud storage buckets add-iam-policy-binding gs://<your-bucket-name> --member=user:malachite-data-export-batch@prod.google.com --role=roles/storage.legacyBucketReader

How to authenticate with the Google Security Operations API

The Google Security Operations API uses the OAuth 2.0 protocol for authentication and authorization. Your application can complete these tasks using either of the following implementations:

  • Using the Google API Client Library for your programming language.

  • Directly interfacing with the OAuth 2.0 system using HTTP.

See the reference documentation for the Google Authentication library in Python.

Google Authentication libraries are a subset of the Google API client libraries. See other language implementations.

Getting API authentication credentials

Your Google Security Operations representative will provide you with a Google Developer Service Account Credential to enable the API client to communicate with the API.

You must also provide the auth scope when initializing your API client. OAuth 2.0 uses a scope to limit an application's access to an account. When an application requests a scope, the access token issued to the application is limited to that scope.

Use the following scope to initialize your Google API client:

https://www.googleapis.com/auth/chronicle-backstory

Python example

The following Python example demonstrates how to create OAuth 2.0 credentials and an authorized HTTP client using the google.oauth2 and googleapiclient modules.

# Imports required for the sample: Google Auth and API Client Library imports.
# Install the google-api-python-client package from
# https://pypi.org/project/google-api-python-client/ or run
# pip install google-api-python-client from your terminal.
from google.oauth2 import service_account
from googleapiclient import _auth

SCOPES = ['https://www.googleapis.com/auth/chronicle-backstory']

# The apikeys-demo.json file contains the customer's OAuth 2 credentials.
# SERVICE_ACCOUNT_FILE is the full path to the apikeys-demo.json file
# ToDo: Replace this with the full path to your OAuth2 credentials
SERVICE_ACCOUNT_FILE = '/customer-keys/apikeys-demo.json'

# Create a credential using the Google Developer Service Account Credential and the
# Google Security Operations API scope.
credentials = service_account.Credentials.from_service_account_file(SERVICE_ACCOUNT_FILE, scopes=SCOPES)

# Build an HTTP client to make authorized OAuth requests.
http_client = _auth.authorized_http(credentials)

# <your code continues here>
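
For example, the client returned by _auth.authorized_http follows the httplib2 request interface and returns a (response, content) pair. The following minimal sketch (assuming the ListAvailableLogTypes endpoint documented in the reference below; error handling omitted) issues an authorized GET:

import json

# Minimal sketch: issue an authorized GET against the ListAvailableLogTypes
# endpoint (documented in the reference below) and decode the JSON response.
url = 'https://backstory.googleapis.com/v1/tools/dataexport/listavailablelogtypes'
response, content = http_client.request(url, method='GET')
print(response.status)
print(json.loads(content))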

Data Export API reference

The following sections describe the Google Security Operations Data Export API methods.

Note: All requests must be made using authenticated Google API client libraries as described in How to authenticate with the Google Security Operations API. All responses are provided in JSON.

CreateDataExport

Creates a new data export. You can export a maximum of 10 TB of compressed data per request. A maximum of three requests can be in process at any time. Depending on its size and start time, a data export typically takes several minutes to several hours to complete.

Note: CreateDataExport uses the POST method.

Request

Request Body
{
  "startTime": "Start, inclusive time from the time range",
  "endTime": "Last, exclusive time from the time range",
  "logType": "An individual log type or 'ALL_TYPES' for all log types",
  "gcsBucket": "Path to the customer-provided Google Cloud Storage bucket in projects/<project-id>/buckets/<bucket-name>" format,
}
Parameters
startTime (google.protobuf.Timestamp, optional): Inclusive start of the time range. If not specified, defaults to the UNIX epoch, 1970-01-01T00:00:00Z (UTC).

endTime (google.protobuf.Timestamp, optional): Exclusive end of the time range. If not specified, defaults to the current time.

logType (string, required): An individual log type, or ALL_TYPES for all log types.

gcsBucket (string, required): Path to the customer-provided Google Cloud Storage bucket, in projects/<project-id>/buckets/<bucket-name> format.
Sample Request
POST https://backstory.googleapis.com/v1/tools/dataexport
{
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "CB_EDR",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket"
}
Sample Response
{
  "dataExportId": "d828bcec-21d3-4ecd-910e-0a934f0bd074",
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "CB_EDR",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "dataExportStatus": {"stage": "IN_QUEUE"}
}
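
Using the authorized http_client from the Python example above, a CreateDataExport call might look like the following sketch. The values are taken from the sample request; error handling is omitted.

import json

# Sketch: create an export and capture its ID for later polling or
# cancellation. Assumes the http_client from the authentication example.
url = 'https://backstory.googleapis.com/v1/tools/dataexport'
body = {
    'startTime': '2020-03-01T00:00:00Z',
    'endTime': '2020-03-15T00:00:00Z',
    'logType': 'CB_EDR',
    'gcsBucket': 'projects/chronicle-test/buckets/dataexport-test-bucket',
}
response, content = http_client.request(
    url, method='POST', body=json.dumps(body),
    headers={'Content-Type': 'application/json'})
export = json.loads(content)
print(export['dataExportId'], export['dataExportStatus']['stage'])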

GetDataExport

Returns an existing data export.

Note: GetDataExport uses the GET method.

Request

GET https://backstory.googleapis.com/v1/tools/dataexport/{data_export_id}
Parameters
dataExportId (string): UUID representing the data export request.
Sample Request
GET https://backstory.googleapis.com/v1/tools/dataexport/d828bcec-21d3-4ecd-910e-0a934f0bd074
Sample Response
{
  "dataExportId": "d828bcec-21d3-4ecd-910e-0a934f0bd074",
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "CB_EDR",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "dataExportStatus": {"stage": "IN_QUEUE"}
}
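
Because an export can take minutes to hours to complete, a caller typically polls GetDataExport until the stage changes. The following sketch assumes the http_client and the dataExportId from the earlier examples; the 60-second interval is an illustrative choice, and stage values other than IN_QUEUE and CANCELLED are not listed in this section.

import json
import time

# Sketch: poll an export until it reports a stage other than IN_QUEUE.
# Assumes http_client from the authentication example; the polling
# interval is an illustrative choice.
data_export_id = 'd828bcec-21d3-4ecd-910e-0a934f0bd074'
url = 'https://backstory.googleapis.com/v1/tools/dataexport/' + data_export_id
while True:
    response, content = http_client.request(url, method='GET')
    stage = json.loads(content)['dataExportStatus']['stage']
    print(stage)
    if stage != 'IN_QUEUE':
        break
    time.sleep(60)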

CancelDataExport

Cancels an existing data export request.

Note: CancelDataExport uses the POST method. Only IN_QUEUE data exports can be canceled.

Request

POST https://backstory.googleapis.com/v1/tools/dataexport/{data_export_id}:cancel
Request Body
{
  "dataExportId": "The UUID representing the data export request to be canceled"
}
Parameters
dataExportId (string): UUID representing the data export request to be canceled.
Sample Request
POST https://backstory.googleapis.com/v1/tools/dataexport/d828bcec-21d3-4ecd-910e-0a934f0bd074:cancel
Sample Response
{
  "dataExportId": "d828bcec-21d3-4ecd-910e-0a934f0bd074",
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "CB_EDR",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "dataExportStatus": {"stage": "CANCELLED"}
}
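
Combined with the earlier examples, a cancellation call might look like the following sketch. The ID comes from the sample above; remember that only IN_QUEUE exports can be canceled.

import json

# Sketch: cancel an IN_QUEUE export. Assumes the http_client from the
# authentication example; the ID comes from the sample above.
data_export_id = 'd828bcec-21d3-4ecd-910e-0a934f0bd074'
url = ('https://backstory.googleapis.com/v1/tools/dataexport/'
       + data_export_id + ':cancel')
body = {'dataExportId': data_export_id}
response, content = http_client.request(
    url, method='POST', body=json.dumps(body),
    headers={'Content-Type': 'application/json'})
print(json.loads(content)['dataExportStatus']['stage'])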

ListAvailableLogTypes

Lists all available log types and the time range for which each is available.

Note: ListAvailableLogTypes uses the GET method.

Request

GET https://backstory.googleapis.com/v1/tools/dataexport/listavailablelogtypes 
Parameters
startTime (google.protobuf.Timestamp, optional): Inclusive start of the time range. If not specified, defaults to the UNIX epoch, 1970-01-01T00:00:00Z (UTC).

endTime (google.protobuf.Timestamp, optional): Exclusive end of the time range. If not specified, defaults to the current time.
Sample Request
GET https://backstory.googleapis.com/v1/tools/dataexport/listavailablelogtypes
{
 "startTime": "2020-01-01T00:00:00Z",
 "endTime": "2021-01-01T00:00:00Z"
}
Sample Response
{
  "availableLogTypes": [
    {"logType": "ACALVIO", "startTime": "2020-03-02T02:00:00Z", "endTime": "2020-08-02T11:00:00Z"},
    {"logType": "AZURE_AD", "startTime": "2020-02-10T22:00:00Z", "endTime": "2020-02-13T02:00:00Z"}
  ]
}
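
A caller might use this response to choose the logType and time window for a subsequent CreateDataExport request; for example (a sketch assuming the response shape shown above, with a placeholder bucket path taken from the earlier samples):

import json

# Sketch: turn the first ListAvailableLogTypes entry into a
# CreateDataExport request body. The bucket path is a placeholder
# taken from the samples above.
response, content = http_client.request(
    'https://backstory.googleapis.com/v1/tools/dataexport/listavailablelogtypes',
    method='GET')
entry = json.loads(content)['availableLogTypes'][0]
export_body = {
    'startTime': entry['startTime'],
    'endTime': entry['endTime'],
    'logType': entry['logType'],
    'gcsBucket': 'projects/chronicle-test/buckets/dataexport-test-bucket',
}
print(json.dumps(export_body, indent=2))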