Data Export API (Enhanced)
Important: After you enable the new enhanced API, you can't use the API to access your old, existing jobs.
The Data Export API facilitates the bulk export of your security data from Google Security Operations to a Google Cloud Storage bucket that you control. This feature addresses critical long-term data retention needs and supports historical forensic analysis and strict compliance requirements (such as SOX, HIPAA, and GDPR).
The Data Export API provides a scalable and reliable solution for point-in-time data exports and handles requests of up to 100 TB.
As a managed pipeline, it offers essential enterprise-grade features, including:
- Automated retries on transient errors.
- Comprehensive job status monitoring.
- A full audit trail for each export job.
The API logically partitions the exported data by date and time within your Google Cloud Storage bucket.
This feature lets you build large-scale data offloading workflows. Google SecOps manages the export process complexity to provide stability and performance.
Key benefits
The Data Export API provides a resilient and auditable solution for managing the lifecycle of your security data.
- Reliability: The service handles large-scale data transfers. The system uses an exponential backoff strategy to automatically retry export jobs that encounter transient issues (for example, temporary network problems). If a job fails permanently after all retries, the system updates its status to FINISHED_FAILURE, and the API response for that job contains a detailed error message that explains the cause.
- Comprehensive auditability: To meet strict compliance and security standards, the system captures every action related to an export job in an immutable audit trail. This trail includes the creation, start, success, or failure of every job, along with the user who initiated the action, a timestamp, and the job parameters.
- Optimized for performance and scale: The API uses a robust job management system that includes queuing and prioritization to provide platform stability and prevent any single tenant from monopolizing resources.
- Enhanced data integrity and accessibility: The system automatically organizes data into a logical directory structure within your Google Cloud Storage bucket, which helps you locate and query specific time windows for historical analysis.
Key terms and concepts
- Export job: A single, asynchronous operation to export a specific time range of log data to a Google Cloud Storage bucket. The system tracks each job with a unique dataExportId.
- Job status: The current state of an export job in its lifecycle (for example, IN_QUEUE, PROCESSING, FINISHED_SUCCESS).
- Google Cloud Storage bucket: A user-owned Google Cloud Storage bucket that serves as the destination for the exported data.
- Log types: The specific categories of logs you can export (for example, NIX_SYSTEM, WINDOWS_DNS, CB_EDR). For more details, see the list of all supported log types.
Understand the exported data structure
When a job completes successfully, the system writes the data to your Google Cloud Storage bucket. It uses a specific, partitioned directory structure to simplify data access and querying.
Directory path structure: gs://<gcs-bucket-name>/<export-job-name>/<logtype>/<event-time-bucket>/<epoch_execution_time>/<file-shard-name>.csv
- gcs-bucket-name: The name of your Google Cloud Storage bucket.
- export-job-name: The unique name of your export job.
- logtype: The name of the log type for the exported data.
- event-time-bucket: The hour range of the event timestamps of exported logs. The format is a UTC timestamp: year/month/day/hour/minute/second. For example, 2025/08/25/01/00/00 refers to UTC 01:00:00 AM, August 25, 2025.
- epoch-execution-time: The Unix epoch time value, indicating when the export job began.
- file-shard-name: The name of the sharded files containing raw logs. Each file shard has an upper file size limit of 100 MB.
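For example, the following sketch shows how a downstream script might split one exported object path into the components described above. It is a minimal illustration only; the bucket, job, and shard names are hypothetical.

```python
# Hypothetical exported object path, following the partitioned layout above.
path = ("gs://dataexport-test-bucket/my-export-job/GCP_DNS/"
        "2025/08/25/01/00/00/1756083600/shard-00001.csv")

# Split off the bucket, then map the remaining segments onto the layout:
# <export-job-name>/<logtype>/<event-time-bucket>/<epoch_execution_time>/<file-shard-name>.csv
without_scheme = path.removeprefix("gs://")
bucket, _, object_name = without_scheme.partition("/")
parts = object_name.split("/")

export_job_name = parts[0]
log_type = parts[1]
event_time_bucket = "/".join(parts[2:8])   # year/month/day/hour/minute/second (UTC)
epoch_execution_time = parts[8]            # Unix epoch time value for when the job began
file_shard_name = parts[9]                 # each shard is at most 100 MB

print(bucket, export_job_name, log_type, event_time_bucket,
      epoch_execution_time, file_shard_name)
```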
Performance and limitations
The service has specific limits to ensure platform stability and fair resource allocation.
- Maximum data volume per job: Each individual export job can request up to 100 TB of data. For larger datasets, we recommend breaking the export into multiple jobs with smaller time ranges, as shown in the sketch after this list.
- Concurrent jobs: Each customer tenant can run or queue a maximum of 3 export jobs concurrently. The system rejects any new job creation request that exceeds this limit.
- Job completion times: The volume of exported data determines job completion times. A single job can take up to 18 hours.
- Export format and data scope: This API supports bulk, point-in-time
exports, with the following limitations and features:
- Raw logs only: You can export only raw logs (not UDM logs, UDM events, or detections). To export UDM data, see Configure data export to BigQuery in a self-managed Google Cloud project.
- Data compression: The API exports data as uncompressed text.
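The following sketch illustrates one way to split a large export into smaller per-day jobs to stay under the per-job volume limit. It only computes the time windows; the window size you choose depends on your ingest volume.

```python
from datetime import datetime, timedelta, timezone

def split_time_range(start: datetime, end: datetime, window: timedelta):
    """Split [start, end) into consecutive windows, one per export job.

    A sketch for staying under the 100 TB per-job limit by requesting
    smaller time ranges; the right window size depends on your data volume.
    """
    ranges = []
    cursor = start
    while cursor < end:
        next_cursor = min(cursor + window, end)
        ranges.append((cursor, next_cursor))
        cursor = next_cursor
    return ranges

# Example: one export job per day across a week of data.
start = datetime(2025, 8, 1, tzinfo=timezone.utc)
end = datetime(2025, 8, 8, tzinfo=timezone.utc)
for job_start, job_end in split_time_range(start, end, timedelta(days=1)):
    print(job_start.isoformat(), "->", job_end.isoformat())
```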
Prerequisites and architecture
This section outlines the system architecture and the requirements for using the Data Export API. Use this information to verify that your environment is correctly configured.
Before you begin
Before using the Data Export API, complete these prerequisite steps to set up your Google Cloud Storage destination and grant the necessary permissions.
Grant permissions to the API user: To use the Data Export API, you need one of the following IAM roles:
- Chronicle administrator (for creating and managing jobs): Grants full permissions to create, update, cancel, and view export jobs using the API.
- Chronicle Viewer: Grants read-only access to view job configurations and history using the API.
Create a Google Cloud Storage bucket: In your Google Cloud project, create a new Google Cloud Storage bucket (the destination for your exported data) in the same region as your Google SecOps tenant. Make it private to prevent unauthorized access. For details, see Create a bucket.
Grant permissions to the Service Account: Grant the Google SecOps Service Account, which is linked to your Google SecOps tenant, the necessary IAM roles to write data to your bucket.
Call the FetchServiceAccountForDataExport API endpoint to identify your Google SecOps instance's unique Service Account. The API returns the Service Account email.
Example request:
{
  "parent": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
}
Example response:
{
  "service_account_email": "service-1234@gcp-sa-chronicle.iam.gserviceaccount.com"
}
Grant the Google SecOps Service Account principal the following IAM roles for the destination Google Cloud Storage bucket. These roles let the Google SecOps service write exported data files to your Google Cloud Storage bucket:
- Storage object administrator (roles/storage.objectAdmin)
- Legacy bucket reader (roles/storage.legacyBucketReader)
For details, see Grant access to the Google SecOps Service Account. A sketch of granting these roles programmatically appears after this list.
Complete authentication: The Data Export API authenticates your calls. To set up authentication, follow the Google SecOps API authentication documentation.
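The following is a minimal sketch of granting the two bucket roles to the fetched Service Account, assuming you use the google-cloud-storage Python client library and have permission to modify the bucket's IAM policy; the bucket name and Service Account email are placeholders.

```python
# Sketch: grant the Google SecOps Service Account the bucket roles listed above.
# Assumes the google-cloud-storage client library and Application Default
# Credentials; the bucket name and Service Account email are placeholders.
from google.cloud import storage

BUCKET_NAME = "dataexport-test-bucket"  # your destination bucket
SECOPS_SA = "service-1234@gcp-sa-chronicle.iam.gserviceaccount.com"  # from the fetch call

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

# Read the current IAM policy, append the two role bindings, and write it back.
policy = bucket.get_iam_policy(requested_policy_version=3)
for role in ("roles/storage.objectAdmin", "roles/storage.legacyBucketReader"):
    policy.bindings.append({"role": role, "members": {f"serviceAccount:{SECOPS_SA}"}})
bucket.set_iam_policy(policy)

print(f"Granted export roles to {SECOPS_SA} on gs://{BUCKET_NAME}")
```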
Key use cases
The Data Export API provides a suite of endpoints to create data export jobs and manage the entire lifecycle of bulk data export. You perform all interactions using API calls.
The following use cases describe how to create, monitor, and manage data export jobs.
Core workflow
This section explains how to manage the lifecycle of your export jobs.
Create a new data export job
The system stores data export job specifications on the parent resource, the Google SecOps instance. This instance is the source of the log data for the export job.
Identify the unique Service Account for your Google SecOps instance. For details, see FetchServiceAccountForDataExports.
To start a new export, send a POST request to the dataExports.create endpoint. For details, see the CreateDataExport endpoint.
Monitor data export job status
View data export job details and status for a specific export job, or set a filter to view certain types of jobs.
To view a specific export job, see GetDataExport.
To list certain types of data export jobs using a filter, see ListDataExport.
Manage queued jobs
You can modify or cancel a job when it is in the IN_QUEUE status.
To change parameters (such as the time range, list of log types, or the destination bucket), see UpdateDataExport.
To cancel a queued job, see CancelDataExport.
Data Export API reference
After you fulfill the prerequisites, you can begin using the Data Export API.
The following sections describe the Chronicle Data Export API endpoints.
FetchServiceAccountForDataExports
Use this API endpoint to identify your Google SecOps instance's unique Service Account.
For details, see Method: dataExports.fetchServiceAccountForDataExport
Request
Endpoint: GET https://chronicle.{region}.rep.googleapis.com/v1alpha/{parent}/dataExports:fetchServiceAccountForDataExport
Path parameters
Field | Type | Required | Description |
---|---|---|---|
parent | string | required | The Google SecOps instance to export data from, specified in the following format: projects/{project}/locations/{region}/instances/{instance}, where {project} is the identifier of your project, {region} is the region where your destination bucket is located (see the list of regions), and {instance} is the identifier of the source Google SecOps instance. |
Request body
The request body must be empty.
Sample request
GET https://chronicle.us.rep.googleapis.com/v1alpha/projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports:fetchServiceAccountForDataExport
Sample response
The API returns the Service Account email.
{
"serviceAccountEmail": "service-1234@gcp-sa-chronicle.iam.gserviceaccount.com"
}
Response parameters
Parameter | Type | Description |
---|---|---|
serviceAccountEmail | string | The Service Account email linked to your Google SecOps tenant. |
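The following is a minimal Python sketch of calling this endpoint with the requests library, assuming Application Default Credentials with the cloud-platform scope are sufficient in your environment; the project, region, and instance values are placeholders.

```python
# Sketch: call fetchServiceAccountForDataExport with an OAuth access token.
# Assumes Application Default Credentials (the scope below is an assumption);
# the region, project, and instance values are placeholders.
import google.auth
import google.auth.transport.requests
import requests

REGION = "us"
PARENT = ("projects/myproject/locations/us/"
          "instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee")

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())

url = (f"https://chronicle.{REGION}.rep.googleapis.com/v1alpha/"
       f"{PARENT}/dataExports:fetchServiceAccountForDataExport")
resp = requests.get(url, headers={"Authorization": f"Bearer {credentials.token}"})
resp.raise_for_status()
print(resp.json()["serviceAccountEmail"])
```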
CreateDataExport
Use this endpoint to create a specification for a bulk data export job. The system stores the job specification on the parent resource — the Google SecOps instance containing the source log data.
The API exports data using the First-In, First-Out (FIFO) principle, independent of data size.
For details, see Method: dataExports.create
Request
Endpoint: POST https://chronicle.{region}.rep.googleapis.com/v1alpha/{parent}/dataExports
Path parameters
Field | Type | Required | Description |
---|---|---|---|
parent | string | required | The Google SecOps instance to export data from, specified in the following format: projects/{project}/locations/{region}/instances/{instance}, where {project} is the identifier of your project, {region} is the region where your destination bucket is located (see the list of regions), and {instance} is the identifier of the source Google SecOps instance. |
Request body
Post a request to the endpoint using the following parameters.
Sample request
POST https://chronicle.us.rep.googleapis.com/v1alpha/
projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports
{
"startTime": "2025-08-01T00:00:00Z",
"endTime": "2025-08-02T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
"includeLogTypes": [
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
]
}
Body parameters
Field | Type | Required | Description |
---|---|---|---|
startTime | string | optional | The beginning of the event time range of the data to export, based on the event timestamp. Format: String in google.protobuf.Timestamp format. If you don't specify a time, it defaults to 01/01/1970 UTC. |
endTime | string | optional | The end of the event time range of the data to export. Format: String in google.protobuf.Timestamp format. If you don't specify a time, it defaults to the current time. |
gcsBucket | string | required | The path to your Google Cloud Storage destination bucket, in the following format: /projects/{project-id}/buckets/{bucket-name}. Note: The destination bucket must be in the same region as the source Google SecOps tenant. |
includeLogTypes | array | optional | A comma-separated array of one or more log types you want to export. If not specified, the system exports all log types by default. |
Sample response
Upon successful creation of the data export job, the API returns a unique name for the data export job and the job's initial status, which is IN_QUEUE. The response also includes an estimatedVolume of the data that the system expects to export.
{
"name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
"startTime": "2025-08-01T00:00:00Z",
"endTime": "2025-08-02T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
"includeLogTypes": [
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
],
"dataExportStatus": {
"stage": "IN_QUEUE"
},
"estimatedVolume": "10737418240",
"createTime": "2025-08-13T11:00:00Z",
"updateTime": "2025-08-13T11:00:00Z"
}
Response parameters
Parameter | Type | Description |
---|---|---|
name | string | Unique data export job ID. The system extracts the dataExportId from the last section of the name parameter. Use this UUID in other calls to represent the data export request. |
startTime | string | Starting time range. Format: String in google.protobuf.Timestamp format. |
endTime | string | Ending time range. Format: String in google.protobuf.Timestamp format. |
gcsBucket | string | The path to your Google Cloud Storage destination bucket, in the following format: /projects/{project-id}/buckets/{bucket-name}. |
includeLogTypes | list | A comma-separated list of log types included. |
dataExportStatus.stage | string | The status of the export job at the time of creation (always IN_QUEUE). |
estimatedVolume | string | The estimated export volume in bytes. |
createTime | string | Job creation time. Format: String in google.protobuf.Timestamp format. |
updateTime | string | Job update time. Format: String in google.protobuf.Timestamp format. |
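As an illustration, the following Python sketch creates an export job with the requests library, assuming Application Default Credentials with the cloud-platform scope; all resource names are placeholders.

```python
# Sketch: create a data export job with the dataExports.create endpoint.
# Assumes Application Default Credentials (scope is an assumption);
# resource names and the bucket path are placeholders.
import google.auth
import google.auth.transport.requests
import requests

REGION = "us"
PARENT = ("projects/myproject/locations/us/"
          "instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee")

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())

body = {
    "startTime": "2025-08-01T00:00:00Z",
    "endTime": "2025-08-02T00:00:00Z",
    "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
    "includeLogTypes": [f"{PARENT}/logTypes/GCP_DNS"],
}

url = f"https://chronicle.{REGION}.rep.googleapis.com/v1alpha/{PARENT}/dataExports"
resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {credentials.token}"})
resp.raise_for_status()
job = resp.json()
# The dataExportId is the last segment of the returned job name.
print(job["name"].split("/")[-1], job["dataExportStatus"]["stage"])
```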
GetDataExport
Retrieve a specific data export job's current status and details, using its dataExportId.
Request
Endpoint: GET https://chronicle.{region}.rep.googleapis.com/v1alpha/{name}
Path parameters
Field | Type | Required | Description |
---|---|---|---|
name | string | required | The name of the data export job to retrieve, in the following format: projects/{project}/locations/{region}/instances/{instance}/dataexports/{dataExportId}, where {project} is the identifier of your project, {region} is the region where your destination bucket is located (see the list of regions), {instance} is the identifier of the source Google SecOps instance, and {dataExportId} is the UUID of the data export job. |
Request body
The request body must be empty.
Sample request
GET https://chronicle.us.rep.googleapis.com/v1alpha/
projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d
Sample response
The response contains the full job details, including its current status. For completed jobs, the response also returns the actual volume of data successfully exported.
{
"name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
"startTime": "2025-08-01T00:00:00Z",
"endTime": "2025-08-02T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
"includeLogTypes": [
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
],
"dataExportStatus": {
"stage": "FINISHED_SUCCESS",
"exportedGlobPatterns": [
"/bigstore/<bucket>/<dataexportid>/exported_paths.txt"
]
},
"estimatedVolume": "10737418240",
"exportedVolume": "10938428241",
"createTime": "2025-08-13T11:00:00Z",
"updateTime": "2025-08-13T11:05:00Z"
}
Response parameters
Parameter | Type | Description |
---|---|---|
name | string | Unique name for a data export job. |
startTime | string | Starting time range. |
endTime | string | Ending time range. |
gcsBucket | string | The path to your Google Cloud Storage destination bucket, in the following format: /projects/{project-id}/buckets/{bucket-name}. |
includeLogTypes | list | A comma-separated list of included log types. |
dataExportStatus.stage | string | Current status of the data export job (for example, IN_QUEUE, PROCESSING, FINISHED_SUCCESS, FINISHED_FAILURE, or CANCELLED). For more details, see the DataExportStatus reference. |
dataExportStatus.exportedGlobPatterns | list | File path of the exported text file, containing a list of all the exported file shards (exported glob patterns) created in the destination bucket. |
estimatedVolume | string | The estimated export volume in bytes. |
exportedVolume | string | For completed jobs, the actual volume of data exported. |
createTime | string | Job creation time. |
updateTime | string | Job update time. |
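Because a job can run for many hours, a client typically polls this endpoint until the job reaches a terminal stage. The following Python sketch shows one way to do that, under the same Application Default Credentials assumption as the earlier examples; the job name is a placeholder.

```python
# Sketch: poll a data export job until it reaches a terminal stage.
# Assumes Application Default Credentials; the job name is a placeholder.
import time

import google.auth
import google.auth.transport.requests
import requests

REGION = "us"
JOB_NAME = ("projects/myproject/locations/us/"
            "instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/"
            "dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d")
TERMINAL_STAGES = {"FINISHED_SUCCESS", "FINISHED_FAILURE", "CANCELLED"}

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
auth_request = google.auth.transport.requests.Request()

url = f"https://chronicle.{REGION}.rep.googleapis.com/v1alpha/{JOB_NAME}"
while True:
    credentials.refresh(auth_request)  # keep the token fresh across long polls
    resp = requests.get(url, headers={"Authorization": f"Bearer {credentials.token}"})
    resp.raise_for_status()
    stage = resp.json()["dataExportStatus"]["stage"]
    print("stage:", stage)
    if stage in TERMINAL_STAGES:
        break
    time.sleep(300)  # jobs can run for hours, so poll infrequently
```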
ListDataExport
List data export jobs associated with a Google SecOps instance.
You can optionally filter the list to narrow results by the current job status (dataExportStatus.stage), the data export job's createTime, and the job name.
Request
Endpoint: GET https://chronicle.{region}.rep.googleapis.com/v1alpha/{parent}/dataExports{?queryParameters}
Path parameters
Field | Type | Required | Description |
---|---|---|---|
parent | string | required | The Google SecOps instance for which to list data export requests, specified in the following format: projects/{project}/locations/{region}/instances/{instance}, where {project} is the identifier of your project, {region} is the region where your destination bucket is located (see the list of regions), and {instance} is the identifier of the source Google SecOps instance. |
Query parameters
Append optional query parameters to the request to add filters to narrow the results.
You must encode query parameters in a UTF-8 format.
Field | Type | Required | Description |
---|---|---|---|
pageSize | integer | optional | The maximum number of export jobs to return. The response may return fewer results. If you don't specify it, the API returns a list of 10 jobs by default. The API can return a maximum of 100 jobs in a single request. |
pageToken | string | optional | A string value that the API returns in a paginated response, which you can use to retrieve the subsequent page. |
filter | string | optional | A filter expression over dataExportStatus.stage, createTime, and name that narrows the list of jobs. |
Example of query parameters in a UTF-8 encoded format
filter=(dataExportStatus.stage%3D%22FINISHED_SUCCESS%22
%20OR%20dataExportStatus.stage%3D%22CANCELLED%22)
%20AND%20createTime%3E%3D%222025-08-29T00%3A00%3A00Z%22
%20AND%20createTime%3C%3D%222025-09-09T00%3A00%3A00Z%22
%20AND%20name%3D%22projects%2F140410331797
%2Flocations%2Fus
%2Finstances%2Febdc4bb9-878b-11e7-8455-10604b7cb5c1
%2FdataExports%2Fed3f735d-3347-439a-9161-1d474407eae2
%22&pageSize=2
The example uses the following parameters:
Parameters | Description |
---|---|
filter= | Introduces the filter expression. |
(dataExportStatus.stage%3D%22FINISHED_SUCCESS%22%20OR%20dataExportStatus.stage%3D%22CANCELLED%22) | Creates a filter that includes only jobs whose dataExportStatus.stage value is "FINISHED_SUCCESS" OR "CANCELLED". |
%20AND%20 | Acts as a separator to introduce another filter using the AND operator. |
createTime%3E%3D%222025-08-29T00%3A00%3A00Z%22 | Creates a filter for createTime to be greater than or equal to "2025-08-29T00:00:00Z". |
createTime%3C%3D%222025-09-09T00%3A00%3A00Z%22 | Creates a filter for createTime to be less than or equal to "2025-09-09T00:00:00Z". |
name%3D%22projects%2F140410331797%2Flocations%2Fus%2Finstances%2Febdc4bb9-878b-11e7-8455-10604b7cb5c1%2FdataExports%2Fed3f735d-3347-439a-9161-1d474407eae2%22 | Creates a filter that restricts results to the job name "projects/140410331797/locations/us/instances/ebdc4bb9-878b-11e7-8455-10604b7cb5c1/dataExports/ed3f735d-3347-439a-9161-1d474407eae2". |
&pageSize=2 | Specifies the requested page size. |
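If you build the filter programmatically, a URL-encoding helper can produce a percent-encoded filter equivalent to the example above instead of encoding it by hand. The following Python sketch shows one way, using the standard urllib.parse module; the filter values match the example above.

```python
# Sketch: build a URL-encoded filter string for ListDataExport with urllib.
from urllib.parse import quote, urlencode

raw_filter = (
    '(dataExportStatus.stage="FINISHED_SUCCESS" '
    'OR dataExportStatus.stage="CANCELLED") '
    'AND createTime>="2025-08-29T00:00:00Z" '
    'AND createTime<="2025-09-09T00:00:00Z"'
)

# urlencode percent-encodes the filter value (spaces become %20 with quote_via=quote)
# and joins the query parameters with "&".
query = urlencode({"filter": raw_filter, "pageSize": 2}, quote_via=quote)
print(query)
```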
Request body
The request body must be empty.
Sample request
Send a request using optional query parameters to add filters that narrow the results, for example, the current job status (dataExportStatus.stage), the createTime of the data export job, and the job name.
GET https://chronicle.us.rep.googleapis.com/v1alpha/
projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports?filter=(dataExportStatus.stage%3D%22FINISHED_SUCCESS%22%20OR%20dataExportStatus.stage%3D%22CANCELLED%22)%20AND%20createTime%3E%3D%222025-08-29T00%3A00%3A00Z%22%20AND%20createTime%3C%3D%222025-09-09T00%3A00%3A00Z%22%20AND%20name%3D%22projects%2F140410331797%2Flocations%2Fus%2Finstances%2Febdc4bb9-878b-11e7-8455-10604b7cb5c1%2FdataExports%2Fed3f735d-3347-439a-9161-1d474407eae2%22&pageSize=2
Sample response
The response returns a paginated array of data export job objects that match the filter criteria. Each object contains full job details and the current job status. Completed jobs include the actual volume of data successfully exported.
{
"dataExports": [
{
"name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
"startTime": "2025-08-01T00:00:00Z",
"endTime": "2025-08-03T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
"includeLogTypes": [
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
],
"dataExportStatus": {
"stage": "CANCELLED"
},
"estimatedVolume": "10737418240",
"createTime": "2025-08-01T11:00:00Z",
"updateTime": "2025-08-13T11:10:00Z"
},
{
"name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/f1e2d3c4-b5a6-7890-1234-567890abcdef",
"startTime": "2025-08-03T00:00:00Z",
"endTime": "2025-08-04T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
"dataExportStatus": {
"stage": "FINISHED_SUCCESS",
"exportedGlobPatterns": [
"/bigstore/<bucket>/<dataexportid>/exported_paths.txt"
]
},
"estimatedVolume": "53687091200",
"exportedVolume": "54687091205",
"createTime": "2025-08-01T09:00:00Z",
"updateTime": "2025-08-13T10:30:00Z"
}
],
"nextPageToken": "aecg2S1w"
}
Response parameters
Parameter | Type | Description |
---|---|---|
dataExports | array | Array of data export job objects that match the specified filters. |
nextPageToken | string | A token (string) used to retrieve the subsequent page in a different request. |
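The following Python sketch lists jobs page by page using pageToken, under the same Application Default Credentials assumption as the earlier examples; the parent resource name is a placeholder.

```python
# Sketch: list export jobs page by page using pageToken.
# Assumes Application Default Credentials; resource names are placeholders.
import google.auth
import google.auth.transport.requests
import requests

REGION = "us"
PARENT = ("projects/myproject/locations/us/"
          "instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee")

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())

url = f"https://chronicle.{REGION}.rep.googleapis.com/v1alpha/{PARENT}/dataExports"
headers = {"Authorization": f"Bearer {credentials.token}"}

page_token = None
while True:
    params = {"pageSize": 100}
    if page_token:
        params["pageToken"] = page_token
    resp = requests.get(url, headers=headers, params=params)
    resp.raise_for_status()
    payload = resp.json()
    for job in payload.get("dataExports", []):
        print(job["name"], job["dataExportStatus"]["stage"])
    page_token = payload.get("nextPageToken")
    if not page_token:  # no more pages
        break
```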
UpdateDataExport
You can only modify an existing job's parameters when it's in the IN_QUEUE status.
Request
Endpoint: PATCH https://chronicle.{region}.rep.googleapis.com/v1alpha/{parent}/dataExports/{dataExportId}
Path parameters
Field | Type | Required | Description |
---|---|---|---|
parent | string | required | The parent resource containing this data export specification, in the following format: projects/{project}/locations/{region}/instances/{instance}, where {project} is the identifier of your project, {region} is the region where your destination bucket is located (see the list of regions), and {instance} is the identifier of the source Google SecOps instance. |
dataExportId | string | required | The job UUID to update. |
Request body
Send a PATCH request specifying the job's name.
Sample request
PATCH https://chronicle.us.rep.googleapis.com/v1alpha/
projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d
{
"name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
"endTime": "2025-08-03T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket2"
}
Body parameters
Field | Type | Required | Description |
---|---|---|---|
name | string | required | Unique name of the data export job to update, in the following format: projects/{project}/locations/{region}/instances/{instance}/dataexports/{dataExportId}, where {project} is the identifier of your project, {region} is the region where your destination bucket is located (see the list of regions), {instance} is the identifier of the source Google SecOps instance, and {dataExportId} is the UUID identifier of the data export job. |
startTime | google.protobuf.Timestamp | optional | The updated starting value of the time range for the export. |
endTime | google.protobuf.Timestamp | optional | The updated ending value of the time range for the export. |
gcsBucket | string | optional | The updated path to your Google Cloud Storage destination bucket, in the following format: /projects/{project-id}/buckets/{bucket-name}. Note: You must create the bucket in the same region as your Google SecOps tenant. |
includeLogTypes | array | optional | The updated, comma-separated list of one or more log types you want to export. If this field is included but the value is left blank, the system exports all log types by default. |
Sample response
If the request is successful, the API returns a confirmation of the update. The response contains the updated field values for the specified job name, along with an updated estimate of the data volume for the export.
{
"name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
"startTime": "2025-08-01T00:00:00Z",
"endTime": "2025-08-03T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket2",
"includeLogTypes": [
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS"
],
"dataExportStatus": {
"stage": "IN_QUEUE"
},
"estimatedVolume": "15737418240",
"createTime": "2025-08-13T12:00:00Z",
"updateTime": "2025-08-13T12:05:00Z"
}
Response parameters
Parameter | Type | Description |
---|---|---|
name | string | The unique name of the updated data export job. |
startTime | string | The updated starting time range. |
endTime | string | The updated ending time range. |
gcsBucket | string | The updated path to your Google Cloud Storage destination bucket, in the following format: /projects/{project-id}/buckets/{bucket-name}. |
includeLogTypes | list | The updated comma-separated list of included log types. |
dataExportStatus.stage | string | The status of the export job at the time of update (always IN_QUEUE). |
estimatedVolume | string | The updated estimated export volume in bytes. |
createTime | string | The original job creation time. |
updateTime | string | The job update time. |
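The following Python sketch sends the PATCH request shown above with the requests library, under the same Application Default Credentials assumption; the job name and bucket are placeholders.

```python
# Sketch: update a queued export job with a PATCH request.
# Assumes Application Default Credentials; the job name and bucket are placeholders.
import google.auth
import google.auth.transport.requests
import requests

REGION = "us"
JOB_NAME = ("projects/myproject/locations/us/"
            "instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/"
            "dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d")

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())

# Only the fields being changed are sent, along with the job name.
body = {
    "name": JOB_NAME,
    "endTime": "2025-08-03T00:00:00Z",
    "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket2",
}

url = f"https://chronicle.{REGION}.rep.googleapis.com/v1alpha/{JOB_NAME}"
resp = requests.patch(url, json=body,
                      headers={"Authorization": f"Bearer {credentials.token}"})
resp.raise_for_status()
print(resp.json()["estimatedVolume"])  # updated volume estimate for the new range
```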
CancelDataExport
You can only cancel an existing job when it is in the IN_QUEUE status.
Reference Documentation: Method: dataExports.cancel
Request
Endpoint: POST https://chronicle.{region}.rep.googleapis.com/v1alpha/{name}:cancel
Path parameters
Field | Type | Required | Description |
---|---|---|---|
name | string | required | The name of the data export job to cancel, in the following format: projects/{project}/locations/{region}/instances/{instance}/dataexports/{id}, where {project} is the identifier of your project, {region} is the region where your destination bucket is located (see the list of regions), {instance} is the identifier of the source Google SecOps instance, and {id} is the UUID identifier of the data export request. |
Request body
The request body must be empty.
Sample request
POST https://chronicle.us.rep.googleapis.com/v1alpha/
projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d:cancel
Sample response
A successful response shows the job's status as CANCELLED.
{
"name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
"startTime": "2025-08-01T00:00:00Z",
"endTime": "2025-08-02T00:00:00Z",
"gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
"includeLogTypes": [
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
"projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
],
"dataExportStatus": {
"stage": "CANCELLED"
},
"estimatedVolume": "10737418240",
"createTime": "2025-08-13T11:00:00Z",
"updateTime": "2025-08-13T11:10:00Z"
}
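The following Python sketch cancels a queued job by calling the :cancel endpoint, under the same Application Default Credentials assumption as the earlier examples; the job name is a placeholder.

```python
# Sketch: cancel a queued export job.
# Assumes Application Default Credentials; the job name is a placeholder.
import google.auth
import google.auth.transport.requests
import requests

REGION = "us"
JOB_NAME = ("projects/myproject/locations/us/"
            "instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/"
            "dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d")

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())

url = f"https://chronicle.{REGION}.rep.googleapis.com/v1alpha/{JOB_NAME}:cancel"
resp = requests.post(url, json={},  # the cancel request body is empty
                     headers={"Authorization": f"Bearer {credentials.token}"})
resp.raise_for_status()
print(resp.json()["dataExportStatus"]["stage"])  # expected: CANCELLED
```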
Troubleshooting common issues
The API provides detailed error messages to help diagnose problems.
Canonical Code | Error Message |
---|---|
INVALID_ARGUMENT | INVALID_REQUEST: Invalid request parameter <Parameter1, Parameter2,..>. Please fix the request parameters and try again. |
NOT_FOUND | BUCKET_NOT_FOUND: The destination Google Cloud Storage bucket <bucketName> does not exist. Please create the destination Google Cloud Storage bucket and try again. |
NOT_FOUND | REQUEST_NOT_FOUND: The dataExportId:<dataExportId> does not exist. Please add a valid dataExportId and try again. |
FAILED_PRECONDITION | BUCKET_INVALID_REGION: The Google Cloud Storage bucket <bucketId>'s region:<region1> is not the same region as the SecOps tenant region:<region2>. Please create the Google Cloud Storage bucket in the same region as SecOps tenant and try again. |
FAILED_PRECONDITION | INSUFFICIENT_PERMISSIONS: The Service Account <P4SA> does not have storage.objects.create , storage.objects.get and storage.buckets.get permissions on the destination Google Cloud Storage bucket <bucketName>. Please provide the required access to the Service Account and try again. |
FAILED_PRECONDITION | INVALID_UPDATE: The request status is in the <status> stage and can't be updated. You can only update the request if the status is in the IN_QUEUE stage. |
FAILED_PRECONDITION | INVALID_CANCELLATION: The request status is in the <status> stage and can't be cancelled. You can only cancel the request if the status is in the IN_QUEUE stage. |
RESOURCE_EXHAUSTED | CONCURRENT_REQUEST_LIMIT_EXCEEDED: Maximum concurrent requests limit <limit> reached for the request size <sizelimit>. Please wait for the existing requests to complete and try again. |
RESOURCE_EXHAUSTED | REQUEST_SIZE_LIMIT_EXCEEDED: The estimated export volume: <estimatedVolume> for the request is greater than maximum allowed export volume: <allowedVolume> per request. Please try again with a request within the allowed export volume limit. |
INTERNAL | INTERNAL_ERROR: An Internal error occurred. Please try again. |