Export and read your carbon footprint using an API
Carbon Footprint does not have a dedicated public API.
However, you can export your carbon footprint via the BigQuery Data Transfer Service API
and then query the data using the BigQuery API.
Using the BigQuery Data Transfer Service API
To call the BigQuery Data Transfer Service API, you can use the provided
client libraries or call the
REST API directly.
The documentation below describes how to create Carbon Footprint
transfer configs and backfills using the REST API. However, for
convenience you may prefer to make the equivalent API calls using the client
library in your language of choice.
Create an export via REST API
Call the transferConfigs.create endpoint
of the BigQuery Data Transfer Service API to create a transfer, using the following
payload:

{
  "dataSourceId": "61cede5a-0000-2440-ad42-883d24f8f7b8",
  "displayName": "NAME",
  "params": {
    "billing_accounts": "BILLING_ACCOUNT_IDS"
  },
  "destinationDatasetId": "DATASET"
}

Replace:

- NAME with your transfer config name. For example: "Company Carbon Report"
- BILLING_ACCOUNT_IDS with your billing account ID. This value can be a
  comma-separated list of billing account IDs. For example:
  XXXXXX-XXXXXX-XXXXXX,XXXXXX-XXXXXX-XXXXXX
- DATASET with the destination BigQuery dataset ID in the
  current project. For example: company_carbon_report
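As a sketch, the payload above can be assembled programmatically before sending it to the transferConfigs.create endpoint. The helper function name is illustrative, not part of any Google API; the dataSourceId is the fixed identifier for the Carbon Footprint data source shown above:

```python
import json

def build_transfer_config(name, billing_account_ids, dataset):
    """Build the transferConfigs.create payload for a Carbon Footprint export.

    The dataSourceId below is the fixed Carbon Footprint data source ID;
    the other fields come from the replacements described above.
    """
    return {
        "dataSourceId": "61cede5a-0000-2440-ad42-883d24f8f7b8",
        "displayName": name,
        "params": {"billing_accounts": billing_account_ids},
        "destinationDatasetId": dataset,
    }

payload = build_transfer_config(
    "Company Carbon Report",
    "XXXXXX-XXXXXX-XXXXXX",
    "company_carbon_report",
)
print(json.dumps(payload, indent=2))
```

This dict can then be sent as the body of an authenticated POST to the transferConfigs.create endpoint, or passed field-by-field to a TransferConfig object if you use the google-cloud-bigquery-datatransfer client library instead.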
After the transfer config is created, carbon data is automatically exported
on the 15th of each month for all future months. To export historical data,
run a backfill as described below.
Run a backfill of historical data via REST API
To export historical carbon data for an existing transfer config,
request a backfill on that config.
To create a backfill, send a POST request to the
transferConfigs.startManualRuns
endpoint, using the identifier of the transfer created in the previous step
(for example, projects/0000000000000/locations/us/transferConfigs/00000000-0000-0000-0000-000000000000)
and the following payload:

{
  "requestedTimeRange": {
    "startTime": "START_TIME",
    "endTime": "END_TIME"
  }
}

Where:

- START_TIME is a timestamp
  that specifies the start time of the range to backfill.
  For example: 2021-02-15T00:00:00Z.
  Note that February 15, 2021 is the earliest date you can specify here,
  as that run contains the January 2021 data.
- END_TIME is a timestamp
  that specifies the end time of the range to backfill.
  For example: 2022-09-15T00:00:00Z.
  You can use the current date.
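A minimal sketch of building and validating this payload, enforcing the earliest-date constraint described above (the function name is illustrative, not a Google API):

```python
from datetime import datetime, timezone

# Earliest allowed backfill start: the 2021-02-15 run contains January 2021 data.
EARLIEST = datetime(2021, 2, 15, tzinfo=timezone.utc)

def build_backfill_request(start, end):
    """Build the startManualRuns payload for a UTC time range.

    Raises ValueError if the range starts before February 15, 2021,
    or if the end precedes the start.
    """
    if start < EARLIEST:
        raise ValueError("start must be on or after 2021-02-15T00:00:00Z")
    if end < start:
        raise ValueError("end must not precede start")
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return {
        "requestedTimeRange": {
            "startTime": start.strftime(fmt),
            "endTime": end.strftime(fmt),
        }
    }

req = build_backfill_request(
    datetime(2021, 2, 15, tzinfo=timezone.utc),
    datetime(2022, 9, 15, tzinfo=timezone.utc),
)
print(req["requestedTimeRange"]["startTime"])
```

The resulting dict is the body of the POST to the startManualRuns endpoint for your transfer config's resource name.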
Query an existing export via API
Before querying the API, write a SQL query that returns the desired data from
the exported dataset. You can test the SQL query in the
BigQuery console.
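As a sketch, such a query could be assembled like this. The table and column names are assumptions based on a typical Carbon Footprint export; verify them against the actual schema of your exported dataset before use:

```python
# Assumptions (verify against your own export):
#   - project: your Google Cloud project ID
#   - dataset: the destination dataset from the transfer config
#   - table:   the table created by the Carbon Footprint export
project = "my-project"
dataset = "company_carbon_report"
table = "carbon_footprint"

# Example query: total emissions per month. The column names are
# assumptions; check the exported table's schema in the BigQuery console.
query = f"""
SELECT
  usage_month,
  SUM(carbon_footprint_total_kgCO2e.location_based) AS total_kgCO2e
FROM `{project}.{dataset}.{table}`
GROUP BY usage_month
ORDER BY usage_month
""".strip()

# With the google-cloud-bigquery client library installed and credentials
# configured, the query would be run roughly as:
#   from google.cloud import bigquery
#   client = bigquery.Client(project=project)
#   for row in client.query(query):
#       print(row.usage_month, row.total_kgCO2e)
print(query.splitlines()[0])
```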
After you have configured an export to BigQuery,
use the BigQuery API or the BigQuery client libraries to run the query.

What's next?

- Read an overview of the BigQuery APIs and libraries
- Learn more about running interactive and batch queries

Last updated 2025-08-25 UTC.