Prepare BigQuery datasets and tables
To create AML AI datasets in an instance, you need to stage data
in BigQuery within that Google Cloud project. The following sections
show one way of preparing these datasets and tables.
Create a BigQuery output dataset
Run the following command to create a dataset for the pipeline outputs in BigQuery. In the command, choose a name for BQ_OUTPUT_DATASET_NAME that contains only letters (uppercase or lowercase), numbers, and underscores. You cannot use hyphens.
Permissions required for this task
To perform this task, you must have been granted the following permissions:
Permissions
bigquery.datasets.create

bq mk \
    --location=LOCATION \
    --project_id=PROJECT_ID \
    BQ_OUTPUT_DATASET_NAME

To see the AML AI outputs, see AML output data model.
Create the BigQuery input dataset
Create a BigQuery input dataset. Later, you will upload your financial
institution's transaction data into this dataset.
Permissions required for this task
To perform this task, you must have been granted the following permissions:
Permissions
bigquery.datasets.create

bq mk \
    --location=LOCATION \
    --project_id=PROJECT_ID \
    BQ_INPUT_DATASET_NAME

Create the BigQuery input dataset tables and upload the transaction data
We provide the AML input data model schema in the following formats:
A single CSV file, aml-input-data-model.csv, with all tables included.
A single JSON file, aml-input-data-model.json, with all tables included.
Individual JSON files for each table: party.json, account_party_link.json,
transaction.json, risk_case_event.json, and party_supplementary_data.json.
We also provide the party registration table in JSON format, as
party_registration.json. You use this table later when you register parties
in order to create prediction results.
To download the JSON file for each table and use it to create the associated
BigQuery table by applying the schema, run the following command.
Permissions required for this task
To perform this task, you must have been granted the following permissions:
Permissions
bigquery.tables.create

for table in party_registration party account_party_link transaction risk_case_event party_supplementary_data interaction_event
do
  curl -O "https://cloud.google.com/financial-services/anti-money-laundering/docs/reference/schemas/${table}.json"
  bq mk --table --project_id PROJECT_ID BQ_INPUT_DATASET_NAME.$table $table.json
done
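To confirm that the tables were created with the expected schemas, you can list the dataset and inspect one table. This is a minimal check, assuming the bq CLI is authenticated and the placeholders are replaced with your values.

# List the tables created in the input dataset.
bq ls --project_id PROJECT_ID BQ_INPUT_DATASET_NAME

# Print one table's schema to compare against the downloaded JSON file.
bq show --schema --format=prettyjson PROJECT_ID:BQ_INPUT_DATASET_NAME.party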
Upload your financial institution's transaction data into the dataset tables.
For more information, see any of the
BigQuery quickstarts.
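For example, a minimal bq load invocation for the transaction table might look like the following. The file transactions.csv is a hypothetical local export, not a file this guide provides; adjust the source format and path to match your own data.

# Sketch: load a local CSV file into the transaction table.
# transactions.csv is a placeholder name for your own export.
bq load \
    --source_format=CSV \
    --skip_leading_rows=1 \
    PROJECT_ID:BQ_INPUT_DATASET_NAME.transaction \
    ./transactions.csv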
Grant access to the BigQuery datasets
The API automatically creates a service account in your project. The service
account needs access to the BigQuery input and output datasets.
For PROJECT_NUMBER, use the
project number associated with
PROJECT_ID. You can find the project
number on the IAM Settings page.
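If you prefer the command line, one way to look up the project number is with the gcloud CLI, assuming it is installed and authenticated:

# Print the project number for the given project ID.
gcloud projects describe PROJECT_ID --format="value(projectNumber)"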
Permissions required for this task
To perform this task, you must have been granted the following permissions:
Permissions
bigquery.datasets.setIamPolicy
bigquery.datasets.update
Install jq on your
development machine. If you cannot install jq on your development machine,
you can use Cloud Shell or one of the other methods for
granting access to a resource
found in the BigQuery documentation.
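Before running the commands below, you can confirm that jq is available on your PATH:

# Verify the jq installation.
jq --version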
Run the following commands to grant read access to the input dataset and its
tables.
# The BigQuery input dataset name. You created this dataset and
# uploaded the financial data into it in a previous step. This dataset should be
# stored in the Google Cloud project.
export BQ_INPUT_DATASET_NAME="BQ_INPUT_DATASET_NAME"

# The BigQuery tables in the input dataset. These tables should
# be part of the same project as the intended instance.
# Make sure to replace each table variable with the appropriate table name.
export PARTY_TABLE="PARTY_TABLE"
export ACCOUNT_PARTY_LINK_TABLE="ACCOUNT_PARTY_LINK_TABLE"
export TRANSACTION_TABLE="TRANSACTION_TABLE"
export RISK_CASE_EVENT_TABLE="RISK_CASE_EVENT_TABLE"
# Optional table
export PARTY_SUPPLEMENTARY_DATA_TABLE="PARTY_SUPPLEMENTARY_DATA_TABLE"
# Registered parties table
export PARTY_REGISTRATION_TABLE="PARTY_REGISTRATION_TABLE"

# Grant the API read access to the BigQuery dataset.
# Update the current access permissions on the BigQuery dataset and store them in a temp file.
# Note: This step requires jq as a dependency.
# If jq is not available, the file /tmp/mydataset.json may be created manually.
bq show --format=prettyjson "PROJECT_ID:BQ_INPUT_DATASET_NAME" | jq '.access+=[{"role":"READER","userByEmail":"service-PROJECT_NUMBER@gcp-sa-financialservices.iam.gserviceaccount.com"}]' > /tmp/mydataset.json

# Update the BigQuery dataset access permissions using the temp file.
bq update --source /tmp/mydataset.json "PROJECT_ID:BQ_INPUT_DATASET_NAME"

# Grant the API read access to each BigQuery table that is provided.
for TABLE in $PARTY_TABLE $TRANSACTION_TABLE $ACCOUNT_PARTY_LINK_TABLE $RISK_CASE_EVENT_TABLE $PARTY_SUPPLEMENTARY_DATA_TABLE $PARTY_REGISTRATION_TABLE; do
  [ -n "$TABLE" ] && bq add-iam-policy-binding \
    --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-financialservices.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer" \
    "PROJECT_ID:BQ_INPUT_DATASET_NAME.${TABLE}"
done
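To verify the grants took effect, you can re-read the dataset's access entries and a table's IAM policy. This is a sketch, assuming the variables above are still exported and the placeholders are replaced:

# Filter the dataset access list for the service account.
bq show --format=prettyjson "PROJECT_ID:BQ_INPUT_DATASET_NAME" | jq '.access[] | select((.userByEmail // "") | contains("gcp-sa-financialservices"))'

# Inspect the IAM policy on one of the tables.
bq get-iam-policy PROJECT_ID:BQ_INPUT_DATASET_NAME.${PARTY_TABLE}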
Run the following commands to grant write access to the output dataset.
# Note: This step requires jq as a dependency.
# If jq isn't available, the file /tmp/perms.json may be created manually.
bq show --format=prettyjson PROJECT_ID:BQ_OUTPUT_DATASET_NAME | jq '.access+=[{"role":"roles/bigquery.dataEditor","userByEmail":"service-PROJECT_NUMBER@gcp-sa-financialservices.iam.gserviceaccount.com"}]' > /tmp/perms.json

# Update the BigQuery dataset access permissions using the temp file.
bq update --source /tmp/perms.json PROJECT_ID:BQ_OUTPUT_DATASET_NAME
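As with the input dataset, you can confirm that the dataEditor entry was appended, assuming jq is available:

# Check that the service account now appears in the output dataset's access list.
bq show --format=prettyjson PROJECT_ID:BQ_OUTPUT_DATASET_NAME | jq '.access[] | select((.userByEmail // "") | contains("gcp-sa-financialservices"))'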
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-29 UTC."],[[["\u003cp\u003eAML AI datasets are created by staging data in BigQuery within a Google Cloud project, necessitating the creation of both input and output datasets.\u003c/p\u003e\n"],["\u003cp\u003eCreating the BigQuery output dataset requires the \u003ccode\u003ebigquery.datasets.create\u003c/code\u003e permission, and the dataset name must contain only letters, numbers, and underscores, excluding hyphens.\u003c/p\u003e\n"],["\u003cp\u003eCreating the BigQuery input dataset also requires the \u003ccode\u003ebigquery.datasets.create\u003c/code\u003e permission, and it's where a financial institution's transaction data will be inputted.\u003c/p\u003e\n"],["\u003cp\u003eThe AML input data model schema is provided in CSV and JSON formats, including individual JSON files for tables like party, account_party_link, transaction, risk_case_event, and party_supplementary_data, along with a party registration table in JSON format.\u003c/p\u003e\n"],["\u003cp\u003eThe service account automatically created by the API needs \u003ccode\u003ebigquery.datasets.setIamPolicy\u003c/code\u003e and \u003ccode\u003ebigquery.datasets.update\u003c/code\u003e permissions to access the BigQuery input and output datasets, specifically requiring read access for the input dataset and write access for the output dataset.\u003c/p\u003e\n"]]],[],null,["# Prepare BigQuery datasets and tables\n\nTo create AML AI datasets in an instance, you need to stage data\nin BigQuery within that Google Cloud project. The following sections\nshow one way of preparing these datasets and tables.\n\nCreate a BigQuery output dataset\n--------------------------------\n\nRun the following command to [create a dataset](/bigquery/docs/datasets) to be\nused to send the pipeline outputs to BigQuery. 
In the following\ncommand, select a name for\n\u003cvar class=\"edit\" scope=\"BQ_OUTPUT_DATASET_NAME\" translate=\"no\"\u003eBQ_OUTPUT_DATASET_NAME\u003c/var\u003e\nthat contains only letters (uppercase or lowercase), numbers, and underscores.\n*You cannot use hyphens*.\n\n#### Permissions required for this task\n\nTo perform this task, you must have been granted the following permissions:\n\n**Permissions**\n\n- `bigquery.datasets.create` \n\n### bash\n\n bq mk \\\n --location=\u003cvar translate=\"no\"\u003eLOCATION\u003c/var\u003e \\\n --project_id=\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e \\\n \u003cvar translate=\"no\"\u003eBQ_OUTPUT_DATASET_NAME\u003c/var\u003e\n\n### powershell\n\n bq mk `\n --location=\u003cvar translate=\"no\"\u003e\u003cspan class=\"devsite-syntax-n\"\u003eLOCATION\u003c/span\u003e\u003c/var\u003e `\n --project_id=\u003cvar translate=\"no\"\u003e\u003cspan class=\"devsite-syntax-n\"\u003ePROJECT_ID\u003c/span\u003e\u003c/var\u003e `\n \u003cvar translate=\"no\"\u003e\u003cspan class=\"devsite-syntax-n\"\u003eBQ_OUTPUT_DATASET_NAME\u003c/span\u003e\u003c/var\u003e\n\nTo see the AML AI outputs, see [AML output data model](/financial-services/anti-money-laundering/docs/reference/schemas/aml-output-data-model).\n\nCreate the BigQuery input dataset\n---------------------------------\n\nCreate a BigQuery input dataset. Later, you will input your\nfinancial institution's transaction data into this dataset.\n\n#### Permissions required for this task\n\nTo perform this task, you must have been granted the following permissions:\n\n**Permissions**\n\n- `bigquery.datasets.create` \n\n### gcloud\n\n bq mk \\\n --location=\u003cvar translate=\"no\"\u003eLOCATION\u003c/var\u003e \\\n --project_id=\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e \\\n \u003cvar translate=\"no\"\u003eBQ_INPUT_DATASET_NAME\u003c/var\u003e\n\n### Powershell\n\n bq mk `\n --location=\u003cvar translate=\"no\"\u003e\u003cspan class=\"devsite-syntax-n\"\u003eLOCATION\u003c/span\u003e\u003c/var\u003e `\n --project_id=\u003cvar translate=\"no\"\u003e\u003cspan class=\"devsite-syntax-n\"\u003ePROJECT_ID\u003c/span\u003e\u003c/var\u003e `\n \u003cvar translate=\"no\"\u003e\u003cspan class=\"devsite-syntax-n\"\u003eBQ_INPUT_DATASET_NAME\u003c/span\u003e\u003c/var\u003e\n\nCreate the BigQuery input dataset tables and upload the transaction data\n------------------------------------------------------------------------\n\nWe provide the [AML input data model](/financial-services/anti-money-laundering/docs/reference/schemas/aml-input-data-model) schema in the following formats:\n\n- A single CSV file [`aml-input-data-model.csv`](/static/financial-services/anti-money-laundering/docs/reference/schemas/aml-input-data-model.csv) with all tables included\n- A single JSON file [`aml-input-data-model.json`](/static/financial-services/anti-money-laundering/docs/reference/schemas/aml-input-data-model.json) with all tables included\n- Individual JSON files for each table:\n - [`party.json`](/static/financial-services/anti-money-laundering/docs/reference/schemas/party.json)\n - [`account_party_link.json`](/static/financial-services/anti-money-laundering/docs/reference/schemas/account_party_link.json)\n - [`transaction.json`](/static/financial-services/anti-money-laundering/docs/reference/schemas/transaction.json)\n - [`risk_case_event.json`](/static/financial-services/anti-money-laundering/docs/reference/schemas/risk_case_event.json)\n - 
[`party_supplementary_data.json`](/static/financial-services/anti-money-laundering/docs/reference/schemas/party_supplementary_data.json)\n\nWe provide the\n[party registration table](/financial-services/anti-money-laundering/docs/register-parties#prepare-party-registration-tables)\nin JSON format. You use this table later when you register parties in order to\ncreate prediction results.\n\n- [`party_registration.json`](/static/financial-services/anti-money-laundering/docs/reference/schemas/party_registration.json)\n\nTo download the JSON file for each table and use it to create the associated\nBigQuery table by applying the schema, run the following\ncommand.\n\n#### Permissions required for this task\n\nTo perform this task, you must have been granted the following permissions:\n\n**Permissions**\n\n- `bigquery.datasets.create` \n\n for table in party_registration party account_party_link transaction risk_case_event party_supplementary_data interaction_event\n do\n curl -O \"https://cloud.google.com/financial-services/anti-money-laundering/docs/reference/schemas/${table}.json\"\n bq mk --table --project_id \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e \u003cvar translate=\"no\"\u003eBQ_INPUT_DATASET_NAME\u003c/var\u003e.$table $table.json\n done\n\nUpload your financial institution's transaction data into the dataset tables.\nFor more information, see any of the\n[BigQuery quickstarts](/bigquery/docs/quickstarts).\n\nGrant access to the BigQuery datasets\n-------------------------------------\n\nThe API automatically creates a service account in your project. The service\naccount needs access to the BigQuery input and output datasets.\n\nFor \u003cvar class=\"edit\" scope=\"PROJECT_NUMBER\" translate=\"no\"\u003ePROJECT_NUMBER\u003c/var\u003e, use the\nproject number associated with\n\u003cvar class=\"edit\" scope=\"PROJECT_ID\" translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e. You can find the project\nnumber on the [IAM Settings](https://console.cloud.google.com/iam-admin/settings) page.\n\n#### Permissions required for this task\n\nTo perform this task, you must have been granted the following permissions:\n\n**Permissions**\n\n- `bigquery.datasets.setIamPolicy`\n- `bigquery.datasets.update`\n\n1. Install [`jq`](https://jqlang.github.io/jq/download/) on your development machine. If you cannot install `jq` on your development machine, you can use Cloud Shell or one of the other methods for [granting access to a resource](/bigquery/docs/control-access-to-resources-iam#grant_access_to_a_resource) found in the BigQuery documentation.\n2. Run the following commands to grant read access to the input dataset and its\n tables.\n\n # The BigQuery input dataset name. You created this dataset and\n # uploaded the financial data into it in a previous step. This dataset should be\n # stored in the Google Cloud project.\n\n export BQ_INPUT_DATASET_NAME=\"\u003cvar translate=\"no\"\u003eBQ_INPUT_DATASET_NAME\u003c/var\u003e\"\n\n # The BigQuery tables in the input dataset. 
These tables should\n # be part of the same project as the intended instance.\n # Make sure to replace each table variable with the appropriate table name.\n export PARTY_TABLE=\"\u003cvar translate=\"no\"\u003ePARTY_TABLE\u003c/var\u003e\"\n export ACCOUNT_PARTY_LINK_TABLE=\"\u003cvar translate=\"no\"\u003eACCOUNT_PARTY_LINK_TABLE\u003c/var\u003e\"\n export TRANSACTION_TABLE=\"\u003cvar translate=\"no\"\u003eTRANSACTION_TABLE\u003c/var\u003e\"\n export RISK_CASE_EVENT_TABLE=\"\u003cvar translate=\"no\"\u003eRISK_CASE_EVENT_TABLE\u003c/var\u003e\"\n # Optional table\n export PARTY_SUPPLEMENTARY_DATA_TABLE=\"\u003cvar translate=\"no\"\u003ePARTY_SUPPLEMENTARY_DATA_TABLE\u003c/var\u003e\"\n # Registered parties table\n export PARTY_REGISTRATION_TABLE=\"\u003cvar translate=\"no\"\u003ePARTY_REGISTRATION_TABLE\u003c/var\u003e\"\n\n # Grant the API read access to the BigQuery dataset.\n # Update the current access permissions on the BigQuery dataset and store in a temp file.\n # Note: This step requires jq as a dependency.\n # If jq is not available, the file /tmp/mydataset.json may be created manually.\n bq show --format=prettyjson \"\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e:\u003cvar translate=\"no\"\u003eBQ_INPUT_DATASET_NAME\u003c/var\u003e\" | jq '.access+=[{\"role\":\"READER\",\"userByEmail\":\"service-\u003cvar translate=\"no\"\u003ePROJECT_NUMBER\u003c/var\u003e@gcp-sa-financialservices.iam.gserviceaccount.com\" }]'\u003e /tmp/mydataset.json\n # Update the BigQuery dataset access permissions using the temp file.\n bq update --source /tmp/mydataset.json \"\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e:\u003cvar translate=\"no\"\u003eBQ_INPUT_DATASET_NAME\u003c/var\u003e\"\n\n # Grant the API read access to the BigQuery table if the table is provided.\n for TABLE in $PARTY_TABLE $TRANSACTION_TABLE $ACCOUNT_PARTY_LINK_TABLE $RISK_CASE_EVENT_TABLE $PARTY_SUPPLEMENTARY_DATA_TABLE $PARTY_REGISTRATION_TABLE; do\n [ -n TABLE ] && bq add-iam-policy-binding \\\n --member=\"serviceAccount:service-\u003cvar translate=\"no\"\u003ePROJECT_NUMBER\u003c/var\u003e@gcp-sa-financialservices.iam.gserviceaccount.com\" --role=\"roles/bigquery.dataViewer\" \\\n \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e:\u003cvar translate=\"no\"\u003eBQ_INPUT_DATASET_NAME\u003c/var\u003e.${TABLE}\n done\n\n3. Run the following commands to grant write access to the output dataset.\n\n # Note: This step requires jq as a dependency.\n # If jq isn't available, the file /tmp/mydataset.json may be created manually.\n bq show --format=prettyjson \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e:\u003cvar translate=\"no\"\u003eBQ_OUTPUT_DATASET_NAME\u003c/var\u003e | jq '.access+=[{\"role\":\"roles/bigquery.dataEditor\",\"userByEmail\":\"service-\u003cvar translate=\"no\"\u003ePROJECT_NUMBER\u003c/var\u003e@gcp-sa-financialservices.iam.gserviceaccount.com\" }]'\u003e /tmp/perms.json\n\n bq update --source /tmp/perms.json \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e:\u003cvar translate=\"no\"\u003eBQ_OUTPUT_DATASET_NAME\u003c/var\u003e"]]