On September 15, 2026, all Cloud Composer 1 and Cloud Composer 2 version 2.0.x environments will reach their planned end of life, and you will not be able to use them. We recommend planning your migration to Cloud Composer 3.
This page explains how to connect to a Cloud SQL instance that runs
the Airflow database of your Cloud Composer
environment and run SQL queries.
For example, you might want to run queries directly on the Airflow database,
make database backups, gather statistics based on the database content, or
retrieve any other custom information from the database.
Before you begin
Run a SQL query on the Airflow database
To connect to the Airflow database:
1. Create a DAG with one or more SQLExecuteQueryOperator operators. To get started, you can use the example DAG below.
2. In the sql parameter of the operator, specify your SQL query.
3. Trigger the DAG. For example, you can trigger it manually or wait until it runs on its schedule.
Example DAG:
import datetime
import os

import airflow
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

SQL_DATABASE = os.environ["SQL_DATABASE"]

with airflow.DAG(
    "airflow_db_connection_example",
    start_date=datetime.datetime(2025, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    SQLExecuteQueryOperator(
        task_id="run_airflow_db_query",
        dag=dag,
        conn_id="airflow_db",
        database=SQL_DATABASE,
        sql="SELECT * FROM dag LIMIT 10;",
    )
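You can adapt the sql parameter to other use cases, such as gathering statistics from the database. The following is a minimal sketch, not part of the original example: it assumes the same airflow_db connection and queries the task_instance table of the Airflow metadata schema, and the DAG ID and task ID are illustrative placeholders.

import datetime
import os

import airflow
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

SQL_DATABASE = os.environ["SQL_DATABASE"]

with airflow.DAG(
    "airflow_db_stats_example",  # illustrative DAG ID
    start_date=datetime.datetime(2025, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Count task instances by state in the Airflow metadata database.
    SQLExecuteQueryOperator(
        task_id="task_instance_state_counts",
        conn_id="airflow_db",
        database=SQL_DATABASE,
        sql="SELECT state, COUNT(*) FROM task_instance GROUP BY state;",
    )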
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-19 UTC."],[[["\u003cp\u003eThis page details how to connect to and run SQL queries on the Airflow database of your Cloud Composer 1 environment.\u003c/p\u003e\n"],["\u003cp\u003eDirectly accessing the Airflow database is discouraged; the Airflow REST API or Airflow CLI commands are the recommended alternatives.\u003c/p\u003e\n"],["\u003cp\u003eConnecting to the Airflow database involves creating and uploading a DAG that utilizes the \u003ccode\u003ePostgresOperator\u003c/code\u003e to specify and run the SQL query.\u003c/p\u003e\n"],["\u003cp\u003eAvoid adding custom tables or modifying the schema of the existing Airflow database to prevent complications.\u003c/p\u003e\n"],["\u003cp\u003eBacking up the environment's data should be done with snapshots instead of dumping the database.\u003c/p\u003e\n"]]],[],null,[]]