Trigger the DAG. For example, you can trigger it manually, or wait until it runs at its scheduled time.

Example DAG:
import datetime
import os

import airflow
from airflow.providers.postgres.operators.postgres import PostgresOperator

SQL_DATABASE = os.environ["SQL_DATABASE"]

with airflow.DAG(
    "airflow_db_connection_example",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PostgresOperator(
        task_id="run_airflow_db_query",
        dag=dag,
        postgres_conn_id="airflow_db",
        database=SQL_DATABASE,
        sql="SELECT * FROM dag LIMIT 10;",
    )
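One way to trigger this DAG manually is through the gcloud CLI. This is a sketch, not the only method; the environment name `example-environment` and the region `us-central1` are hypothetical placeholders, while the DAG ID matches the example above.

```shell
# Trigger the DAG above manually via gcloud.
# "example-environment" and "us-central1" are placeholders;
# substitute your Cloud Composer environment's name and region.
gcloud composer environments run example-environment \
    --location us-central1 \
    dags trigger -- airflow_db_connection_example
```

Because the DAG is defined with `schedule_interval=None`, it only runs when triggered this way (or from the Airflow UI), which avoids unintended repeated queries against the Airflow database.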
Last updated: 2025-04-02 (UTC)