Over time, the Airflow database of your environment stores more and more data. This data includes information and logs related to past DAG runs, tasks, and other Airflow operations.
If the size of the Airflow database exceeds 20 GB, you cannot upgrade your environment to a later version or create environment snapshots.
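To stay under this limit, regularly remove old entries from the database. As a minimal sketch, assuming the `db trim` Airflow subcommand and its `--retention-days` argument are available in your Cloud Composer version, a manual cleanup that removes entries older than the retention period can be run through `gcloud composer environments run`:

    # Remove Airflow database entries older than RETENTION_DAYS days.
    # ENVIRONMENT_NAME, LOCATION, and RETENTION_DAYS are placeholders.
    gcloud composer environments run ENVIRONMENT_NAME \
        --location LOCATION \
        db trim -- --retention-days=RETENTION_DAYS

Where available, a database retention policy that automatically removes records older than a specified period is the preferred alternative to manual cleanup.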
Maintain database performance
Airflow database performance issues can lead to overall DAG execution problems. Observe the database CPU and memory usage statistics. If CPU and memory usage approach their limits, the database is overloaded and needs to be scaled up. The amount of resources available to the Airflow database is controlled by the environment size property of your environment. To scale the database up, change the environment size to a larger tier, as shown in the example below. Note that increasing the environment size also increases the cost of your environment.
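As a sketch, assuming the `--environment-size` flag of `gcloud composer environments update` accepts the tiers small, medium, and large (verify the supported values for your Cloud Composer version), switching to a larger tier could look like this:

    # Scale the environment, and with it the Airflow database, to a larger tier.
    # ENVIRONMENT_NAME and LOCATION are placeholders.
    gcloud composer environments update ENVIRONMENT_NAME \
        --location LOCATION \
        --environment-size large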
[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-06-16 (世界標準時間)。"],[[["This page outlines how to manually clean up the Airflow database in Cloud Composer 3 environments, as well as automatic alternatives."],["Cloud Composer offers a database retention policy that automatically removes records older than a specified period, and it is preferred over the older database cleanup DAG."],["Exceeding a 20 GB database size in Airflow prevents environment upgrades and snapshot creation, making regular cleanup essential."],["The `gcloud composer environments run` command can be used to manually trim the database, removing entries older than the specified retention period."],["Database performance issues can cause DAG execution problems, and scaling up the environment size can address these issues, while also using proper Xcom practices to avoid issues with the database."]]],[]]