Before the database retention policy was introduced in Cloud Composer, we recommended a different way to automate database cleanup: a database cleanup DAG. This approach is deprecated in Cloud Composer 3. The DAG performs redundant work, so you can remove it and replace it with the database retention policy, which reduces resource consumption.
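As a minimal sketch, the retention policy can typically be configured with a single `gcloud` update command. The `--airflow-database-retention-days` flag shown here is assumed from the beta `gcloud` track, and the environment name and location are placeholders; verify the flag against your installed `gcloud` version before running it.

```shell
# Sketch: enable the database retention policy so that Airflow records older
# than the retention window are removed automatically.
# "example-environment" and "us-central1" are placeholders; the
# --airflow-database-retention-days flag is assumed from the beta track.
gcloud beta composer environments update example-environment \
    --location us-central1 \
    --airflow-database-retention-days 30
```

After this is in place, the older database cleanup DAG can be paused and removed, since the policy covers the same records.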
Database size limit
Over time, your environment's Airflow database stores more and more data. This data includes information and logs related to past DAG runs, tasks, and other Airflow operations.
If the Airflow database size exceeds 20 GB, you cannot upgrade the environment to a later version.
If the Airflow database size exceeds 20 GB, you cannot create snapshots of the environment.
Maintain database performance
Airflow database performance problems can lead to overall DAG execution problems. Observe the database CPU and memory usage statistics. If CPU and memory utilization approach the limits, the database is overloaded and needs to be scaled up.
The amount of resources available to the Airflow database is controlled by the environment size property of your environment. To scale the database up, change the environment size to a larger tier. Note that increasing the environment size also increases the cost of your environment.
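Changing the environment size is likewise a single update command. A sketch, assuming the `--environment-size` flag of `gcloud composer environments update`; the environment name and location are placeholders:

```shell
# Sketch: move the environment to a larger size tier, which also increases
# the resources available to the Airflow database.
# "example-environment" and "us-central1" are placeholders.
gcloud composer environments update example-environment \
    --location us-central1 \
    --environment-size large
```

The operation takes a while to complete; check CPU and memory utilization again afterwards to confirm the database is no longer near its limits.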
Last updated (UTC): 2025-04-02.