Last updated: 2025-07-10 (UTC).

Key points for the 2.2 image version:

- The 2.2 image version includes a consistent set of components across multiple release dates, such as Apache Atlas 2.2.0, Apache Flink 1.17.0, Apache Hadoop 3.3.6, and Apache Spark 3.5.1.
- Components installed by default include Apache Hadoop, Hive, Pig, Spark, Tez, the BigQuery Connector, the Cloud Storage Connector, Conscrypt, Java, Python (conda 23.11.0 with Python 3.11), R, and Scala.
- Optional components include Apache Flink, Hive WebHCat, Hudi, Docker, JupyterLab Notebook, Ranger, Solr, Trino, Zeppelin Notebook, and Zookeeper.
- Initialization actions are provided for Apache Atlas, Kafka, Sqoop, Hue, and Oozie.
- Data Lineage and legacy agents are not available in the 2.2 image version; monitoring agent defaults are available only if the Ops Agent is installed, and certain Hudi procedures are unsupported on Cloud Storage.