- The Dataflow service's Dynamic Work Rebalancing feature automatically redistributes work among workers based on runtime conditions, such as work imbalances or workers taking varying amounts of time to process their input.
- Dynamic work rebalancing applies only to parallel data-processing stages, such as reading from an external source or working with a materialized `PCollection`, and is bounded by the number of elements or shards in those stages.
- For custom data sources, dynamic work rebalancing requires your source to implement specific methods: `splitAtFraction` in Java, or `try_split` and `position_at_fraction` in Python.
- Dynamic work rebalancing cannot subdivide a single record, so one record that processes much more slowly than the rest can delay the pipeline.
- Setting a fixed number of shards for your pipeline's output limits the parallelization that Dataflow can perform, and with it the effectiveness of dynamic work rebalancing.

Last updated 2025-03-24 UTC.
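The contract behind the Python methods named above can be sketched with a toy offset-based range tracker. This is an illustrative standalone class, not Beam's actual `OffsetRangeTracker`; the method names mirror Beam's `RangeTracker` API (`try_split`, `position_at_fraction`), but the class itself and its fields are assumptions for the sketch:

```python
class ToyOffsetRangeTracker:
    """Minimal sketch of the splitting contract Dataflow relies on
    for dynamic work rebalancing with a custom Python source."""

    def __init__(self, start, stop):
        self.start = start          # inclusive start offset of this worker's range
        self.stop = stop            # exclusive stop offset
        self.last_claimed = None    # last record offset the reader claimed

    def try_claim(self, position):
        """The reader calls this before emitting the record at `position`."""
        if position >= self.stop:
            return False
        self.last_claimed = position
        return True

    def position_at_fraction(self, fraction):
        """Map a fraction of the range to a concrete offset, so the service
        can ask to split at, say, 50% of the remaining work."""
        return self.start + int(fraction * (self.stop - self.start))

    def try_split(self, split_position):
        """Give up [split_position, stop) so the service can hand it to
        another worker. Fails if the reader has already passed that point."""
        if self.last_claimed is not None and split_position <= self.last_claimed:
            return None  # too late: that part of the range is already processed
        if split_position <= self.start or split_position >= self.stop:
            return None  # split point must fall strictly inside the range
        self.stop = split_position
        return split_position
```

For example, a tracker over offsets `[0, 100)` that has claimed offset 10 can still be split at the halfway point: `position_at_fraction(0.5)` yields 50, and `try_split(50)` shrinks the tracker's range to `[0, 50)`, freeing `[50, 100)` for another worker. A split at or before an already-claimed offset returns `None`, which is why a single slow record cannot be subdivided.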