Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
Optional. The arguments to pass to the Spark driver. Do not include arguments
that can be set as batch properties, such as --conf, because a collision
can occur and cause an incorrect batch submission.
Last updated 2025-04-09 UTC.

The `SparkRBatch` class in the `Google.Cloud.Dataproc.V1` namespace configures Apache SparkR batch workloads for the Google Cloud Dataproc v1 API. It implements the `IMessage`, `IEquatable`, `IDeepCloneable`, and `IBufferMessage` interfaces. Documented versions range from 5.0.0 to the latest, 5.17.0, with this page covering version 5.4.0. The class exposes the `ArchiveUris`, `Args`, `FileUris`, and `MainRFileUri` properties, and provides both a parameterless constructor and a copy constructor that takes another `SparkRBatch`.