Class HadoopJob (5.21.0)

HadoopJob(mapping=None, *, ignore_unknown_fields=False, **kwargs)

A Dataproc job for running Apache Hadoop
MapReduce <https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html>
jobs on Apache Hadoop
YARN <https://hadoop.apache.org/docs/r2.7.1/hadoop-yarn/hadoop-yarn-site/YARN.html>.
This message has oneof fields (mutually exclusive fields; see
<https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields>).
For each oneof, at most one member field can be set at the same time.
Setting any member of the oneof automatically clears all other
members.
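For illustration only, here is a minimal sketch of that oneof behavior in proto-plus; the jar URI below is a hypothetical placeholder:

```python
from google.cloud import dataproc_v1

job = dataproc_v1.HadoopJob()
job.main_class = "org.apache.hadoop.examples.WordCount"

# main_class and main_jar_file_uri belong to the same "driver" oneof,
# so setting one automatically clears the other.
job.main_jar_file_uri = "gs://example-bucket/wordcount.jar"  # hypothetical URI

print(job.main_class)         # "" (cleared by the assignment above)
print(job.main_jar_file_uri)  # "gs://example-bucket/wordcount.jar"
```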
Attributes
----------

main_jar_file_uri
    str
    The HCFS URI of the jar file containing the main class. Examples:
    'gs://foo-bucket/analytics-binaries/extract-useful-metrics-mr.jar'
    'hdfs:/tmp/test-samples/custom-wordcount.jar'
    'file:///home/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar'
    This field is a member of the driver oneof.

main_class
    str
    The name of the driver's main class. The jar file containing the
    class must be in the default CLASSPATH or specified in
    jar_file_uris.
    This field is a member of the driver oneof.

args
    MutableSequence[str]
    Optional. The arguments to pass to the driver. Do not include
    arguments, such as -libjars or -Dfoo=bar, that can be set as job
    properties, since a collision might occur that causes an incorrect
    job submission.

jar_file_uris
    MutableSequence[str]
    Optional. Jar file URIs to add to the CLASSPATHs of the Hadoop
    driver and tasks.

file_uris
    MutableSequence[str]
    Optional. HCFS (Hadoop Compatible Filesystem) URIs of files to be
    copied to the working directory of Hadoop drivers and distributed
    tasks. Useful for naively parallel tasks.

archive_uris
    MutableSequence[str]
    Optional. HCFS URIs of archives to be extracted in the working
    directory of Hadoop drivers and tasks. Supported file types:
    .jar, .tar, .tar.gz, .tgz, or .zip.

properties
    MutableMapping[str, str]
    Optional. A mapping of property names to values, used to configure
    Hadoop. Properties that conflict with values set by the Dataproc
    API might be overwritten. Can include properties set in
    /etc/hadoop/conf/*-site and classes in user code.
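To show how these fields fit together, here is a sketch of a HadoopJob message that runs the stock word-count example. The gs://example-bucket paths are placeholders, and the examples-jar path assumes a default Dataproc image where it is installed under /usr/lib/hadoop-mapreduce:

```python
from google.cloud import dataproc_v1

# Sketch only: bucket paths below are placeholders, not real resources.
hadoop_job = dataproc_v1.HadoopJob(
    main_jar_file_uri="file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar",
    args=[
        "wordcount",
        "gs://example-bucket/input/",
        "gs://example-bucket/wordcount-output/",
    ],
    jar_file_uris=["gs://example-bucket/libs/extra-format.jar"],
    file_uris=["gs://example-bucket/config/stopwords.txt"],
    archive_uris=["gs://example-bucket/archives/resources.zip"],
    properties={"mapreduce.job.reduces": "4"},
)
```

Because main_jar_file_uri is set, main_class is left unset (they share the driver oneof), and the input and output paths are passed through args rather than as -D properties, per the guidance above.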
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[],[],null,["# Class HadoopJob (5.21.0)\n\nVersion latestkeyboard_arrow_down\n\n- [5.21.0 (latest)](/python/docs/reference/dataproc/latest/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.20.0](/python/docs/reference/dataproc/5.20.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.18.1](/python/docs/reference/dataproc/5.18.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.17.1](/python/docs/reference/dataproc/5.17.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.16.0](/python/docs/reference/dataproc/5.16.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.15.1](/python/docs/reference/dataproc/5.15.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.14.0](/python/docs/reference/dataproc/5.14.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.13.0](/python/docs/reference/dataproc/5.13.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.12.0](/python/docs/reference/dataproc/5.12.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.10.2](/python/docs/reference/dataproc/5.10.2/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.9.3](/python/docs/reference/dataproc/5.9.3/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.8.0](/python/docs/reference/dataproc/5.8.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.7.0](/python/docs/reference/dataproc/5.7.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.6.0](/python/docs/reference/dataproc/5.6.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.5.1](/python/docs/reference/dataproc/5.5.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.4.3](/python/docs/reference/dataproc/5.4.3/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.3.0](/python/docs/reference/dataproc/5.3.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.2.0](/python/docs/reference/dataproc/5.2.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.1.0](/python/docs/reference/dataproc/5.1.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [5.0.3](/python/docs/reference/dataproc/5.0.3/google.cloud.dataproc_v1.types.HadoopJob)\n- [4.0.3](/python/docs/reference/dataproc/4.0.3/google.cloud.dataproc_v1.types.HadoopJob)\n- [3.3.2](/python/docs/reference/dataproc/3.3.2/google.cloud.dataproc_v1.types.HadoopJob)\n- [3.2.0](/python/docs/reference/dataproc/3.2.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [3.1.1](/python/docs/reference/dataproc/3.1.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [3.0.0](/python/docs/reference/dataproc/3.0.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [2.6.2](/python/docs/reference/dataproc/2.6.2/google.cloud.dataproc_v1.types.HadoopJob)\n- [2.5.0](/python/docs/reference/dataproc/2.5.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [2.4.0](/python/docs/reference/dataproc/2.4.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [2.3.1](/python/docs/reference/dataproc/2.3.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [2.2.0](/python/docs/reference/dataproc/2.2.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [2.0.2](/python/docs/reference/dataproc/2.0.2/google.cloud.dataproc_v1.types.HadoopJob)\n- [1.1.3](/python/docs/reference/dataproc/1.1.3/google.cloud.dataproc_v1.types.HadoopJob)\n- 
[1.0.1](/python/docs/reference/dataproc/1.0.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [0.8.2](/python/docs/reference/dataproc/0.8.2/google.cloud.dataproc_v1.types.HadoopJob)\n- [0.7.0](/python/docs/reference/dataproc/0.7.0/google.cloud.dataproc_v1.types.HadoopJob)\n- [0.6.1](/python/docs/reference/dataproc/0.6.1/google.cloud.dataproc_v1.types.HadoopJob)\n- [0.5.0](/python/docs/reference/dataproc/0.5.0/google.cloud.dataproc_v1.types.HadoopJob) \n\n HadoopJob(mapping=None, *, ignore_unknown_fields=False, **kwargs)\n\nA Dataproc job for running `Apache Hadoop\nMapReduce \u003chttps://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html\u003e`**jobs on `Apache Hadoop\nYARN \u003chttps://hadoop.apache.org/docs/r2.7.1/hadoop-yarn/hadoop-yarn-site/YARN.html\u003e`**.\n\nThis message has `oneof`_ fields (mutually exclusive fields).\nFor each oneof, at most one member field can be set at the same time.\nSetting any member of the oneof automatically clears all other\nmembers.\n\n.. _oneof: \u003chttps://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields\u003e\n\nClasses\n-------\n\n### PropertiesEntry\n\n PropertiesEntry(mapping=None, *, ignore_unknown_fields=False, **kwargs)\n\nThe abstract base class for a message."]]
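As a usage sketch beyond this class's own fields, a HadoopJob is typically wrapped in a Job and submitted through JobControllerClient; the project, region, cluster, and bucket names below are placeholders:

```python
from google.cloud import dataproc_v1

region = "us-central1"  # placeholder region
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

# Wrap the HadoopJob in a Job targeting a placeholder cluster.
job = dataproc_v1.Job(
    placement=dataproc_v1.JobPlacement(cluster_name="example-cluster"),
    hadoop_job=dataproc_v1.HadoopJob(
        main_jar_file_uri="file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar",
        args=["wordcount", "gs://example-bucket/input/", "gs://example-bucket/out/"],
    ),
)

operation = client.submit_job_as_operation(
    request={"project_id": "example-project", "region": region, "job": job}
)
finished_job = operation.result()
print(f"Job finished with state {finished_job.status.state.name}.")
```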