Class HadoopJob (5.3.0)

HadoopJob(mapping=None, *, ignore_unknown_fields=False, **kwargs)

A Dataproc job for running Apache Hadoop MapReduce <https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html> jobs on Apache Hadoop YARN <https://hadoop.apache.org/docs/r2.7.1/hadoop-yarn/hadoop-yarn-site/YARN.html>.

This message has oneof fields (mutually exclusive fields). For each oneof, at most one member field can be set at the same time. Setting any member of the oneof automatically clears all other members.

See: https://proto-plus-python.readthedocs.io/en/stable/fields.html#oneofs-mutually-exclusive-fields
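
For example, main_jar_file_uri and main_class below are both members of the driver oneof, so assigning one clears the other. A minimal sketch (the jar URI and class name are placeholders, not values from this page):

    from google.cloud import dataproc_v1

    # Start with the jar-based driver member of the oneof.
    job = dataproc_v1.HadoopJob(
        main_jar_file_uri="gs://my-bucket/wordcount.jar",  # placeholder URI
    )
    print(job.main_jar_file_uri)  # "gs://my-bucket/wordcount.jar"

    # Setting the other oneof member clears main_jar_file_uri.
    job.main_class = "org.apache.hadoop.examples.WordCount"
    print(job.main_jar_file_uri)  # "" (cleared)
    print(job.main_class)         # "org.apache.hadoop.examples.WordCount"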

Attributes

Name | Description
main_jar_file_uri str
The HCFS URI of the jar file containing the main class. Examples: 'gs://foo-bucket/analytics-binaries/extract-useful-metrics-mr.jar' 'hdfs:/tmp/test-samples/custom-wordcount.jar' 'file:///home/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar' This field is a member of the driver oneof.
main_class str
The name of the driver's main class. The jar file containing the class must be in the default CLASSPATH or specified in jar_file_uris. This field is a member of the driver oneof.
args MutableSequence[str]
Optional. The arguments to pass to the driver. Do not include arguments, such as -libjars or -Dfoo=bar, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
jar_file_uris MutableSequence[str]
Optional. Jar file URIs to add to the CLASSPATHs of the Hadoop driver and tasks.
file_uris MutableSequence[str]
Optional. HCFS (Hadoop Compatible Filesystem) URIs of files to be copied to the working directory of Hadoop drivers and distributed tasks. Useful for naively parallel tasks.
archive_uris MutableSequence[str]
Optional. HCFS URIs of archives to be extracted in the working directory of Hadoop drivers and tasks. Supported file types: .jar, .tar, .tar.gz, .tgz, or .zip.
properties MutableMapping[str, str]
Optional. A mapping of property names to values, used to configure Hadoop. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/hadoop/conf/*-site and classes in user code.
logging_config google.cloud.dataproc_v1.types.LoggingConfig
Optional. The runtime log config for job execution.
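
Taken together, these attributes describe the driver, its inputs, and its runtime configuration. The sketch below shows one way a HadoopJob might be populated and submitted through the JobControllerClient; the project ID, region, cluster name, bucket paths, and property values are illustrative placeholders:

    from google.cloud import dataproc_v1

    # Describe the Hadoop driver, its classpath, arguments, and logging.
    hadoop_job = dataproc_v1.HadoopJob(
        main_class="org.apache.hadoop.examples.WordCount",
        jar_file_uris=[
            "file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar"
        ],
        args=["gs://my-bucket/input/", "gs://my-bucket/output/"],
        properties={"mapreduce.job.reduces": "2"},
        logging_config=dataproc_v1.LoggingConfig(
            driver_log_levels={"root": dataproc_v1.LoggingConfig.Level.INFO}
        ),
    )

    # Wrap the HadoopJob in a Job and submit it to an existing cluster.
    job = dataproc_v1.Job(
        placement=dataproc_v1.JobPlacement(cluster_name="my-cluster"),
        hadoop_job=hadoop_job,
    )

    client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": "us-central1-dataproc.googleapis.com:443"}
    )
    operation = client.submit_job_as_operation(
        request={"project_id": "my-project", "region": "us-central1", "job": job}
    )
    result = operation.result()
    print(f"Job finished with state {result.status.state.name}")

Assigning hadoop_job selects the Hadoop job type on the Job message.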

Classes

PropertiesEntry

PropertiesEntry(mapping=None, *, ignore_unknown_fields=False, **kwargs)

The generated map-entry message for the properties field; each entry holds a single key/value pair.

Parameters
Name | Description
kwargs dict

Keys and values corresponding to the fields of the message.

mapping Union[dict, proto.Message]

A dictionary or message to be used to determine the values for this message.

ignore_unknown_fields Optional[bool]

If True, do not raise errors for unknown fields. Only applied if mapping is a mapping type or there are keyword parameters.
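
In practice PropertiesEntry is rarely constructed directly: proto-plus exposes the HadoopJob.properties map field as an ordinary dict-like mapping of str to str. A minimal sketch (the main class and property values are placeholders):

    from google.cloud import dataproc_v1

    job = dataproc_v1.HadoopJob(main_class="org.example.Main")  # placeholder class

    # The properties map behaves like a plain dict of str -> str.
    job.properties["mapreduce.map.memory.mb"] = "2048"
    job.properties["mapreduce.reduce.memory.mb"] = "4096"

    for key, value in job.properties.items():
        print(f"{key}={value}")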