Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
Optional. The arguments to pass to the driver. Do not include arguments
that can be set as batch properties, such as --conf, because a collision
can occur and cause an incorrect batch submission.
Last updated 2025-08-07 UTC.

The latest version of the Google Cloud Dataproc v1 API for PySparkBatch is 5.17.0, with previous versions available down to 3.1.0.

The PySparkBatch class is a configuration for running Apache PySpark batch workloads. It inherits from the object class and implements several interfaces, including IMessage, IEquatable, IDeepCloneable, and IBufferMessage.

PySparkBatch has properties to manage files and dependencies for PySpark workloads: ArchiveUris, Args, FileUris, JarFileUris, MainPythonFileUri, and PythonFileUris. MainPythonFileUri is required and specifies the HCFS URI of the main Python file to use as the Spark driver. ArchiveUris, FileUris, JarFileUris, Args, and PythonFileUris are optional properties that take URIs to archives, files, jars, arguments, and Python files, respectively.