Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
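For illustration, a minimal C# sketch of populating ArchiveUris on a PySparkBatch, assuming the Google.Cloud.Dataproc.V1 client library; the gs:// paths are hypothetical placeholders, not real buckets:

    using Google.Cloud.Dataproc.V1;

    // Archives listed in ArchiveUris are extracted into the working directory
    // of each executor. All gs:// URIs below are placeholder values.
    var pySparkBatch = new PySparkBatch
    {
        MainPythonFileUri = "gs://example-bucket/pyspark/main.py",
        ArchiveUris =
        {
            "gs://example-bucket/deps/environment.tar.gz",
            "gs://example-bucket/deps/resources.zip",
        },
    };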
Optional. The arguments to pass to the driver. Do not include arguments
that can be set as batch properties, such as --conf, since a collision
can occur that causes an incorrect batch submission.
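As a sketch of the same idea for Args, only arguments consumed by the PySpark application itself are listed; options such as --conf are deliberately omitted because, per the description above, they should be set as batch properties. The paths and flag names are hypothetical:

    using Google.Cloud.Dataproc.V1;

    var pySparkBatch = new PySparkBatch
    {
        MainPythonFileUri = "gs://example-bucket/pyspark/main.py",
        // Only arguments read by the PySpark application belong here;
        // do not add --conf or other options that map to batch properties.
        Args = { "--input", "gs://example-bucket/data/input/", "--output", "gs://example-bucket/data/output/" },
    };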
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[[["\u003cp\u003eThis webpage provides reference documentation for the \u003ccode\u003ePySparkBatch\u003c/code\u003e class within the Google Cloud Dataproc v1 API, focusing on version 5.14.0, but also offering links to other versions, from 3.1.0 to 5.17.0, including the latest.\u003c/p\u003e\n"],["\u003cp\u003e\u003ccode\u003ePySparkBatch\u003c/code\u003e is a configuration class used for running Apache PySpark batch workloads and can be implemented using the provided \u003ccode\u003eIMessage\u003c/code\u003e, \u003ccode\u003eIEquatable\u003c/code\u003e, \u003ccode\u003eIDeepCloneable\u003c/code\u003e, and \u003ccode\u003eIBufferMessage\u003c/code\u003e interfaces.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003ePySparkBatch\u003c/code\u003e class includes properties like \u003ccode\u003eArchiveUris\u003c/code\u003e, \u003ccode\u003eArgs\u003c/code\u003e, \u003ccode\u003eFileUris\u003c/code\u003e, \u003ccode\u003eJarFileUris\u003c/code\u003e, \u003ccode\u003eMainPythonFileUri\u003c/code\u003e, and \u003ccode\u003ePythonFileUris\u003c/code\u003e, which are used to specify various file paths and arguments for the PySpark driver and executors.\u003c/p\u003e\n"],["\u003cp\u003eThe main function of \u003ccode\u003ePySparkBatch\u003c/code\u003e is to set the configurations necessary for executing PySpark applications, including specifying the main Python file, associated dependencies, and any required arguments.\u003c/p\u003e\n"],["\u003cp\u003eThe document outlines the constructors and inheritance details, implementing members, and the various properties available for the \u003ccode\u003ePySparkBatch\u003c/code\u003e class.\u003c/p\u003e\n"]]],[],null,[]]