Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
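The supported archive types above can be checked client-side before submission. The helper below is a hypothetical sketch (not part of the Dataproc API): it only tests a URI's suffix against the documented extension list.

```python
# Hypothetical pre-submission check (not part of the Dataproc API):
# verifies that an HCFS archive URI uses one of the supported extensions.
SUPPORTED_ARCHIVE_SUFFIXES = (".jar", ".tar", ".tar.gz", ".tgz", ".zip")

def is_supported_archive(uri: str) -> bool:
    """Return True if the URI ends with a supported archive extension."""
    return uri.lower().endswith(SUPPORTED_ARCHIVE_SUFFIXES)

print(is_supported_archive("gs://my-bucket/deps.tar.gz"))  # True
print(is_supported_archive("gs://my-bucket/deps.rar"))     # False
```

Note that `.tgz` and `.tar.gz` are both matched; a plain suffix check suffices because every supported type is identified by its extension.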
Optional. The arguments to pass to the Spark driver. Do not include
arguments that can be set as batch properties, such as --conf, since a
collision can occur that causes an incorrect batch submission.
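One way to avoid the collision described above is to reject reserved flags before building the request. This is a hypothetical client-side guard, not part of the Dataproc API; the `RESERVED_FLAGS` list here is an assumption illustrating `--conf` as the documented example.

```python
# Hypothetical guard (not part of the Dataproc API): rejects driver
# arguments that should be set as batch properties instead, since passing
# them directly (e.g. --conf) can collide with batch properties and cause
# an incorrect batch submission.
RESERVED_FLAGS = ("--conf",)  # assumption: extend with other property-settable flags

def validate_driver_args(args: list[str]) -> list[str]:
    """Raise ValueError if any argument duplicates a batch property flag."""
    for arg in args:
        if any(arg == flag or arg.startswith(flag + "=") for flag in RESERVED_FLAGS):
            raise ValueError(
                f"set {arg!r} as a batch property, not a driver argument"
            )
    return args

print(validate_driver_args(["--input", "gs://my-bucket/data.csv"]))
```

Running the validated list through such a guard before populating `Args` keeps Spark configuration in one place (batch properties) rather than split across two mechanisms.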
This page documents the SparkRBatch class in version 5.11.0 of the Google Cloud Dataproc v1 API, including its implementation, inheritance, and properties. The SparkRBatch class is used to configure and run Apache SparkR batch workloads on Dataproc. Its key properties are ArchiveUris, Args, FileUris, and MainRFileUri, which define the resources and configuration for a SparkR batch job. Documentation is also available for other API versions, ranging from 5.17.0 down to 3.1.0.

Last updated 2025-08-07 UTC.