ArchiveUris
Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
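For illustration, here is a minimal C# sketch of populating ArchiveUris, assuming the standard collection-initializer pattern of this protobuf-generated client library; the bucket and file names are placeholders, not values from this page.

```csharp
using Google.Cloud.Dataproc.V1;

// Archives listed in ArchiveUris are fetched from an HCFS location (for example,
// Cloud Storage) and extracted into the working directory of each executor.
var sparkRBatch = new SparkRBatch
{
    // Required main R file for the workload (hypothetical path).
    MainRFileUri = "gs://example-bucket/analysis.R",
    ArchiveUris =
    {
        "gs://example-bucket/r-deps.tar.gz",       // extracted on each executor
        "gs://example-bucket/lookup-tables.zip"
    }
};
```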
Args
Optional. The arguments to pass to the Spark driver. Do not include
arguments that can be set as batch properties, such as --conf, since a
collision can occur that causes an incorrect batch submission.
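The sketch below illustrates the intended split: application-level arguments go in Args, while Spark settings that would otherwise be passed as --conf flags are set as batch properties. It assumes the Batch message's RuntimeConfig.Properties map from the same API; the URIs, argument names, and property values are placeholders.

```csharp
using Google.Cloud.Dataproc.V1;

// Pass only workload arguments through Args; Spark settings that would otherwise
// be supplied as --conf flags belong in the batch runtime properties.
var batch = new Batch
{
    SparkRBatch = new SparkRBatch
    {
        MainRFileUri = "gs://example-bucket/analysis.R",       // hypothetical path
        Args = { "--input=gs://example-bucket/data.csv" }      // workload arguments only
    },
    RuntimeConfig = new RuntimeConfig
    {
        // Avoid Args = { "--conf", "spark.executor.memory=4g" }; set it here instead.
        Properties = { ["spark.executor.memory"] = "4g" }
    }
};
```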
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[[["\u003cp\u003eThis webpage provides reference documentation for the \u003ccode\u003eSparkRBatch\u003c/code\u003e class within the Google Cloud Dataproc v1 API, focusing on version 5.8.0.\u003c/p\u003e\n"],["\u003cp\u003e\u003ccode\u003eSparkRBatch\u003c/code\u003e is used to configure and run Apache SparkR batch workloads, allowing for the specification of arguments, file URIs, and archive URIs.\u003c/p\u003e\n"],["\u003cp\u003eThe documentation includes information on the class's inheritance, implemented interfaces, constructors, properties such as \u003ccode\u003eArchiveUris\u003c/code\u003e, \u003ccode\u003eArgs\u003c/code\u003e, \u003ccode\u003eFileUris\u003c/code\u003e, and \u003ccode\u003eMainRFileUri\u003c/code\u003e, as well as listing all the versions supported ranging from 3.1.0 up to the latest 5.17.0.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eMainRFileUri\u003c/code\u003e property, specifying the HCFS URI of the main R file, is a required parameter for configuring \u003ccode\u003eSparkRBatch\u003c/code\u003e.\u003c/p\u003e\n"]]],[],null,[]]