ArchiveUris
Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
Args
Optional. The arguments to pass to the Spark driver. Do not include
arguments that can be set as batch properties, such as --conf, since a
collision can occur that causes an incorrect batch submission. See the
sketch below for how these two properties are typically populated.
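The following is a minimal sketch, in C#, of how the ArchiveUris and Args properties of SparkRBatch might be set when submitting a SparkR batch workload with the Google.Cloud.Dataproc.V1 client library. The project ID, region, bucket, file names, and batch ID are placeholders, not values from this page, and the regional endpoint shown is an assumption about typical batch submission setup.

```csharp
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dataproc.V1;
using Google.LongRunning;

public class SubmitSparkRBatchSample
{
    public static Batch SubmitSparkRBatch()
    {
        // Configure the SparkR workload. ArchiveUris entries are extracted
        // into the working directory of each executor; Args are passed to
        // the Spark driver. All URIs and values below are placeholders.
        SparkRBatch sparkRBatch = new SparkRBatch
        {
            MainRFileUri = "gs://example-bucket/jobs/analysis.R",
            ArchiveUris = { "gs://example-bucket/deps/r-packages.zip" },
            // Application arguments only: set --conf values as batch
            // properties instead of passing them here, to avoid a
            // collision during batch submission.
            Args = { "--input=gs://example-bucket/data/", "--iterations=10" }
        };

        // Batch workloads are created against a regional Dataproc endpoint
        // (assumed region: us-central1).
        BatchControllerClient client = new BatchControllerClientBuilder
        {
            Endpoint = "us-central1-dataproc.googleapis.com:443"
        }.Build();

        Operation<Batch, BatchOperationMetadata> operation = client.CreateBatch(
            new LocationName("example-project", "us-central1"),
            new Batch { SparkRBatch = sparkRBatch },
            "example-sparkr-batch");

        // Block until the batch resource has been created.
        return operation.PollUntilCompleted().Result;
    }
}
```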
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[[["\u003cp\u003eThis document provides reference documentation for the \u003ccode\u003eSparkRBatch\u003c/code\u003e class within the Google Cloud Dataproc v1 API, with the latest version being 5.17.0.\u003c/p\u003e\n"],["\u003cp\u003e\u003ccode\u003eSparkRBatch\u003c/code\u003e is a configuration used to run Apache SparkR batch workloads, implementing interfaces such as \u003ccode\u003eIMessage\u003c/code\u003e, \u003ccode\u003eIEquatable\u003c/code\u003e, \u003ccode\u003eIDeepCloneable\u003c/code\u003e, and \u003ccode\u003eIBufferMessage\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eSparkRBatch\u003c/code\u003e class includes properties for setting up archive URIs, file URIs, and arguments for the Spark driver, as well as specifying the main R file.\u003c/p\u003e\n"],["\u003cp\u003eThe namespace for this class is \u003ccode\u003eGoogle.Cloud.Dataproc.V1\u003c/code\u003e, and the assembly is \u003ccode\u003eGoogle.Cloud.Dataproc.V1.dll\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eThere are many versions of this documentation available, ranging from 3.1.0 to the latest version 5.17.0, all accessible through the links provided.\u003c/p\u003e\n"]]],[],null,[]]