[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[[["\u003cp\u003eThis document provides reference documentation for the \u003ccode\u003eSparkRJob\u003c/code\u003e class within the Google Cloud Dataproc v1 API, offering details on various versions, with version 5.17.0 being the latest.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eSparkRJob\u003c/code\u003e class is used for running Apache SparkR applications on YARN, and implements interfaces like \u003ccode\u003eIMessage\u003c/code\u003e, \u003ccode\u003eIEquatable\u003c/code\u003e, \u003ccode\u003eIDeepCloneable\u003c/code\u003e, and \u003ccode\u003eIBufferMessage\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eSparkRJob\u003c/code\u003e class has properties such as \u003ccode\u003eArchiveUris\u003c/code\u003e, \u003ccode\u003eArgs\u003c/code\u003e, \u003ccode\u003eFileUris\u003c/code\u003e, \u003ccode\u003eLoggingConfig\u003c/code\u003e, \u003ccode\u003eMainRFileUri\u003c/code\u003e, and \u003ccode\u003eProperties\u003c/code\u003e, allowing for configuration of SparkR jobs, including specifying driver files, arguments, and dependencies.\u003c/p\u003e\n"],["\u003cp\u003eThe documentation outlines the inheritance hierarchy of \u003ccode\u003eSparkRJob\u003c/code\u003e, showing that it inherits from \u003ccode\u003eobject\u003c/code\u003e and lists the inherited members such as \u003ccode\u003eGetHashCode\u003c/code\u003e, \u003ccode\u003eGetType\u003c/code\u003e, and \u003ccode\u003eToString\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eThere is a list of versioned pages, from 5.17.0 all the way back to 3.1.0, which are all linked within the content.\u003c/p\u003e\n"]]],[],null,["# Google Cloud Dataproc v1 API - Class SparkRJob (5.20.0)\n\nVersion latestkeyboard_arrow_down\n\n- [5.20.0 (latest)](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/latest/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.19.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.19.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.18.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.18.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.17.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.17.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.16.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.16.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.15.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.15.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.14.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.14.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.13.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.13.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.12.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.12.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.11.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.11.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.10.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.10.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.9.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.9.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.8.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.8.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- 
[5.7.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.7.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.6.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.6.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.5.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.5.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.4.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.4.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.3.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.3.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.2.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.2.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.1.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.1.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [5.0.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/5.0.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [4.0.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/4.0.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [3.4.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/3.4.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [3.3.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/3.3.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [3.2.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/3.2.0/Google.Cloud.Dataproc.V1.SparkRJob)\n- [3.1.0](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/3.1.0/Google.Cloud.Dataproc.V1.SparkRJob) \n\n public sealed class SparkRJob : IMessage\u003cSparkRJob\u003e, IEquatable\u003cSparkRJob\u003e, IDeepCloneable\u003cSparkRJob\u003e, IBufferMessage, IMessage\n\nReference documentation and code samples for the Google Cloud Dataproc v1 API class SparkRJob.\n\nA Dataproc job for running\n[Apache SparkR](https://spark.apache.org/docs/latest/sparkr.html)\napplications on YARN. \n\nInheritance\n-----------\n\n[object](https://learn.microsoft.com/dotnet/api/system.object) \\\u003e SparkRJob \n\nImplements\n----------\n\n[IMessage](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IMessage-1.html)[SparkRJob](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/latest/Google.Cloud.Dataproc.V1.SparkRJob), [IEquatable](https://learn.microsoft.com/dotnet/api/system.iequatable-1)[SparkRJob](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/latest/Google.Cloud.Dataproc.V1.SparkRJob), [IDeepCloneable](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IDeepCloneable-1.html)[SparkRJob](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/latest/Google.Cloud.Dataproc.V1.SparkRJob), [IBufferMessage](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IBufferMessage.html), [IMessage](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IMessage.html) \n\nInherited Members\n-----------------\n\n[object.GetHashCode()](https://learn.microsoft.com/dotnet/api/system.object.gethashcode) \n[object.GetType()](https://learn.microsoft.com/dotnet/api/system.object.gettype) \n[object.ToString()](https://learn.microsoft.com/dotnet/api/system.object.tostring)\n\nNamespace\n---------\n\n[Google.Cloud.Dataproc.V1](/dotnet/docs/reference/Google.Cloud.Dataproc.V1/latest/Google.Cloud.Dataproc.V1)\n\nAssembly\n--------\n\nGoogle.Cloud.Dataproc.V1.dll\n\nConstructors\n------------\n\n### SparkRJob()\n\n public SparkRJob()\n\n### SparkRJob(SparkRJob)\n\n public SparkRJob(SparkRJob other)\n\nProperties\n----------\n\n### ArchiveUris\n\n public RepeatedField\u003cstring\u003e ArchiveUris { get; }\n\nOptional. HCFS URIs of archives to be extracted into the working directory\nof each executor. 
Supported file types:\n.jar, .tar, .tar.gz, .tgz, and .zip.\n\n### Args\n\n public RepeatedField\u003cstring\u003e Args { get; }\n\nOptional. The arguments to pass to the driver. Do not include arguments,\nsuch as `--conf`, that can be set as job properties, since a collision may\noccur that causes an incorrect job submission.\n\n### FileUris\n\n public RepeatedField\u003cstring\u003e FileUris { get; }\n\nOptional. HCFS URIs of files to be placed in the working directory of\neach executor. Useful for naively parallel tasks.\n\n### LoggingConfig\n\n public LoggingConfig LoggingConfig { get; set; }\n\nOptional. The runtime log config for job execution.\n\n### MainRFileUri\n\n public string MainRFileUri { get; set; }\n\nRequired. The HCFS URI of the main R file to use as the driver.\nMust be a .R file.\n\n### Properties\n\n public MapField\u003cstring, string\u003e Properties { get; }\n\nOptional. A mapping of property names to values, used to configure SparkR.\nProperties that conflict with values set by the Dataproc API might be\noverwritten. Can include properties set in\n/etc/spark/conf/spark-defaults.conf and classes in user code."]]
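
Because SparkRJob is a generated protobuf message, its repeated and map-valued properties can be populated with ordinary C# collection initializers. The following is a minimal configuration sketch only; the bucket paths, Spark property values, and log level are placeholder assumptions, not values taken from this reference:

    using Google.Cloud.Dataproc.V1;

    // Minimal configuration sketch. All gs:// URIs and property values
    // below are hypothetical placeholders.
    var sparkRJob = new SparkRJob
    {
        // Required: the HCFS URI of the driver script; must be a .R file.
        MainRFileUri = "gs://my-bucket/jobs/analysis.R",

        // Optional: driver arguments. Flags such as --conf belong in
        // Properties instead, to avoid a collision at submission time.
        Args = { "--input", "gs://my-bucket/data/", "--iterations", "10" },

        // Optional: archives extracted into each executor's working
        // directory (.jar, .tar, .tar.gz, .tgz, or .zip).
        ArchiveUris = { "gs://my-bucket/deps/r-packages.tar.gz" },

        // Optional: plain files placed in each executor's working directory.
        FileUris = { "gs://my-bucket/config/settings.csv" },

        // Optional: SparkR configuration; values that conflict with settings
        // made by the Dataproc API might be overwritten.
        Properties =
        {
            ["spark.executor.memory"] = "4g",
            ["spark.dynamicAllocation.enabled"] = "true",
        },

        // Optional: runtime log configuration for job execution.
        LoggingConfig = new LoggingConfig
        {
            DriverLogLevels = { ["root"] = LoggingConfig.Types.Level.Info },
        },
    };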
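To run the configured message, it is typically wrapped in a Job and sent through the JobControllerClient from the same namespace. The sketch below is an assumption-laden illustration: the project ID, region, cluster name, and regional endpoint are placeholders, and submission details may differ in your environment:

    using System;
    using Google.Cloud.Dataproc.V1;

    // Illustrative submission sketch; project, region, cluster, and
    // endpoint are placeholders.
    var client = new JobControllerClientBuilder
    {
        // Dataproc job submission generally targets a regional endpoint.
        Endpoint = "us-central1-dataproc.googleapis.com:443",
    }.Build();

    var job = new Job
    {
        Placement = new JobPlacement { ClusterName = "my-cluster" },
        SparkRJob = sparkRJob,  // the message configured in the previous sketch
    };

    Job submitted = client.SubmitJob("my-project-id", "us-central1", job);
    Console.WriteLine($"Submitted job {submitted.Reference.JobId}");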