Reference documentation and code samples for the Google Cloud Dataproc v1 API enum GkeNodePoolTarget.Types.Role.
Role specifies the tasks that will run on the node pool. Roles can be
specific to workloads. Exactly one GkeNodePoolTarget within the
virtual cluster must have the DEFAULT role, which is used to run all
workloads that are not associated with a node pool.
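The following is a minimal C# sketch of assigning the DEFAULT role, assuming the
Google.Cloud.Dataproc.V1 package; the project, cluster, and node pool resource
names are placeholders, and the surrounding GkeClusterConfig wiring is an
assumption rather than something defined on this page.

using Google.Cloud.Dataproc.V1;

// Placeholder resource names, for illustration only.
var defaultPool = new GkeNodePoolTarget
{
    NodePool = "projects/my-project/locations/us-central1/clusters/my-gke-cluster/nodePools/default-pool",
    // Exactly one node pool target in the virtual cluster carries the DEFAULT role.
    Roles = { GkeNodePoolTarget.Types.Role.Default },
};

// Attach the target to the GKE cluster configuration of the virtual cluster.
var gkeClusterConfig = new GkeClusterConfig
{
    GkeClusterTarget = "projects/my-project/locations/us-central1/clusters/my-gke-cluster",
    NodePoolTarget = { defaultPool },
};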
Controller
Run work associated with the Dataproc control plane (for example,
controllers and webhooks). Very low resource requirements.
Default
At least one node pool must have the DEFAULT role.
Work assigned to a role that is not associated with a node pool
is assigned to the node pool with the DEFAULT role. For example,
work assigned to the CONTROLLER role will be assigned to the node pool
with the DEFAULT role if no node pool has the CONTROLLER role.
SparkDriver
Run work associated with a Spark driver of a job.
SparkExecutor
Run work associated with a Spark executor of a job.
Unspecified
Role is unspecified.
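As a further sketch (again with placeholder node pool names and an assumed
surrounding configuration), dedicated pools can be declared for the Spark roles
alongside the required DEFAULT pool; because no target below carries the
CONTROLLER role, that work falls back to the DEFAULT pool as described above.

using Google.Cloud.Dataproc.V1;
using Role = Google.Cloud.Dataproc.V1.GkeNodePoolTarget.Types.Role;

// Placeholder prefix for the node pool resource names.
const string poolPrefix =
    "projects/my-project/locations/us-central1/clusters/my-gke-cluster/nodePools/";

var nodePoolTargets = new[]
{
    // Required DEFAULT pool: runs any work without a dedicated pool,
    // including CONTROLLER work in this configuration.
    new GkeNodePoolTarget { NodePool = poolPrefix + "default-pool", Roles = { Role.Default } },
    // Spark drivers run on their own pool.
    new GkeNodePoolTarget { NodePool = poolPrefix + "driver-pool", Roles = { Role.SparkDriver } },
    // Spark executors run on their own pool.
    new GkeNodePoolTarget { NodePool = poolPrefix + "executor-pool", Roles = { Role.SparkExecutor } },
};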
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[[["\u003cp\u003eThe documentation covers various versions of the Google Cloud Dataproc v1 API, ranging from version 3.1.0 to the latest version 5.17.0, all related to \u003ccode\u003eGkeNodePoolTarget.Types.Role\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003e\u003ccode\u003eGkeNodePoolTarget.Types.Role\u003c/code\u003e defines the specific roles that can be assigned to a node pool within the Dataproc virtual cluster, and exactly one node pool in the virtual cluster must be set to the \u003ccode\u003eDEFAULT\u003c/code\u003e role.\u003c/p\u003e\n"],["\u003cp\u003eThere are five defined roles within \u003ccode\u003eGkeNodePoolTarget.Types.Role\u003c/code\u003e: \u003ccode\u003eController\u003c/code\u003e, \u003ccode\u003eDefault\u003c/code\u003e, \u003ccode\u003eSparkDriver\u003c/code\u003e, \u003ccode\u003eSparkExecutor\u003c/code\u003e, and \u003ccode\u003eUnspecified\u003c/code\u003e, each serving a unique function in task allocation.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eDEFAULT\u003c/code\u003e role is critical as it handles workloads that are not assigned to any specific node pool role.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eController\u003c/code\u003e role is designed for Dataproc control plane tasks and has minimal resource needs, while the \u003ccode\u003eSparkDriver\u003c/code\u003e and \u003ccode\u003eSparkExecutor\u003c/code\u003e roles are specifically allocated for Spark job processes.\u003c/p\u003e\n"]]],[],null,[]]