[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[[["\u003cp\u003eThe \u003ccode\u003eEnvironment\u003c/code\u003e class in the Dataflow v1beta3 API describes the runtime environment for a Dataflow job, including settings for cluster management, datasets, and debugging.\u003c/p\u003e\n"],["\u003cp\u003eThis class allows for configuring job properties such as the API service to use, the dataset for workflow-related tables, and resource scheduling goals.\u003c/p\u003e\n"],["\u003cp\u003eIt supports various experimental settings, including enabling service and SDK experiments, managing shuffle modes, and specifying temporary storage prefixes.\u003c/p\u003e\n"],["\u003cp\u003eThe \u003ccode\u003eEnvironment\u003c/code\u003e class enables the definition of worker pool configurations, including regions and zones for worker processing.\u003c/p\u003e\n"],["\u003cp\u003eThe class is available in various versions, with the most recent version being 2.0.0-beta07.\u003c/p\u003e\n"]]],[],null,["# Dataflow v1beta3 API - Class Environment (2.0.0-beta07)\n\nVersion latestkeyboard_arrow_down\n\n- [2.0.0-beta07 (latest)](/dotnet/docs/reference/Google.Cloud.Dataflow.V1Beta3/latest/Google.Cloud.Dataflow.V1Beta3.Environment)\n- [2.0.0-beta06](/dotnet/docs/reference/Google.Cloud.Dataflow.V1Beta3/2.0.0-beta06/Google.Cloud.Dataflow.V1Beta3.Environment)\n- [1.0.0-beta03](/dotnet/docs/reference/Google.Cloud.Dataflow.V1Beta3/1.0.0-beta03/Google.Cloud.Dataflow.V1Beta3.Environment) \n\n public sealed class Environment : IMessage\u003cEnvironment\u003e, IEquatable\u003cEnvironment\u003e, IDeepCloneable\u003cEnvironment\u003e, IBufferMessage, IMessage\n\nReference documentation and code samples for the Dataflow v1beta3 API class Environment.\n\nDescribes the environment in which a Dataflow Job runs. 
Inheritance
-----------

[object](https://learn.microsoft.com/dotnet/api/system.object) > Environment

Implements
----------

[IMessage](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IMessage-1.html)<[Environment](/dotnet/docs/reference/Google.Cloud.Dataflow.V1Beta3/latest/Google.Cloud.Dataflow.V1Beta3.Environment)>, [IEquatable](https://learn.microsoft.com/dotnet/api/system.iequatable-1)<[Environment](/dotnet/docs/reference/Google.Cloud.Dataflow.V1Beta3/latest/Google.Cloud.Dataflow.V1Beta3.Environment)>, [IDeepCloneable](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IDeepCloneable-1.html)<[Environment](/dotnet/docs/reference/Google.Cloud.Dataflow.V1Beta3/latest/Google.Cloud.Dataflow.V1Beta3.Environment)>, [IBufferMessage](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IBufferMessage.html), [IMessage](https://cloud.google.com/dotnet/docs/reference/Google.Protobuf/latest/Google.Protobuf.IMessage.html)

Inherited Members
-----------------

- [object.GetHashCode()](https://learn.microsoft.com/dotnet/api/system.object.gethashcode)
- [object.GetType()](https://learn.microsoft.com/dotnet/api/system.object.gettype)
- [object.ToString()](https://learn.microsoft.com/dotnet/api/system.object.tostring)

Namespace
---------

[Google.Cloud.Dataflow.V1Beta3](/dotnet/docs/reference/Google.Cloud.Dataflow.V1Beta3/latest/Google.Cloud.Dataflow.V1Beta3)

Assembly
--------

Google.Cloud.Dataflow.V1Beta3.dll

Constructors
------------

### Environment()

    public Environment()

### Environment(Environment)

    public Environment(Environment other)

Properties
----------

### ClusterManagerApiService

    public string ClusterManagerApiService { get; set; }

The type of cluster manager API to use. If unknown or unspecified, the service will attempt to choose a reasonable default. This should be in the form of the API service name, e.g. "compute.googleapis.com".

### Dataset

    public string Dataset { get; set; }

The dataset for the current project where various workflow-related tables are stored.

The supported resource type is:

Google BigQuery: bigquery.googleapis.com/{dataset}

### DebugOptions

    public DebugOptions DebugOptions { get; set; }

Any debugging options to be supplied to the job.

### Experiments

    public RepeatedField<string> Experiments { get; }

The list of experiments to enable. This field should be used for SDK-related experiments and not for service-related experiments. The proper field for service-related experiments is service_options.

### FlexResourceSchedulingGoal

    public FlexResourceSchedulingGoal FlexResourceSchedulingGoal { get; set; }

Which Flexible Resource Scheduling mode to run in.

### InternalExperiments

    public Any InternalExperiments { get; set; }

Experimental settings.

### SdkPipelineOptions

    public Struct SdkPipelineOptions { get; set; }

The Cloud Dataflow SDK pipeline options specified by the user. These options are passed through the service and are used to recreate the SDK pipeline options on the worker in a language-agnostic and platform-independent way.

### ServiceAccountEmail

    public string ServiceAccountEmail { get; set; }

Identity to run virtual machines as. Defaults to the default account.
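
Because `SdkPipelineOptions` is a protobuf `Struct`, the user's pipeline options travel as an untyped key/value document rather than as dedicated fields. The sketch below assumes hypothetical option names; the real contents are produced by the SDK that built the pipeline.

    // SdkPipelineOptions is a Google.Protobuf.WellKnownTypes.Struct, so the
    // options are carried as arbitrary key/value pairs. The keys and values
    // here are illustrative only, not an official schema.
    using Google.Protobuf.WellKnownTypes;
    using DataflowEnvironment = Google.Cloud.Dataflow.V1Beta3.Environment;

    var pipelineOptions = new Struct
    {
        Fields =
        {
            ["jobName"] = Value.ForString("example-wordcount"),
            ["numWorkers"] = Value.ForNumber(3),
            ["streaming"] = Value.ForBool(false),
        }
    };

    var environment = new DataflowEnvironment
    {
        SdkPipelineOptions = pipelineOptions,
        // SDK-related experiments belong here; service-related ones go in ServiceOptions.
        Experiments = { "my_sdk_experiment" },
    };
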
### ServiceKmsKeyName

    public string ServiceKmsKeyName { get; set; }

If set, contains the Cloud KMS key identifier used to encrypt data at rest, also known as a Customer-Managed Encryption Key (CMEK).

Format: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

### ServiceOptions

    public RepeatedField<string> ServiceOptions { get; }

The list of service options to enable. This field should be used for service-related experiments only. These experiments, when graduating to GA, should be replaced by dedicated fields or become the default (i.e. always on).

### ShuffleMode

    public ShuffleMode ShuffleMode { get; set; }

Output only. The shuffle mode used for the job.

### TempStoragePrefix

    public string TempStoragePrefix { get; set; }

The prefix of the resources the system should use for temporary storage. The system will append the suffix "/temp-{JOBNAME}" to this resource prefix, where {JOBNAME} is the value of the job_name field. The resulting bucket and object prefix is used as the prefix of the resources used to store temporary data needed during the job execution. NOTE: This will override the value in taskrunner_settings.

The supported resource type is:

Google Cloud Storage:

- storage.googleapis.com/{bucket}/{object}
- bucket.storage.googleapis.com/{object}

### UserAgent

    public Struct UserAgent { get; set; }

A description of the process that generated the request.

### Version

    public Struct Version { get; set; }

A structure describing which components of the service, and which versions of those components, are required in order to run the job.

### WorkerPools

    public RepeatedField<WorkerPool> WorkerPools { get; }

The worker pools. At least one "harness" worker pool must be specified in order for the job to have workers.

### WorkerRegion

    public string WorkerRegion { get; set; }

The Compute Engine region (<https://cloud.google.com/compute/docs/regions-zones/regions-zones>) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, the job defaults to the control plane's region.

### WorkerZone

    public string WorkerZone { get; set; }

The Compute Engine zone (<https://cloud.google.com/compute/docs/regions-zones/regions-zones>) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane's region is chosen based on available capacity.
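
The following sketch mirrors the placement rule documented above for `WorkerRegion` and `WorkerZone`. The validation helper is a hypothetical client-side convenience, not part of the library.

    // Hypothetical client-side check mirroring the documented rule: WorkerRegion
    // and WorkerZone are mutually exclusive, so set at most one. When neither is
    // set, the service chooses (the control plane's region, or a zone in that
    // region based on capacity).
    using System;
    using DataflowEnvironment = Google.Cloud.Dataflow.V1Beta3.Environment;

    static void ValidateWorkerPlacement(DataflowEnvironment environment)
    {
        // Proto3 string fields default to "", never null.
        bool hasRegion = !string.IsNullOrEmpty(environment.WorkerRegion);
        bool hasZone = !string.IsNullOrEmpty(environment.WorkerZone);
        if (hasRegion && hasZone)
        {
            throw new ArgumentException(
                "Set only one of WorkerRegion and WorkerZone; they are mutually exclusive.");
        }
    }

    // A zone pins workers to e.g. "us-west1-a"; a region such as "us-west1"
    // lets the service pick a zone within that region.
    var env = new DataflowEnvironment { WorkerZone = "us-west1-a" };
    ValidateWorkerPlacement(env);
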
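
Similarly, the resource-name formats given above for `ServiceKmsKeyName` and `TempStoragePrefix` can be assembled with small helpers. The helper names and every identifier below are hypothetical placeholders.

    // Hypothetical helpers that assemble resource strings in the formats the
    // ServiceKmsKeyName and TempStoragePrefix sections document. The project,
    // key ring, and bucket names are placeholders.
    using DataflowEnvironment = Google.Cloud.Dataflow.V1Beta3.Environment;

    static string KmsKeyName(string projectId, string location, string keyRing, string key) =>
        $"projects/{projectId}/locations/{location}/keyRings/{keyRing}/cryptoKeys/{key}";

    static string GcsTempPrefix(string bucket, string objectPrefix) =>
        $"storage.googleapis.com/{bucket}/{objectPrefix}";

    var env = new DataflowEnvironment
    {
        ServiceKmsKeyName = KmsKeyName("my-project", "us-west1", "my-key-ring", "my-key"),
        // The service appends "/temp-{JOBNAME}" to this prefix for temporary objects.
        TempStoragePrefix = GcsTempPrefix("my-example-bucket", "dataflow"),
    };
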