public sealed class SmoothGradConfig : IMessage<SmoothGradConfig>, IEquatable<SmoothGradConfig>, IDeepCloneable<SmoothGradConfig>, IBufferMessage, IMessage
Reference documentation and code samples for the Vertex AI v1beta1 API class SmoothGradConfig.
Config for SmoothGrad approximation of gradients.
When enabled, the gradients are approximated by averaging the gradients from
noisy samples in the vicinity of the inputs. Adding noise can help improve
the computed gradients. Refer to this paper for more details:
https://arxiv.org/pdf/1706.03825.pdf
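To make "averaging the gradients from noisy samples" concrete, here is a minimal sketch of the SmoothGrad idea in plain C#. Everything in this block is an illustrative assumption (a toy function f(x) = x², an analytic gradient, invented names), not Vertex AI library code:

```csharp
using System;

// Illustrative sketch of the SmoothGrad idea: approximate the gradient of f
// at x by averaging the gradients taken at noisy copies of x. Here
// f(x) = x * x, so the true gradient is 2x.
public static class SmoothGradSketch
{
    // Analytic gradient of the toy function f(x) = x * x.
    public static double Gradient(double x) => 2.0 * x;

    public static double SmoothGrad(double x, double noiseSigma, int noisySampleCount, Random rng)
    {
        double sum = 0.0;
        for (int i = 0; i < noisySampleCount; i++)
        {
            // Box-Muller transform: Gaussian noise with standard deviation noiseSigma.
            double u1 = 1.0 - rng.NextDouble();
            double u2 = rng.NextDouble();
            double noise = noiseSigma * Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Sin(2.0 * Math.PI * u2);
            sum += Gradient(x + noise);
        }
        return sum / noisySampleCount;
    }

    public static void Main()
    {
        var rng = new Random(42);
        // With a sigma of ~10% of the input scale and 50 samples, the
        // averaged noisy gradient stays close to the true value 2x.
        Console.WriteLine(SmoothGrad(1.0, 0.1, 50, rng)); // close to 2.0
    }
}
```

The `noiseSigma` and `noisySampleCount` parameters of the sketch play the same roles as the `NoiseSigma` and `NoisySampleCount` properties documented below.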
public FeatureNoiseSigma FeatureNoiseSigma { get; set; }
This is similar to
[noise_sigma][google.cloud.aiplatform.v1beta1.SmoothGradConfig.noise_sigma],
but provides additional flexibility. A separate noise sigma can be
provided for each feature, which is useful if their distributions are
different. No noise is added to features that are not set. If this field
is unset,
[noise_sigma][google.cloud.aiplatform.v1beta1.SmoothGradConfig.noise_sigma]
will be used for all features.
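A per-feature configuration might look like the following sketch. It assumes the Google.Cloud.AIPlatform.V1Beta1 NuGet package and the generated `FeatureNoiseSigma.Types.NoiseSigmaForFeature` nested type; the feature names and sigma values are invented for illustration:

```csharp
using Google.Cloud.AIPlatform.V1Beta1;

// Sketch: a separate noise sigma per feature, useful when the features'
// distributions differ. Feature names and sigmas are illustrative only.
var config = new SmoothGradConfig
{
    FeatureNoiseSigma = new FeatureNoiseSigma
    {
        NoiseSigma =
        {
            // Roughly 10%-20% of each feature's standard deviation.
            new FeatureNoiseSigma.Types.NoiseSigmaForFeature { Name = "age", Sigma = 1.5f },
            new FeatureNoiseSigma.Types.NoiseSigmaForFeature { Name = "income", Sigma = 250f },
        }
    },
    NoisySampleCount = 3,
};
```

Features not listed in `FeatureNoiseSigma` receive no noise, per the description above.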
public SmoothGradConfig.GradientNoiseSigmaOneofCase GradientNoiseSigmaCase { get; }

public bool HasNoiseSigma { get; }

Gets whether the "noise_sigma" field is set.

public float NoiseSigma { get; set; }

This is a single float value and will be used to add noise to all the
features. Use this field when all features are normalized to have the
same distribution: scaled to the range [0, 1], [-1, 1], or z-scored so that
features have zero mean and unit variance. Learn more about
normalization: https://developers.google.com/machine-learning/data-prep/transform/normalization
For best results, the recommended value is about 10% to 20% of the standard
deviation of the input feature. Refer to section 3.2 of the SmoothGrad
paper: https://arxiv.org/pdf/1706.03825.pdf. Defaults to 0.1.

If the distribution differs per feature, set
[feature_noise_sigma][google.cloud.aiplatform.v1beta1.SmoothGradConfig.feature_noise_sigma]
instead, with a sigma for each feature.
public int NoisySampleCount { get; set; }

The number of gradient samples used for the approximation. The higher this
number, the more accurate the gradient estimate, but the runtime cost
increases by this factor as well. Valid range is [1, 50]. Defaults to 3.
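Putting the pieces together, a minimal single-sigma configuration might look like this sketch (it assumes the Google.Cloud.AIPlatform.V1Beta1 NuGet package; where the config is attached within the wider Explanation API is not covered on this page):

```csharp
using Google.Cloud.AIPlatform.V1Beta1;

// Sketch: one sigma for all (identically normalized) features.
var smoothGrad = new SmoothGradConfig
{
    NoiseSigma = 0.1f,    // ~10%-20% of the features' standard deviation
    NoisySampleCount = 3, // valid range [1, 50]; higher = more accurate, slower
};

// NoiseSigma and FeatureNoiseSigma are alternatives: setting NoiseSigma
// makes HasNoiseSigma true, and GradientNoiseSigmaCase reports which of
// the two was chosen.
```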
Inheritance: object > SmoothGradConfig

Namespace: Google.Cloud.AIPlatform.V1Beta1
Assembly: Google.Cloud.AIPlatform.V1Beta1.dll

Constructors:

public SmoothGradConfig()

public SmoothGradConfig(SmoothGradConfig other)

Version 1.0.0-beta47 (latest). Last updated 2025-08-28 UTC.