The following sections discuss how Cloud Armor interacts with other Trusted Cloud by S3NS features and products.
Cloud Armor and VPC firewall rules
Cloud Armor security policies and VPC firewall rules have different functions:
- Cloud Armor security policies provide edge security and act on client traffic to Google Front Ends (GFEs).
- VPC firewall rules allow or deny traffic to and from your backends. You must create ingress allow firewall rules, whose targets are the load-balanced backend VMs, and whose sources are IP ranges used by global external Application Load Balancers or classic Application Load Balancers. These rules allow GFEs and the health check systems to communicate with your backend VMs.
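For example, a minimal gcloud sketch of such an ingress allow rule might look like the following. The network name, target tag, and port are placeholders, and the source ranges shown are the GFE and health check ranges documented for Google Cloud; confirm the ranges that apply in your environment.

```
# Allow proxied traffic and health checks from the load balancer's
# front ends to reach the backend VMs (placeholder names, tag, and port).
gcloud compute firewall-rules create allow-lb-and-health-checks \
    --network=lb-network \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:80 \
    --source-ranges=130.211.0.0/22,35.191.0.0/16 \
    --target-tags=lb-backend
```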
For example, consider a scenario in which you want to allow traffic only from CIDR range 100.1.1.0/24 and CIDR range 100.1.2.0/24 to access your global external Application Load Balancer or classic Application Load Balancer. Your goal is to block traffic from directly reaching the load-balanced backend instances. In other words, only external traffic proxied through the global external Application Load Balancer or the classic Application Load Balancer with an associated security policy can reach the instances.
This scenario uses the following deployment configuration:
- Create two instance groups, one in the us-west1 region and another in the europe-west1 region.
- Deploy backend application instances to the VMs in the instance groups.
- Create a global external Application Load Balancer or a classic Application Load Balancer in Premium Tier. Configure a URL map and a single backend service whose backends are the two instance groups that you created in the previous step. The load balancer's forwarding rule must use the 120.1.1.1 external IP address.
- Configure a Cloud Armor security policy that allows traffic from 100.1.1.0/24 and 100.1.2.0/24 and denies all other traffic.
- Associate this policy with the load balancer's backend service. For instructions, see Configure Cloud Armor security policies. External Application Load Balancers with more complex URL maps can reference multiple backend services; you can associate the security policy with one or more of the backend services as needed. A gcloud sketch of these policy steps follows this list.
- Configure ingress allow firewall rules to permit traffic from the global external Application Load Balancer or the classic Application Load Balancer. For more information, see Firewall rules.
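The following gcloud sketch shows the security policy portion of this scenario: it creates a policy that allows the two CIDR ranges, changes the default rule to a deny, and attaches the policy to the load balancer's backend service. The policy and backend service names are placeholders.

```
# Create the security policy (placeholder name).
gcloud compute security-policies create allow-partner-ranges \
    --description="Allow 100.1.1.0/24 and 100.1.2.0/24 only"

# Allow the two trusted CIDR ranges at a high priority.
gcloud compute security-policies rules create 1000 \
    --security-policy=allow-partner-ranges \
    --src-ip-ranges=100.1.1.0/24,100.1.2.0/24 \
    --action=allow

# Change the default rule (priority 2147483647) to deny all other traffic.
gcloud compute security-policies rules update 2147483647 \
    --security-policy=allow-partner-ranges \
    --action=deny-404

# Attach the policy to the backend service (placeholder name).
gcloud compute backend-services update web-backend-service \
    --security-policy=allow-partner-ranges \
    --global
```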
Cloud Armor with Cloud Run, App Engine, or Cloud Run functions
You can use Cloud Armor security policies with a serverless NEG backend that points to a Cloud Run, App Engine, or Cloud Run functions service.
However, when you use Cloud Armor with serverless NEGs, Cloud Run, or Cloud Run functions, all access to the serverless endpoint must be filtered through a Cloud Armor security policy.
Users who have the default URL for a serverless application can bypass the load balancer and go directly to the service URL. This bypasses Cloud Armor security policies. To address this, disable the default URL that Trusted Cloud automatically assigns to Cloud Run services or Cloud Run functions (2nd gen). To protect App Engine applications, you can use ingress controls.
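For example, assuming an existing Cloud Run service named my-service, deactivating the default URL might look like the following sketch; the --no-default-url flag is relatively new and might require a recent gcloud release or the beta command track.

```
# Deactivate the automatically assigned default URL for an existing
# Cloud Run service (placeholder name and region). Afterwards, the service
# is reachable only through addresses you expose, such as the load balancer.
gcloud run services update my-service \
    --region=europe-west1 \
    --no-default-url
```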
If you're using ingress controls to apply your access controls to all incoming traffic, you can use the internal-and-gclb ingress setting when you configure Cloud Run functions or Cloud Run. The internal-and-gclb ingress setting allows only internal traffic and traffic sent to an external IP address exposed by the global external Application Load Balancer or the classic Application Load Balancer. Traffic that is sent to these default URLs from outside of your private network is blocked. This prevents users from circumventing any access controls (such as Cloud Armor security policies) set up through the global external Application Load Balancer or classic Application Load Balancer.
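As a sketch, and assuming services named my-service and my-function, the ingress setting can be applied with gcloud as follows. Note that the value is spelled internal-and-cloud-load-balancing for Cloud Run and internal-and-gclb for Cloud Run functions.

```
# Restrict a Cloud Run service (placeholder name and region) to internal
# traffic and traffic that arrives through the external Application Load Balancer.
gcloud run services update my-service \
    --region=europe-west1 \
    --ingress=internal-and-cloud-load-balancing

# Equivalent setting for a Cloud Run function (placeholder name and region).
# Other deployment flags (source, runtime, trigger) are assumed to match
# your existing deployment and are omitted here.
gcloud functions deploy my-function \
    --gen2 \
    --region=europe-west1 \
    --ingress-settings=internal-and-gclb
```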
For more information about serverless NEGs, see Serverless network endpoint groups overview and Setting up serverless NEGs.
Cloud Armor with Cloud Service Mesh
You can configure internal service security policies for your service mesh to enforce global server-side rate limiting per client. This helps you share your service's available capacity fairly and mitigates the risk of malicious or misbehaving clients overloading your services. You attach a security policy to a Cloud Service Mesh endpoint policy to enforce rate limiting on inbound traffic on the server side. However, you can't configure a Cloud Armor security policy if you are using TCP traffic routing. For more information about using Cloud Armor with Cloud Service Mesh, see Configure rate limiting with Cloud Armor.
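As a rough illustration only, a rate-limiting rule uses the same throttle syntax as other Cloud Armor rules. The policy name, threshold, and enforcement key below are placeholders, and the steps that create the mesh-scoped policy and attach it to a Cloud Service Mesh endpoint policy are not shown; see Configure rate limiting with Cloud Armor for the full procedure.

```
# Add a throttle rule that limits each client IP address to 100 requests
# per 60 seconds (placeholder policy name and values); excess requests
# receive an HTTP 429 response.
gcloud compute security-policies rules create 1000 \
    --security-policy=mesh-rate-limit-policy \
    --src-ip-ranges="*" \
    --action=throttle \
    --rate-limit-threshold-count=100 \
    --rate-limit-threshold-interval-sec=60 \
    --conform-action=allow \
    --exceed-action=deny-429 \
    --enforce-on-key=IP
```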