Google Professional Cloud Security Engineer Practice Test - Questions Answers, Page 9


A customer is running an analytics workload on Google Cloud Platform (GCP) where Compute Engine instances are accessing data stored on Cloud Storage. Your team wants to make sure that this workload will not be able to access, or be accessed from, the internet.

Which two strategies should your team use to meet these requirements? (Choose two.)

A. Configure Private Google Access on the Compute Engine subnet.
B. Avoid assigning public IP addresses to the Compute Engine cluster.
C. Make sure that the Compute Engine cluster is running on a separate subnet.
D. Turn off IP forwarding on the Compute Engine instances in the cluster.
E. Configure a Cloud NAT gateway.
Suggested answer: A, B
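
Why A and B together meet the requirement can be seen in the instance's network-interface configuration. As a hedged sketch of the Compute Engine REST shape (the subnet name is hypothetical, not from the question):

```python
# Sketch: in a Compute Engine instance body, a network interface with no
# "accessConfigs" entry is created without an external IP (option B).
# Private Google Access on the subnet (option A) then lets the VM reach
# Cloud Storage over Google's internal network.
private_nic = {
    "subnetwork": "regions/us-central1/subnetworks/analytics-subnet",
    # deliberately no "accessConfigs" key -> no external IP is assigned
}

# For contrast, this is the shape that WOULD give the VM an external IP:
public_nic = {
    "subnetwork": "regions/us-central1/subnetworks/analytics-subnet",
    "accessConfigs": [{"type": "ONE_TO_ONE_NAT", "name": "External NAT"}],
}
```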

A customer wants to run a batch processing system on VMs and store the output files in a Cloud Storage bucket. The networking and security teams have decided that no VMs may reach the public internet.

How should this be accomplished?

A. Create a firewall rule to block internet traffic from the VM.
B. Provision a NAT Gateway to access the Cloud Storage API endpoint.
C. Enable Private Google Access on the VPC.
D. Mount a Cloud Storage bucket as a local filesystem on every VM.
Suggested answer: C

Explanation:

https://cloud.google.com/vpc/docs/private-google-access
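
Private Google Access is toggled per subnet. A minimal sketch of the REST call behind `gcloud compute networks subnets update ... --enable-private-ip-google-access` (project, region, and subnet names are hypothetical):

```python
# Sketch: the Compute Engine subnetworks.setPrivateIpGoogleAccess call that
# lets VMs without external IPs reach Google APIs such as Cloud Storage.
PROJECT, REGION, SUBNET = "my-project", "us-central1", "batch-subnet"

url = (
    f"https://compute.googleapis.com/compute/v1/projects/{PROJECT}"
    f"/regions/{REGION}/subnetworks/{SUBNET}/setPrivateIpGoogleAccess"
)
body = {"privateIpGoogleAccess": True}
```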

As adoption of the Cloud Data Loss Prevention (DLP) API grows within the company, you need to optimize usage to reduce cost. DLP target data is stored in Cloud Storage and BigQuery. The location and region are identified as a suffix in the resource name.

Which cost reduction options should you recommend?

A. Set an appropriate rowsLimit value on BigQuery data hosted outside the US, and set an appropriate bytesLimitPerFile value on multiregional Cloud Storage buckets.
B. Set an appropriate rowsLimit value on BigQuery data hosted outside the US, and minimize transformation units on multiregional Cloud Storage buckets.
C. Use rowsLimit and bytesLimitPerFile to sample data, and use CloudStorageRegexFileSet to limit scans.
D. Use FindingLimits and TimespanConfig to sample data and minimize transformation units.
Suggested answer: C

Explanation:

https://cloud.google.com/dlp/docs/inspecting-storage#sampling
https://cloud.google.com/dlp/docs/best-practices-costs#limit_scans_of_files_in_to_only_relevant_files
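
A sketch of the JSON `storage_config` for a DLP inspect job implementing option C: sample rows and bytes, and restrict Cloud Storage scans with a regex file set (bucket, table, and regex values are hypothetical):

```python
# Sampling BigQuery: cap the number of rows DLP inspects per table.
bigquery_config = {
    "bigQueryOptions": {
        "tableReference": {
            "projectId": "my-project",
            "datasetId": "analytics_eu",  # region identified by suffix, per the question
            "tableId": "events",
        },
        "rowsLimit": "10000",  # inspect at most 10k rows
    }
}

# Sampling Cloud Storage: cap bytes per file and scan only matching objects.
gcs_config = {
    "cloudStorageOptions": {
        "fileSet": {
            "regexFileSet": {
                "bucketName": "my-bucket-us",
                "includeRegex": ["exports/.*\\.csv"],  # only relevant files
            }
        },
        "bytesLimitPerFile": "1048576",  # inspect at most 1 MiB per file
    }
}
```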

Your team uses a service account to authenticate data transfers from a given Compute Engine virtual machine instance to a specified Cloud Storage bucket. An engineer accidentally deletes the service account, which breaks application functionality. You want to recover the application as quickly as possible without compromising security.

What should you do?

A. Temporarily disable authentication on the Cloud Storage bucket.
B. Use the undelete command to recover the deleted service account.
C. Create a new service account with the same name as the deleted service account.
D. Update the permissions of another existing service account and supply those credentials to the applications.
Suggested answer: B

Explanation:

https://cloud.google.com/iam/docs/reference/rest/v1/projects.serviceAccounts/undelete
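
A sketch of the undelete call: it addresses the account by its numeric unique ID (recorded in the Admin Activity deletion log entry), not by email, and only works within 30 days of deletion. The ID below is made up:

```python
# Sketch of the IAM REST undelete request for a recently deleted service account.
UNIQUE_ID = "103517146794255310001"  # hypothetical unique ID from the audit log
url = f"https://iam.googleapis.com/v1/projects/-/serviceAccounts/{UNIQUE_ID}:undelete"
# CLI equivalent: gcloud iam service-accounts undelete 103517146794255310001
```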

You are the Security Admin in your company. You want to synchronize all security groups that have an email address from your LDAP directory in Cloud IAM.

What should you do?

A. Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have 'user email address' as the attribute to facilitate a one-way sync.
B. Configure Google Cloud Directory Sync to sync security groups using LDAP search rules that have 'user email address' as the attribute to facilitate a bidirectional sync.
C. Use a management tool to sync the subset based on the email address attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.
D. Use a management tool to sync the subset based on the group object class attribute. Create a group in the Google domain. A group created in a Google domain will automatically have an explicit Google Cloud Identity and Access Management (IAM) role.
Suggested answer: A

Explanation:

Use LDAP search rules that have 'user email address' as the attribute to facilitate a one-way sync. Reference: https://support.google.com/a/answer/6126589?hl=en
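
As a hypothetical sketch of what such a GCDS group search rule looks like (the DN, scope, and object class depend entirely on your directory schema; GCDS itself configures this through Configuration Manager, not this exact dict shape):

```python
# Hypothetical sketch of a one-way (LDAP -> Cloud Identity) group search rule:
# select only group objects that carry an email (mail) attribute.
group_search_rule = {
    "baseDN": "ou=security-groups,dc=example,dc=com",
    "scope": "SUBTREE",
    "filter": "(&(objectClass=group)(mail=*))",  # groups that have an email address
}
```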

You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account.

What should you do?

A. Query Data Access logs.
B. Query Admin Activity logs.
C. Query Access Transparency logs.
D. Query Stackdriver Monitoring Workspace.
Suggested answer: B

Explanation:

Admin activity logs are always created to log entries for API calls or other actions that modify the configuration or metadata of resources. For example, these logs record when users create VM instances or change Identity and Access Management permissions.
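
A sketch of the Cloud Logging filter this audit would use: Admin Activity entries whose caller is the compromised service account (the account email and project are hypothetical):

```python
# Sketch: query Admin Activity audit logs for actions taken by one principal.
SA = "pipeline-sa@my-project.iam.gserviceaccount.com"  # hypothetical account
log_filter = (
    'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity" '
    f'AND protoPayload.authenticationInfo.principalEmail="{SA}"'
)
# CLI equivalent: gcloud logging read '<filter>' --limit=100
```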

You have an application where the frontend is deployed on a managed instance group in subnet A, and the data layer is stored on a MySQL Compute Engine virtual machine (VM) in subnet B on the same VPC. Subnet A and subnet B hold several other Compute Engine VMs. You only want to allow the application frontend to access the data in the application's MySQL instance on port 3306.

What should you do?

A. Configure an ingress firewall rule that allows communication from the source IP range of subnet A to the tag 'data-tag' that is applied to the MySQL Compute Engine VM on port 3306.
B. Configure an ingress firewall rule that allows communication from the frontend's unique service account to the unique service account of the MySQL Compute Engine VM on port 3306.
C. Configure a network tag 'fe-tag' to be applied to all instances in subnet A and a network tag 'data-tag' to be applied to all instances in subnet B. Then configure an egress firewall rule that allows communication from Compute Engine VMs tagged with data-tag to destination Compute Engine VMs tagged with fe-tag.
D. Configure a network tag 'fe-tag' to be applied to all instances in subnet A and a network tag 'data-tag' to be applied to all instances in subnet B. Then configure an ingress firewall rule that allows communication from Compute Engine VMs tagged with fe-tag to destination Compute Engine VMs tagged with data-tag.
Suggested answer: B

Explanation:

https://cloud.google.com/sql/docs/mysql/sql-proxy#using-a-service-account
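
A sketch of the VPC firewall rule body for option B, using identity-based filtering on source and target service accounts scoped to TCP 3306 (service-account emails are hypothetical):

```python
# Sketch: a firewall rule keyed on service accounts rather than tags or IP
# ranges, so only the frontend's identity can reach the MySQL VM on 3306.
firewall_rule = {
    "name": "allow-frontend-to-mysql",
    "direction": "INGRESS",
    "allowed": [{"IPProtocol": "tcp", "ports": ["3306"]}],
    "sourceServiceAccounts": ["frontend-sa@my-project.iam.gserviceaccount.com"],
    "targetServiceAccounts": ["mysql-sa@my-project.iam.gserviceaccount.com"],
}
```

Unlike network tags, service accounts can only be assigned by someone with IAM permission on that account, which is why this is tighter than the tag-based options.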

Your company operates an application instance group that is currently deployed behind a Google Cloud load balancer in us-central-1 and is configured to use the Standard Tier network. The infrastructure team wants to expand to a second Google Cloud region, us-east-2. You need to set up a single external IP address to distribute new requests to the instance groups in both regions.

What should you do?

A. Change the load balancer backend configuration to use network endpoint groups instead of instance groups.
B. Change the load balancer frontend configuration to use the Premium Tier network, and add the new instance group.
C. Create a new load balancer in us-east-2 using the Standard Tier network, and assign a static external IP address.
D. Create a Cloud VPN connection between the two regions, and enable Google Private Access.
Suggested answer: B

Explanation:

https://cloud.google.com/load-balancing/docs/choosing-load-balancer#global-regional
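
The key constraint is that a Standard Tier forwarding rule is regional; a single anycast IP fronting backends in two regions requires a global Premium Tier frontend. A hedged sketch of the forwarding-rule shape (names and address are hypothetical):

```python
# Sketch: a global external forwarding rule must use the Premium Tier, which
# is what lets one external IP distribute requests to both regions (option B).
forwarding_rule = {
    "name": "web-frontend",
    "IPAddress": "203.0.113.10",       # single external IP for both regions
    "networkTier": "PREMIUM",          # Standard Tier rules are regional only
    "loadBalancingScheme": "EXTERNAL",
    "target": "global/targetHttpProxies/web-proxy",
}
```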

You are the security admin of your company. You have 3,000 objects in your Cloud Storage bucket. You do not want to manage access to each object individually. You also do not want the uploader of an object to always have full control of the object. However, you want to use Cloud Audit Logs to manage access to your bucket.

What should you do?

A. Set up an ACL with OWNER permission to a scope of allUsers.
B. Set up an ACL with READER permission to a scope of allUsers.
C. Set up a default bucket ACL and manage access for users using IAM.
D. Set up uniform bucket-level access on the Cloud Storage bucket and manage access for users using IAM.
Suggested answer: D

Explanation:

https://cloud.google.com/storage/docs/uniform-bucket-level-access#enabled
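
A sketch of the Cloud Storage JSON API PATCH body for option D; once enabled, object ACLs stop applying and IAM alone governs access, so uploaders no longer get implicit full control (the bucket name is hypothetical):

```python
# Sketch: enable uniform bucket-level access via a bucket PATCH.
BUCKET = "my-bucket"
url = f"https://storage.googleapis.com/storage/v1/b/{BUCKET}"
patch_body = {"iamConfiguration": {"uniformBucketLevelAccess": {"enabled": True}}}
# CLI equivalent: gsutil ubla set on gs://my-bucket
```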

You are the security admin of your company. Your development team creates multiple GCP projects under the 'implementation' folder for several dev, staging, and production workloads. You want to prevent data exfiltration by malicious insiders or compromised code by setting up a security perimeter. However, you do not want to restrict communication between the projects.

What should you do?

A. Use a Shared VPC to enable communication between all projects, and use firewall rules to prevent data exfiltration.
B. Create access levels in Access Context Manager to prevent data exfiltration, and use a Shared VPC for communication between projects.
C. Use an infrastructure-as-code tool to set up a single service perimeter, and deploy a Cloud Function that monitors the 'implementation' folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the perimeter.
D. Use an infrastructure-as-code tool to set up three separate service perimeters for dev, staging, and prod, and deploy a Cloud Function that monitors the 'implementation' folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the respective perimeter.
Suggested answer: C

Explanation:

https://cloud.google.com/vpc-service-controls/docs/overview#benefits

https://github.com/terraform-google-modules/terraform-google-vpc-service-controls/tree/master/examples/automatic_folder
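
A sketch of the Access Context Manager service-perimeter payload such a Terraform run would maintain. Because all three environments sit inside the same perimeter, traffic between them is unrestricted while egress to the outside is blocked, which is why a single perimeter (option C) beats three (project numbers and the restricted-service list are hypothetical):

```python
# Sketch: one VPC Service Controls perimeter covering every project under the
# 'implementation' folder; in-perimeter communication stays open.
perimeter = {
    "title": "implementation-perimeter",
    "perimeterType": "PERIMETER_TYPE_REGULAR",
    "status": {
        "resources": [
            "projects/111111111111",  # dev
            "projects/222222222222",  # staging
            "projects/333333333333",  # prod
        ],
        "restrictedServices": [
            "storage.googleapis.com",
            "bigquery.googleapis.com",
        ],
    },
}
```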

Total 235 questions