Google Professional Cloud Security Engineer Practice Test - Questions Answers, Page 14

You perform a security assessment on a customer architecture and discover that multiple VMs have public IP addresses. After providing a recommendation to remove the public IP addresses, you are told those VMs need to communicate with external sites as part of the customer's typical operations. What should you recommend to reduce the need for public IP addresses on your customer's VMs?

A. Google Cloud Armor
B. Cloud NAT
C. Cloud Router
D. Cloud VPN
Suggested answer: B

Explanation:

Cloud NAT provides outbound connectivity for VM instances that have no external IP addresses, so the public IPs can be removed while the VMs continue to reach external sites. Cloud Armor, Cloud Router, and Cloud VPN do not provide source NAT for internet-bound traffic. https://cloud.google.com/nat/docs/overview
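
Cloud NAT is configured on a Cloud Router. As a rough sketch, assuming the google-cloud-compute Python client library and placeholder names (my-project, my-vpc, nat-router), the following creates a router with a NAT gateway so that VMs without external IP addresses can still reach external sites:

from google.cloud import compute_v1

project = "my-project"        # placeholder project ID
region = "us-central1"        # placeholder region

router = compute_v1.Router(
    name="nat-router",
    network=f"projects/{project}/global/networks/my-vpc",   # placeholder network
    nats=[
        compute_v1.RouterNat(
            name="nat-gateway",
            # Let Google allocate the external NAT IP addresses automatically.
            nat_ip_allocate_option="AUTO_ONLY",
            # NAT all primary and secondary ranges of all subnets in the region.
            source_subnetwork_ip_ranges_to_nat="ALL_SUBNETWORKS_ALL_IP_RANGES",
        )
    ],
)

operation = compute_v1.RoutersClient().insert(
    project=project, region=region, router_resource=router
)
operation.result()   # wait for the router and NAT gateway to be created

The VMs can then have their external IP addresses removed while outbound connectivity continues through the NAT gateway.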

You are tasked with exporting and auditing security logs for login activity events for Google Cloud console and API calls that modify configurations to Google Cloud resources. Your export must meet the following requirements:

Export related logs for all projects in the Google Cloud organization.

Export logs in near real-time to an external SIEM.

What should you do? (Choose two.)

A. Create a Log Sink at the organization level with a Pub/Sub destination.
B. Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.
C. Enable Data Access audit logs at the organization level to apply to all projects.
D. Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.
E. Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.
Suggested answer: B, D

Explanation:

An aggregated sink at the organization level with the includeChildren parameter set exports matching logs from every project in the organization, and a Pub/Sub destination delivers them to the external SIEM in near real time. Sharing Google Workspace audit logs with Google Cloud makes the console login events available in Cloud Logging: 'Google Workspace Login Audit: Login Audit logs track user sign-ins to your domain. These logs only record the login event. They don't record which system was used to perform the login action.' https://cloud.google.com/logging/docs/audit/gsuite-audit-logging#services
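
As a sketch of option B, assuming the google-cloud-logging Python client library and placeholder organization, project, and topic names, an aggregated sink with include_children enabled exports matching logs from every project in the organization to Pub/Sub in near real time:

from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogSink

ORG_ID = "123456789012"   # placeholder organization ID
DESTINATION = "pubsub.googleapis.com/projects/my-project/topics/siem-export"  # placeholder topic

sink = LogSink(
    name="org-audit-to-siem",
    destination=DESTINATION,
    # Example filter: Cloud Audit Logs, which include Admin Activity entries and
    # the shared Workspace login events; adjust to match what the SIEM needs.
    filter='logName:"cloudaudit.googleapis.com"',
    include_children=True,   # export logs from all child folders and projects
)

ConfigServiceV2Client().create_sink(
    request={"parent": f"organizations/{ORG_ID}", "sink": sink}
)

The writer identity that Logging creates for the sink must then be granted the Pub/Sub Publisher role on the destination topic.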

Your company's Chief Information Security Officer (CISO) creates a requirement that business data must be stored in specific locations due to regulatory requirements that affect the company's global expansion plans. After working on the details to implement this requirement, you determine the following:

The services in scope are included in the Google Cloud Data Residency Terms.

The business data remains within specific locations under the same organization.

The folder structure can contain multiple data residency locations.

You plan to use the Resource Location Restriction organization policy constraint. At which level in the resource hierarchy should you set the constraint?

A. Folder
B. Resource
C. Project
D. Organization
Suggested answer: C

Explanation:

Because the folder structure can contain multiple data residency locations, a single constraint at the folder or organization level cannot express the requirement; applying the gcp.resourceLocations constraint at the project level keeps each project's resources in the locations required for that project's data. https://cloud.google.com/resource-manager/docs/organization-policy/defining-locations
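
As a hedged sketch of setting the constraint at the project level, assuming the google-cloud-org-policy Python client library and placeholder project and location values:

from google.cloud import orgpolicy_v2

PROJECT = "my-regulated-project"   # placeholder project ID

policy = orgpolicy_v2.Policy(
    name=f"projects/{PROJECT}/policies/gcp.resourceLocations",
    spec=orgpolicy_v2.PolicySpec(
        rules=[
            orgpolicy_v2.PolicySpec.PolicyRule(
                values=orgpolicy_v2.PolicySpec.PolicyRule.StringValues(
                    # Predefined value group covering EU locations; replace with
                    # the locations this project's data must stay in.
                    allowed_values=["in:eu-locations"]
                )
            )
        ]
    ),
)

orgpolicy_v2.OrgPolicyClient().create_policy(
    request={"parent": f"projects/{PROJECT}", "policy": policy}
)

Each project would get its own policy listing only the locations its data is allowed to use.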

You need to set up a Cloud Interconnect connection between your company's on-premises data center and VPC host network. You want to make sure that on-premises applications can only access Google APIs over the Cloud Interconnect and not through the public internet. You are required to only use APIs that are supported by VPC Service Controls to mitigate against exfiltration risk to non-supported APIs. How should you configure the network?

A. Enable Private Google Access on the regional subnets and global dynamic routing mode.
B. Set up a Private Service Connect endpoint IP address with the API bundle of 'all-apis', which is advertised as a route over the Cloud Interconnect connection.
C. Use private.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the connection.
D. Use restricted.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the Cloud Interconnect connection.
Suggested answer: D

Explanation:

https://cloud.google.com/vpc/docs/private-service-connect

restricted.googleapis.com resolves to 199.36.153.4/30, a range that is only routable from within Google Cloud and is advertised as routes over the Cloud Interconnect connection. It serves only the APIs that VPC Service Controls supports, which mitigates the exfiltration risk to non-supported APIs. For comparison, the Private Service Connect API bundles are:

All APIs (all-apis): most Google APIs (same as private.googleapis.com).

VPC-SC (vpc-sc): APIs that VPC Service Controls supports (same as restricted.googleapis.com).

A Private Service Connect endpoint is reachable from VMs in the same VPC network as the endpoint (all regions) and from on-premises systems that are connected to the VPC network that contains the endpoint.
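
As a rough sketch of the Google Cloud side, assuming the google-cloud-compute Python client library and placeholder project and network names, a static route sends the restricted.googleapis.com range toward Google's network; the same 199.36.153.4/30 range is advertised to on-premises over the Cloud Interconnect connection by the Cloud Router, and on-premises DNS maps *.googleapis.com to restricted.googleapis.com:

from google.cloud import compute_v1

project = "my-project"   # placeholder project ID

route = compute_v1.Route(
    name="restricted-googleapis",
    network=f"projects/{project}/global/networks/my-vpc",   # placeholder network
    dest_range="199.36.153.4/30",   # restricted.googleapis.com VIP range
    next_hop_gateway=f"projects/{project}/global/gateways/default-internet-gateway",
)

operation = compute_v1.RoutesClient().insert(project=project, route_resource=route)
operation.result()   # wait for the route to be created

The explicit route is only needed if the default route to the default internet gateway has been removed from the VPC network; otherwise only the Cloud Router advertisement and the DNS changes are required.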

You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:

Schedule key rotation for sensitive data.

Control which region the encryption keys for sensitive data are stored in.

Minimize the latency to access encryption keys for both sensitive and non-sensitive data.

What should you do?

A. Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.
B. Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.
C. Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.
D. Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.
Suggested answer: D

Explanation:

Google uses a common cryptographic library, Tink, which incorporates the FIPS 140-2 Level 1 validated module BoringCrypto, to implement encryption consistently across almost all Google Cloud products. Google default encryption requires no key management and adds no extra latency, which suits the non-sensitive data. For the sensitive data, Cloud Key Management Service provides control over the region where the keys are stored and supports scheduled key rotation, with lower latency and less complexity than Cloud External Key Manager.
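
As a sketch of the sensitive-data side of option D, assuming the google-cloud-kms Python client library and placeholder project, region, and key names, a regional key ring controls where the key material resides and the key gets a scheduled rotation:

import time
from google.cloud import kms

PROJECT = "my-project"        # placeholder project ID
LOCATION = "europe-west3"     # region the sensitive-data keys must stay in

client = kms.KeyManagementServiceClient()

# Key ring pinned to the required region.
key_ring = client.create_key_ring(
    request={
        "parent": f"projects/{PROJECT}/locations/{LOCATION}",
        "key_ring_id": "sensitive-data",
        "key_ring": {},
    }
)

# Symmetric key that rotates automatically every 90 days.
client.create_crypto_key(
    request={
        "parent": key_ring.name,
        "crypto_key_id": "sensitive-data-key",
        "crypto_key": {
            "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
            "rotation_period": {"seconds": 60 * 60 * 24 * 90},
            "next_rotation_time": {"seconds": int(time.time()) + 60 * 60 * 24},
        },
    }
)

Non-sensitive data needs no configuration at all, because Google default encryption is always on.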

Your security team uses encryption keys to ensure confidentiality of user data. You want to establish a process to reduce the impact of a potentially compromised symmetric encryption key in Cloud Key Management Service (Cloud KMS).

Which steps should your team take before an incident occurs? (Choose two.)

A. Disable and revoke access to compromised keys.
B. Enable automatic key version rotation on a regular schedule.
C. Manually rotate key versions on an ad hoc schedule.
D. Limit the number of messages encrypted with each key version.
E. Disable the Cloud KMS API.
Suggested answer: B, D

Explanation:

Per the Cloud KMS documentation, 'Limiting the number of messages encrypted with the same key version helps prevent attacks enabled by cryptanalysis.' Enabling automatic rotation on a regular schedule likewise limits how much data any single key version protects, which reduces the impact if a key is compromised. https://cloud.google.com/kms/docs/key-rotation
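
For option B, a sketch of enabling automatic rotation on an existing key, assuming the google-cloud-kms Python client library and placeholder resource names:

import time
from google.cloud import kms

client = kms.KeyManagementServiceClient()

key_name = client.crypto_key_path(
    "my-project", "us-central1", "my-key-ring", "my-key"   # placeholders
)

client.update_crypto_key(
    request={
        "crypto_key": {
            "name": key_name,
            "rotation_period": {"seconds": 60 * 60 * 24 * 30},   # rotate every 30 days
            "next_rotation_time": {"seconds": int(time.time()) + 60 * 60 * 24},
        },
        "update_mask": {"paths": ["rotation_period", "next_rotation_time"]},
    }
)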

Your company's chief information security officer (CISO) is requiring business data to be stored in specific locations due to regulatory requirements that affect the company's global expansion plans. After working on a plan to implement this requirement, you determine the following:

The services in scope are included in the Google Cloud data residency requirements.

The business data remains within specific locations under the same organization.

The folder structure can contain multiple data residency locations.

The projects are aligned to specific locations.

You plan to use the Resource Location Restriction organization policy constraint with very granular control. At which level in the hierarchy should you set the constraint?

A. Organization
B. Resource
C. Project
D. Folder
Suggested answer: C

A database administrator notices malicious activities within their Cloud SQL instance. The database administrator wants to monitor the API calls that read the configuration or metadata of resources. Which logs should the database administrator review?

A. Admin Activity
B. System Event
C. Access Transparency
D. Data Access
Suggested answer: D

Explanation:

https://cloud.google.com/logging/docs/audit/#data-access 'Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data.'
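
As a sketch of reviewing those logs programmatically, assuming the google-cloud-logging Python client library and placeholder project and time values:

from google.cloud import logging

PROJECT = "my-project"   # placeholder project ID

client = logging.Client(project=PROJECT)

log_filter = (
    f'logName="projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Fdata_access" '
    'AND protoPayload.serviceName="cloudsql.googleapis.com" '
    'AND timestamp>="2024-01-01T00:00:00Z"'   # placeholder start of the review window
)

for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
    print(entry.timestamp, entry.payload)

Note that Data Access audit logs are disabled by default for most services, so they must be enabled for Cloud SQL before they can be reviewed.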

You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?

A. Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.
B. On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.
C. On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.
D. Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.
Suggested answer: D

Explanation:

Object Lifecycle Management rules act on conditions such as object age, creation date, storage class, and version count; they cannot inspect object contents, so they cannot identify which logs contain PII. Detecting PII requires Cloud Data Loss Prevention. Scanning each file as it lands in the administrator-only bucket and moving only the clean files into the shared bucket ensures that analysts never have access to logs that contain PII.
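
As a hedged sketch of option D, assuming a Cloud Function with a Cloud Storage finalize trigger on the administrator's bucket, the google-cloud-dlp and google-cloud-storage Python client libraries, and placeholder project, bucket, and infoType values:

from google.cloud import dlp_v2, storage

PROJECT = "my-project"                 # placeholder project ID
SHARED_BUCKET = "shared-logs-bucket"   # placeholder shared bucket name

dlp = dlp_v2.DlpServiceClient()
gcs = storage.Client()


def scan_and_promote(event, context):
    """Triggered by google.storage.object.finalize on the administrator's bucket."""
    admin_bucket = gcs.bucket(event["bucket"])
    blob = admin_bucket.blob(event["name"])

    # Inspect the uploaded log file for PII (example infoTypes only).
    response = dlp.inspect_content(
        request={
            "parent": f"projects/{PROJECT}",
            "inspect_config": {
                "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
                "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
            },
            "item": {"value": blob.download_as_text()},
        }
    )

    if not response.result.findings:
        # No PII detected: make the log available to analysts.
        admin_bucket.copy_blob(blob, gcs.bucket(SHARED_BUCKET), event["name"])

inspect_content is limited to relatively small payloads, so large log files would instead be scanned with a DLP storage inspection job.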

You work for an organization in a regulated industry that has strict data protection requirements. The organization backs up their data in the cloud. To comply with data privacy regulations, this data can only be stored for a specific length of time and must be deleted after this specific period.

You want to automate the compliance with this regulation while minimizing storage costs. What should you do?

A. Store the data in a persistent disk, and delete the disk at expiration time.
B. Store the data in a Cloud Bigtable table, and set an expiration time on the column families.
C. Store the data in a BigQuery table, and set the table's expiration time.
D. Store the data in a Cloud Storage bucket, and configure the bucket's Object Lifecycle Management feature.
Suggested answer: D

Explanation:

To minimize storage costs, Cloud Storage is the cheapest option, with BigQuery a close second. Because the question does not specify the kind of data (raw files versus tabular data), Cloud Storage is the safer choice, and its Object Lifecycle Management feature deletes objects automatically once they reach the configured age, which automates compliance with the retention limit.
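
As a minimal sketch of option D, assuming the google-cloud-storage Python client library and a placeholder bucket name and retention period:

from google.cloud import storage

RETENTION_DAYS = 365   # placeholder: the maximum retention allowed by the regulation

client = storage.Client()
bucket = client.get_bucket("regulated-backups")   # placeholder bucket name

# Add an Object Lifecycle Management rule that deletes objects once they are
# older than RETENTION_DAYS, then persist the updated bucket configuration.
bucket.add_lifecycle_delete_rule(age=RETENTION_DAYS)
bucket.patch()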
