
Google Professional Cloud Security Engineer Practice Test - Questions Answers, Page 24

Your company's users access data in a BigQuery table. You want to ensure they can only access the data during working hours.

What should you do?

A. Assign a BigQuery Data Viewer role along with an IAM condition that limits the access to specified working hours.
B. Configure Cloud Scheduler so that it triggers a Cloud Functions instance that modifies the organizational policy constraints for BigQuery during the specified working hours.
C. Assign a BigQuery Data Viewer role to a service account that adds and removes the users daily during the specified working hours.
D. Run a gsutil script that assigns a BigQuery Data Viewer role, and remove it only during the specified working hours.
Suggested answer: A
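A time-bounded grant like the one in option A is expressed as an IAM condition written in Common Expression Language (CEL). The snippet below is a minimal sketch of that idea, not a definitive implementation: the project ID, group address, time zone, and working hours are hypothetical placeholders, and the binding is attached at the project level through the Cloud Resource Manager API.

```python
# Minimal sketch (assumptions: project ID, group address, time zone, and hours are hypothetical).
# Grants roles/bigquery.dataViewer with an IAM condition that only holds Mon-Fri, 09:00-17:00.
from googleapiclient import discovery

PROJECT_ID = "my-project"              # hypothetical
MEMBER = "group:analysts@example.com"  # hypothetical
CONDITION_EXPR = (
    'request.time.getHours("Europe/Berlin") >= 9 && '
    'request.time.getHours("Europe/Berlin") < 17 && '
    'request.time.getDayOfWeek("Europe/Berlin") >= 1 && '
    'request.time.getDayOfWeek("Europe/Berlin") <= 5'
)

crm = discovery.build("cloudresourcemanager", "v1")

# Conditional bindings require IAM policy version 3.
policy = crm.projects().getIamPolicy(
    resource=PROJECT_ID,
    body={"options": {"requestedPolicyVersion": 3}},
).execute()

policy.setdefault("bindings", []).append({
    "role": "roles/bigquery.dataViewer",
    "members": [MEMBER],
    "condition": {
        "title": "working-hours-only",
        "expression": CONDITION_EXPR,
    },
})
policy["version"] = 3

crm.projects().setIamPolicy(
    resource=PROJECT_ID, body={"policy": policy}
).execute()
```

Because the condition is evaluated at request time, access lapses automatically outside the configured window without any scheduled jobs or scripts.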

Your organization wants to be compliant with the General Data Protection Regulation (GDPR) on Google Cloud. You must implement data residency and operational sovereignty in the EU.

What should you do?

Choose 2 answers

A. Limit the physical location of a new resource with the Organization Policy Service resource locations constraint.
B. Use Cloud IDS to get east-west and north-south traffic visibility in the EU to monitor intra-VPC and inter-VPC communication.
C. Limit Google personnel access based on predefined attributes such as their citizenship or geographic location by using Key Access Justifications.
D. Use identity federation to limit access to Google Cloud resources from non-EU entities.
E. Use VPC Flow Logs to monitor intra-VPC and inter-VPC traffic in the EU.
Suggested answer: A, C

Explanation:

https://cloud.google.com/architecture/framework/security/data-residency-sovereignty#manage_your_operational_sovereignty
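For reference, a minimal sketch of how the resource locations constraint in option A might be applied at the organization level, using the legacy Cloud Resource Manager Org Policy API. The organization ID is a hypothetical placeholder; `in:eu-locations` is the Google-curated value group covering EU regions.

```python
# Minimal sketch (assumption: the organization ID is hypothetical).
# Restricts the location of new resources to the EU via constraints/gcp.resourceLocations.
from googleapiclient import discovery

ORG_ID = "123456789012"  # hypothetical organization ID

crm = discovery.build("cloudresourcemanager", "v1")

crm.organizations().setOrgPolicy(
    resource=f"organizations/{ORG_ID}",
    body={
        "policy": {
            "constraint": "constraints/gcp.resourceLocations",
            "listPolicy": {
                # "in:eu-locations" is the curated value group for EU regions.
                "allowedValues": ["in:eu-locations"],
            },
        }
    },
).execute()
```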

You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements:

* Manage the data encryption key (DEK) outside the Google Cloud boundary.

* Maintain full control of encryption keys through a third-party provider.

* Encrypt the sensitive data before uploading it to Cloud Storage.

* Decrypt the sensitive data during processing in the Compute Engine VMs.

* Encrypt the sensitive data in memory while in use in the Compute Engine VMs.

What should you do?

Choose 2 answers

A. Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets.
B. Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.
C. Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage and decrypt the sensitive data after it is downloaded into your VMs.
D. Create Confidential VMs to access the sensitive data.
E. Configure Customer Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.
Suggested answer: C, D

Explanation:

https://cloud.google.com/confidential-computing/confidential-vm/docs/creating-cvm-instance#considerations

Confidential VM does not support live migration. You can only enable Confidential Computing on a VM when you first create the instance. https://cloud.google.com/confidential-computing/confidential-vm/docs/creating-cvm-instance
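Because Confidential Computing can only be enabled when an instance is first created, option D creates new VMs rather than migrating existing ones. Below is a minimal sketch using the Compute Engine Python client; the project, zone, instance name, and image are hypothetical, and the machine type reflects the AMD SEV requirement for an N2D machine series with host maintenance set to TERMINATE.

```python
# Minimal sketch (assumptions: project, zone, instance name, and image are hypothetical).
# Creates a Confidential VM; Confidential Computing cannot be enabled on an existing VM.
from google.cloud import compute_v1

PROJECT_ID = "my-project"   # hypothetical
ZONE = "europe-west4-a"     # hypothetical
NAME = "sensitive-data-vm"  # hypothetical

instance = compute_v1.Instance(
    name=NAME,
    machine_type=f"zones/{ZONE}/machineTypes/n2d-standard-2",  # AMD SEV requires N2D
    confidential_instance_config=compute_v1.ConfidentialInstanceConfig(
        enable_confidential_compute=True
    ),
    # Confidential VMs do not support live migration, so maintenance must terminate the VM.
    scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
    disks=[
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/ubuntu-os-cloud/global/images/family/ubuntu-2204-lts"
            ),
        )
    ],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)

operation = compute_v1.InstancesClient().insert(
    project=PROJECT_ID, zone=ZONE, instance_resource=instance
)
operation.result()  # wait for the create operation to finish
```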


Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.

What should you do?

A. 1. Use Stackdriver Logging and filter on BigQuery Insert Jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Hide Matching Entries. 4. Make sure the resulting list is empty.
B. 1. Use Stackdriver Logging and filter on BigQuery Insert Jobs. 2. Click on the email address in line with the App Engine Default Service Account in the authentication field. 3. Click Show Matching Entries. 4. Make sure the resulting list is empty.
C. 1. In BigQuery, select the related dataset. 2. Make sure the App Engine Default Service Account is the only account that can write to the dataset.
D. 1. Go to the IAM section on the project. 2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.
Suggested answer: A
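The same check can be scripted against the audit logs instead of being done in the console. The sketch below is one possible version of option A's logic, assuming a hypothetical project ID: it lists BigQuery insert-job audit entries that were *not* authenticated as the App Engine default service account (the "Hide Matching Entries" step), so an empty result confirms only that account wrote to BigQuery.

```python
# Minimal sketch (assumption: the project ID is hypothetical).
# Mirrors option A: list BigQuery insert jobs NOT run by the App Engine default
# service account; an empty result means only that account wrote to BigQuery.
from google.cloud import logging

PROJECT_ID = "my-project"  # hypothetical
DEFAULT_SA = f"{PROJECT_ID}@appspot.gserviceaccount.com"

LOG_FILTER = (
    'resource.type="bigquery_resource" '
    'protoPayload.methodName="jobservice.insert" '
    f'protoPayload.authenticationInfo.principalEmail!="{DEFAULT_SA}"'
)

client = logging.Client(project=PROJECT_ID)
unexpected = list(client.list_entries(filter_=LOG_FILTER))

if unexpected:
    for entry in unexpected:
        # Print the principal that ran the unexpected insert job.
        print(entry.payload.get("authenticationInfo", {}).get("principalEmail"))
else:
    print("All BigQuery insert jobs were run by the App Engine default service account.")
```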

You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?

A. Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.
B. On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.
C. On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.
D. Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.
Suggested answer: D

Explanation:

Object Lifecycle Management rules act on object age and state, not object content, so they cannot identify PII. Scanning each new upload with Cloud DLP and moving only the objects without findings into the shared bucket keeps PII restricted to the administrator's bucket.
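Below is a minimal sketch of the option D pattern as a background Cloud Function, not a definitive implementation: the project ID, bucket name, entry-point name, and infoTypes are illustrative assumptions, and the function "moves" an object by copying it to the shared bucket and deleting the original only when Cloud DLP returns no findings.

```python
# Minimal sketch (assumptions: project ID, bucket names, entry point, and infoTypes are illustrative).
# Background Cloud Function triggered on uploads to the administrator's bucket:
# scan the object with Cloud DLP and move it to the shared bucket only if no PII is found.
from google.cloud import dlp_v2, storage

PROJECT_ID = "my-project"      # hypothetical
SHARED_BUCKET = "shared-logs"  # hypothetical

dlp = dlp_v2.DlpServiceClient()
gcs = storage.Client()

def scan_and_route(event, context):
    """Triggered by google.storage.object.finalize on the administrator's bucket."""
    source_bucket = gcs.bucket(event["bucket"])
    blob = source_bucket.blob(event["name"])
    content = blob.download_as_text()

    response = dlp.inspect_content(
        request={
            "parent": f"projects/{PROJECT_ID}",
            "inspect_config": {
                "info_types": [
                    {"name": "EMAIL_ADDRESS"},
                    {"name": "PHONE_NUMBER"},
                    {"name": "PERSON_NAME"},
                ],
                "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
            },
            "item": {"value": content},
        }
    )

    if response.result.findings:
        # PII detected: leave the object in the administrator-only bucket.
        print(f"{event['name']}: {len(response.result.findings)} findings, not shared")
        return

    # No PII: copy to the shared bucket, then remove the original.
    source_bucket.copy_blob(blob, gcs.bucket(SHARED_BUCKET), event["name"])
    blob.delete()
```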
Total 235 questions