Google Professional Cloud Architect Practice Test - Questions Answers, Page 23

You want to allow your operations team to store logs from all the production projects in your Organization, without including logs from other projects. All of the production projects are contained in a folder. You want to ensure that all logs for existing and new production projects are captured automatically. What should you do?

A.
Create an aggregated export on the Production folder. Set the log sink to be a Cloud Storage bucket in an operations project.
B.
Create an aggregated export on the Organization resource. Set the log sink to be a Cloud Storage bucket in an operations project.
C.
Create log exports in the production projects. Set the log sinks to be a Cloud Storage bucket in an operations project.
D.
Create log exports in the production projects. Set the log sinks to be BigQuery datasets in the production projects, and grant IAM access to the operations team to run queries on the datasets.
Suggested answer: A
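
Explanation:

A folder-level aggregated sink captures logs from every existing and future project under the Production folder, and only those projects; an Organization-level sink would also capture logs from projects outside the folder. A minimal sketch of the setup (the folder ID, bucket, and project names are placeholders):

    # Create the destination bucket in the operations project
    gsutil mb -p operations-project gs://prod-logs-bucket

    # Create an aggregated sink on the Production folder; --include-children
    # captures logs from all existing and future projects under the folder
    gcloud logging sinks create prod-logs-sink \
        storage.googleapis.com/prod-logs-bucket \
        --folder=FOLDER_ID --include-children

The create command prints the sink's writer identity, which must then be granted write access to the bucket.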

You are configuring the cloud network architecture for a newly created project in Google Cloud that will host applications in Compute Engine. Compute Engine virtual machine instances will be created in two different subnets (sub-a and sub-b) within a single region:

• Instances in sub-a will have public IP addresses.

• Instances in sub-b will have only private IP addresses.

To download updated packages, instances must connect to a public repository outside the boundaries of Google Cloud. You need to allow sub-b to access the external repository. What should you do?

A.
Enable Private Google Access on sub-b.
B.
Configure Cloud NAT and select sub-b in the NAT mapping section.
C.
Configure a bastion host instance in sub-a to connect to instances in sub-b.
D.
Enable Identity-Aware Proxy for TCP forwarding for instances in sub-b.
Suggested answer: B
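
Explanation:

Cloud NAT lets instances that have only private IP addresses initiate outbound connections to the internet. A minimal sketch (network, region, and resource names are placeholders):

    # Cloud NAT requires a Cloud Router in the subnet's region
    gcloud compute routers create nat-router \
        --network=my-vpc --region=us-central1

    # Create the NAT configuration and map only sub-b to it
    gcloud compute routers nats create nat-config \
        --router=nat-router --region=us-central1 \
        --nat-custom-subnet-ip-ranges=sub-b \
        --auto-allocate-nat-external-ips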

Your company has recently activated Cloud Identity to manage users. The Google Cloud Organization has been configured as well. The security team needs to secure projects that will be part of the Organization. They want to prohibit IAM users outside the domain from gaining permissions from now on. What should they do?

A.
Configure an organization policy to restrict identities by domain.
B.
Configure an organization policy to block creation of service accounts.
C.
Configure Cloud Scheduler to trigger a Cloud Function every hour that removes all users who don't belong to the Cloud Identity domain from all projects.
D.
Create a technical user (e.g., crawler@yourdomain.com), and give it the Project Owner role at the root Organization level. Write a bash script that lists all the IAM rules of all projects within the Organization and deletes all users that do not belong to the company domain. Create a Compute Engine instance in a project within the Organization and configure gcloud to be executed with the technical user's credentials. Configure a cron job that executes the bash script every hour.
Suggested answer: A
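
Explanation:

The iam.allowedPolicyMemberDomains organization policy constraint restricts which identities can be granted IAM roles anywhere in the Organization. A hedged sketch (the organization ID and the Cloud Identity customer ID, e.g. C0xxxxxxx, are placeholders; note the constraint lists customer IDs, not domain names):

    # policy.yaml
    name: organizations/ORG_ID/policies/iam.allowedPolicyMemberDomains
    spec:
      rules:
      - values:
          allowedValues:
          - C0xxxxxxx

Apply it at the Organization level with:

    gcloud org-policies set-policy policy.yaml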

You want to store critical business information in Cloud Storage buckets. The information is regularly changed, but previous versions need to be referenced on a regular basis. You want to ensure that there is a record of all changes to any information in these buckets. You want to ensure that accidental edits or deletions can be easily rolled back. Which feature should you enable?

A.
Bucket Lock
B.
Object Versioning
C.
Object change notification
D.
Object Lifecycle Management
Suggested answer: B
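
Explanation:

Object Versioning keeps a noncurrent version of each object whenever it is overwritten or deleted, so earlier versions can be listed and restored. A minimal sketch (the bucket name and generation number are placeholders):

    # Enable versioning on the bucket
    gsutil versioning set on gs://critical-business-bucket

    # List all versions (generations) of the stored objects
    gsutil ls -a gs://critical-business-bucket

    # Roll back by copying an older generation over the live object
    gsutil cp gs://critical-business-bucket/report.txt#1556835845 \
        gs://critical-business-bucket/report.txt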

You are working with a data warehousing team that performs data analysis. The team needs to process data from external partners, but the data contains personally identifiable information (PII).

You need to process and store the data without storing any of the PII data. What should you do?

A.
Create a Dataflow pipeline to retrieve the data from the external sources. As part of the pipeline, use the Cloud Data Loss Prevention (Cloud DLP) API to remove any PII data. Store the result in BigQuery.
B.
Create a Dataflow pipeline to retrieve the data from the external sources. As part of the pipeline, store all non-PII data in BigQuery and store all PII data in a Cloud Storage bucket that has a retention policy set.
C.
Ask the external partners to upload all data on Cloud Storage. Configure Bucket Lock for the bucket. Create a Dataflow pipeline to read the data from the bucket. As part of the pipeline, use the Cloud Data Loss Prevention (Cloud DLP) API to remove any PII data. Store the result in BigQuery.
D.
Ask the external partners to import all data in your BigQuery dataset. Create a Dataflow pipeline to copy the data into a new table. As part of the Dataflow pipeline, skip all data in columns that have PII data.
Suggested answer: A

Explanation:

The question does not specify how the Dataflow pipeline ingests the data; it could read from Pub/Sub, an external table, or another source. What matters is that the Cloud DLP API removes the PII inside the pipeline, so only the de-identified result is stored in BigQuery and no PII is ever persisted.
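
Inside the pipeline, each record can be passed through the Cloud DLP content:deidentify method before it is written to BigQuery. A hedged sketch of the underlying API call (the project ID, info types, and payload are illustrative):

    curl -X POST \
      -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      -H "Content-Type: application/json" \
      "https://dlp.googleapis.com/v2/projects/PROJECT_ID/content:deidentify" \
      -d '{
        "item": {"value": "Contact Jane Doe at jane@example.com"},
        "inspectConfig": {
          "infoTypes": [{"name": "PERSON_NAME"}, {"name": "EMAIL_ADDRESS"}]
        },
        "deidentifyConfig": {
          "infoTypeTransformations": {
            "transformations": [{
              "primitiveTransformation": {"replaceWithInfoTypeConfig": {}}
            }]
          }
        }
      }'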


Your company wants to migrate their 10-TB on-premises database export into Cloud Storage. You want to minimize the time it takes to complete this activity, the overall cost, and the database load. The bandwidth between the on-premises environment and Google Cloud is 1 Gbps. You want to follow Google-recommended practices. What should you do?

A.
Use the Transfer Appliance to perform an offline migration.
B.
Use a commercial partner ETL solution to extract the data from the on-premises database and upload it into Cloud Storage.
C.
Develop a Dataflow job to read data directly from the database and write it into Cloud Storage.
D.
Compress the data and upload it with gsutil -m to enable multi-threaded copy.
Suggested answer: A

Explanation:

Transfer Appliance is a Google-provided hardware device that can be used to transfer large amounts of data from on-premises environments to Cloud Storage. It is suitable for scenarios where the bandwidth between the on-premises environment and Google Cloud is low or insufficient and the data size is large. Transfer Appliance minimizes the time it takes to complete the migration, the overall cost, and the database load by avoiding network bottlenecks and reducing bandwidth consumption. It also encrypts the data at rest and in transit, ensuring data security and privacy. The other options are not optimal for this scenario, because they either require a high-bandwidth network connection (B, C, D) or incur additional cost and complexity (B, C).

Reference:

https://cloud.google.com/data-transfer-appliance/docs/overview

https://cloud.google.com/blog/products/storage-data-transfer/introducing-storage-transfer-service-for-on-premises-data

You are responsible for the Google Cloud environment in your company. Multiple departments need access to their own projects, and the members within each department will have the same project responsibilities. You want to structure your Google Cloud environment for minimal maintenance and maximum overview of IAM permissions as each department's projects start and end. You want to follow Google-recommended practices. What should you do?

A.
Create a Google Group per department and add all department members to their respective groups. Create a folder per department and grant the respective group the required IAM permissions at the folder level. Add the projects under the respective folders.
B.
Grant all department members the required IAM permissions for their respective projects.
C.
Create a Google Group per department and add all department members to their respective groups. Grant each group the required IAM permissions for their respective projects.
D.
Create a folder per department and grant the respective members of the department the required IAM permissions at the folder level. Structure all projects for each department under the respective folders.
Suggested answer: A

Explanation:

This option follows the Google-recommended practices for structuring a Google Cloud environment for minimal maintenance and maximum overview of IAM permissions. By creating a Google Group per department and adding all department members to their respective groups, you can simplify user management and avoid granting IAM permissions to individual users. By creating a folder per department and granting the respective group the required IAM permissions at the folder level, you can enforce consistent policies across all projects within each department and avoid granting IAM permissions at the project level. By adding the projects under the respective folders, you can organize your resources hierarchically and leverage inheritance of IAM policies from folders to projects. The other options are not optimal for this scenario, because they either require granting IAM permissions to individual users (B, C), or do not use Google Groups to manage users (D).

Reference:

https://cloud.google.com/architecture/framework/system-design

https://cloud.google.com/architecture/identity/best-practices-for-planning

https://cloud.google.com/resource-manager/docs/creating-managing-folders
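
A minimal sketch of the group-plus-folder pattern (the organization ID, folder ID, and group address are placeholders):

    # Create a folder per department under the Organization
    gcloud resource-manager folders create \
        --display-name="Engineering" --organization=ORG_ID

    # Grant the department's Google Group a role at the folder level;
    # every project placed under the folder inherits the binding
    gcloud resource-manager folders add-iam-policy-binding FOLDER_ID \
        --member="group:engineering@example.com" \
        --role="roles/editor"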

You are managing several projects on Google Cloud and need to interact on a daily basis with BigQuery, Bigtable, and Kubernetes Engine using the gcloud CLI tool. You are travelling a lot and work on different workstations during the week. You want to avoid having to manage the gcloud CLI manually. What should you do?

A.
Use a package manager to install gcloud on your workstations instead of installing it manually.
B.
Create a Compute Engine instance and install gcloud on the instance. Connect to this instance via SSH to always use the same gcloud installation when interacting with Google Cloud.
C.
Install gcloud on all of your workstations. Run the command gcloud components auto-update on each workstation.
D.
Use Google Cloud Shell in the Google Cloud Console to interact with Google Cloud.
Suggested answer: D

Explanation:

This option allows you to use the gcloud CLI tool without having to install or manage it manually on different workstations. Google Cloud Shell is a browser-based command-line tool that provides you with a temporary Compute Engine virtual machine instance preloaded with the Cloud SDK, including the gcloud CLI tool. You can access Google Cloud Shell from any web browser and use it to interact with BigQuery, Bigtable and Kubernetes Engine using the gcloud CLI tool. The other options are not optimal for this scenario, because they either require installing and updating the gcloud CLI tool on multiple workstations (A, C), or creating and maintaining a Compute Engine instance for the sole purpose of using the gcloud CLI tool (B).

Reference:

https://cloud.google.com/shell/docs/overview

https://cloud.google.com/sdk/gcloud/
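
Cloud Shell ships with the Cloud SDK and its companion tools preinstalled and automatically kept up to date, so the same commands work from any browser session. For example (project, instance, and cluster names are placeholders):

    # BigQuery: list datasets in the current project
    bq ls

    # Bigtable: list instances (the cbt tool is available in Cloud Shell)
    cbt -project my-project listinstances

    # Kubernetes Engine: list clusters
    gcloud container clusters list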

The operations team in your company wants to save Cloud VPN log events for one year. You need to configure the cloud infrastructure to save the logs. What should you do?

A.
Set up a filter in Cloud Logging and a topic in Pub/Sub to publish the logs.
B.
Set up a Cloud Logging dashboard titled Cloud VPN Logs, and then add a chart that queries for the VPN metrics over a one-year time period.
C.
Enable the Compute Engine API, and then enable logging on the firewall rules that match the traffic you want to save.
D.
Set up a filter in Cloud Logging and a Cloud Storage bucket as an export target for the logs you want to save.
Suggested answer: D
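
Explanation:

A Cloud Logging sink with a filter exports only the matching log entries to a Cloud Storage bucket, a cost-effective destination for year-long retention. A minimal sketch (bucket and sink names are placeholders, and the exact log filter may vary with your VPN setup):

    # Export Cloud VPN gateway logs to a Cloud Storage bucket
    gcloud logging sinks create vpn-logs-sink \
        storage.googleapis.com/vpn-logs-bucket \
        --log-filter='resource.type="vpn_gateway"'

    # Optionally delete exported objects after 365 days, using
    # lifecycle.json: {"rule":[{"action":{"type":"Delete"},"condition":{"age":365}}]}
    gsutil lifecycle set lifecycle.json gs://vpn-logs-bucket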