Google Professional Cloud Architect Practice Test - Questions Answers, Page 22


You want to allow your operations team to store logs from all the production projects in your Organization, without including logs from other projects. All of the production projects are contained in a folder. You want to ensure that all logs for existing and new production projects are captured automatically. What should you do?

A. Create an aggregated export on the Production folder. Set the log sink to be a Cloud Storage bucket in an operations project.

B. Create an aggregated export on the Organization resource. Set the log sink to be a Cloud Storage bucket in an operations project.

C. Create log exports in the production projects. Set the log sinks to be a Cloud Storage bucket in an operations project.

D. Create log exports in the production projects. Set the log sinks to be BigQuery datasets in the production projects, and grant IAM access to the operations team to run queries on the datasets.
Suggested answer: A

Explanation:

Reference: https://cloud.google.com/logging/docs/audit
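
For reference, a minimal sketch of the folder-level aggregated sink from answer A, using the google-cloud-logging GAPIC client; the folder ID, sink name, and bucket name below are placeholders:

```python
# Create an aggregated sink on the Production folder. With include_children
# set, the sink captures logs from every existing and future project in the
# folder; no per-project configuration is needed.
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogSink

client = ConfigServiceV2Client()

sink = LogSink(
    name="prod-logs-to-ops",  # placeholder sink name
    destination="storage.googleapis.com/ops-team-log-bucket",  # placeholder bucket
    include_children=True,
)

client.create_sink(request={"parent": "folders/123456789012", "sink": sink})
```

After creation, the sink's writer_identity service account still needs the Storage Object Creator role on the destination bucket before logs start flowing.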

Your company has an application that is running on multiple instances of Compute Engine. It generates 1 TB per day of logs.

For compliance reasons, the logs need to be kept for at least two years. The logs need to be available for active query for 30 days. After that, they just need to be retained for audit purposes. You want to implement a storage solution that is compliant, minimizes costs, and follows Google-recommended practices. What should you do?

A. 1. Install a Cloud Logging agent on all instances. 2. Create a sink to export logs into a regional Cloud Storage bucket. 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month. 4. Configure a retention policy at the bucket level using bucket lock.

B. 1. Write a daily cron job, running on all instances, that uploads logs into a Cloud Storage bucket. 2. Create a sink to export logs into a regional Cloud Storage bucket. 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month.

C. 1. Install a Cloud Logging agent on all instances. 2. Create a sink to export logs into a partitioned BigQuery table. 3. Set a time_partitioning_expiration of 30 days.

D. 1. Create a daily cron job, running on all instances, that uploads logs into a partitioned BigQuery table. 2. Set a time_partitioning_expiration of 30 days.
Suggested answer: A
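
The storage-side steps of answer A map directly onto the google-cloud-storage client; a minimal sketch, assuming a placeholder bucket name:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("compliance-log-archive")  # placeholder name

# Step 3: demote objects to Coldline once the 30-day active-query window ends.
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=30)

# Step 4: retain every object for at least two years (value is in seconds).
bucket.retention_period = 2 * 365 * 24 * 60 * 60
bucket.patch()

# Locking the retention policy is irreversible, so do it only after the
# policy has been verified:
# bucket.lock_retention_policy()
```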

Your company has just recently activated Cloud Identity to manage users. The Google Cloud Organization has been configured as well. The security team needs to secure projects that will be part of the Organization. They want to prohibit IAM users outside the domain from gaining permissions from now on. What should they do?

A. Configure an organization policy to restrict identities by domain.

B. Configure an organization policy to block creation of service accounts.

C. Configure Cloud Scheduler to trigger a Cloud Function every hour that removes all users that don't belong to the Cloud Identity domain from all projects.

D. Create a technical user (e.g., [email protected]), and give it the project owner role at the root organization level. Write a bash script that lists all the IAM rules of all projects within the organization and deletes all users that do not belong to the company domain. Create a Compute Engine instance in a project within the Organization, and configure gcloud to be executed with the technical user's credentials. Configure a cron job that executes the bash script every hour.
Suggested answer: A

Explanation:

Reference: https://sysdig.com/blog/gcp-security-best-practices/
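
A hedged sketch of the domain-restriction policy from answer A, using the google-cloud-org-policy client; the organization ID and the Cloud Identity customer ID below are placeholders:

```python
from google.cloud import orgpolicy_v2

client = orgpolicy_v2.OrgPolicyClient()

# constraints/iam.allowedPolicyMemberDomains takes Cloud Identity customer
# IDs (C0xxxxxxx), not raw domain names.
policy = orgpolicy_v2.Policy(
    name="organizations/123456789012/policies/iam.allowedPolicyMemberDomains",
    spec=orgpolicy_v2.PolicySpec(
        rules=[
            orgpolicy_v2.PolicySpec.PolicyRule(
                values=orgpolicy_v2.PolicySpec.PolicyRule.StringValues(
                    allowed_values=["C0abcde12"]  # placeholder customer ID
                )
            )
        ]
    ),
)

client.create_policy(parent="organizations/123456789012", policy=policy)
```

Unlike the scheduled clean-up in options C and D, this is a one-time configuration that blocks out-of-domain members at grant time rather than removing them after the fact.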

Your company has an application running on Google Cloud that is collecting data from thousands of physical devices that are globally distributed. Data is published to Pub/Sub and streamed in real time into an SSD Cloud Bigtable cluster via a Dataflow pipeline. The operations team informs you that your Cloud Bigtable cluster has a hotspot, and queries are taking longer than expected. You need to resolve the problem and prevent it from happening in the future. What should you do?

A. Advise your clients to use HBase APIs instead of NodeJS APIs.

B. Delete records older than 30 days.

C. Review your RowKey strategy and ensure that keys are evenly spread across the alphabet.

D. Double the number of nodes you currently have.
Suggested answer: C
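
A hotspot like this typically comes from sequential row keys, such as a timestamp prefix, which pile all new writes onto a single tablet. An illustrative, hypothetical key scheme in Python:

```python
import hashlib
import time

def hotspot_prone_key(device_id: str) -> bytes:
    # Leading with a timestamp is sequential: every new write lands on the
    # same tablet, creating exactly the hotspot described in the question.
    return f"{int(time.time())}#{device_id}".encode()

def well_distributed_key(device_id: str) -> bytes:
    # Leading with a high-cardinality field (here, a short hash of the
    # device ID) spreads writes evenly across the keyspace and the tablets.
    salt = hashlib.sha256(device_id.encode()).hexdigest()[:4]
    return f"{salt}#{device_id}#{int(time.time())}".encode()
```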

Your company has a Google Cloud project that uses BigQuery for data warehousing. There are some tables that contain personally identifiable information (PII). Only the compliance team may access the PII. The other information in the tables must be available to the data science team. You want to minimize cost and the time it takes to assign appropriate access to the tables. What should you do?

A. 1. From the dataset where you have the source data, create views of tables that you want to share, excluding PII. 2. Assign an appropriate project-level IAM role to the members of the data science team. 3. Assign access controls to the dataset that contains the view.

B. 1. From the dataset where you have the source data, create materialized views of tables that you want to share, excluding PII. 2. Assign an appropriate project-level IAM role to the members of the data science team. 3. Assign access controls to the dataset that contains the view.

C. 1. Create a dataset for the data science team. 2. Create views of tables that you want to share, excluding PII. 3. Assign an appropriate project-level IAM role to the members of the data science team. 4. Assign access controls to the dataset that contains the view. 5. Authorize the view to access the source dataset.

D. 1. Create a dataset for the data science team. 2. Create materialized views of tables that you want to share, excluding PII. 3. Assign an appropriate project-level IAM role to the members of the data science team. 4. Assign access controls to the dataset that contains the view. 5. Authorize the view to access the source dataset.
Suggested answer: C

Explanation:

Reference: https://cloud.google.com/blog/topics/developers-practitioners/bigquery-admin-reference-guide-data-governance?skip_cache=true
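
A hedged sketch of answer C with the google-cloud-bigquery client; the project, dataset, table, and column names are all placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# 1. A dedicated dataset for the data science team.
client.create_dataset("data_science", exists_ok=True)

# 2. A view over the source table that excludes the PII columns.
view = bigquery.Table("my-project.data_science.orders_no_pii")
view.view_query = """
    SELECT order_id, order_total, created_at  -- PII columns omitted
    FROM `my-project.source_data.orders`
"""
view = client.create_table(view, exists_ok=True)

# 3./4. Grant the data science team access on data_science only
#       (IAM binding or dataset access entries; not shown here).

# 5. Authorize the view against the source dataset so that querying the
#    view never requires direct access to the PII tables.
source = client.get_dataset("source_data")
entries = list(source.access_entries)
entries.append(bigquery.AccessEntry(None, "view", view.reference.to_api_repr()))
source.access_entries = entries
client.update_dataset(source, ["access_entries"])
```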

Your operations team currently stores 10 TB of data in an object storage service from a third-party provider. They want to move this data to a Cloud Storage bucket as quickly as possible, following Google-recommended practices. They want to minimize the cost of this data migration. Which approach should they use?

A. Use the gsutil mv command to move the data.

B. Use the Storage Transfer Service to move the data.

C. Download the data to a Transfer Appliance, and ship it to Google.

D. Download the data to the on-premises data center, and upload it to the Cloud Storage bucket.
Suggested answer: B

Explanation:

Reference: https://cloud.google.com/architecture/migration-to-google-cloud-transferring-your-large-datasets
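
A hedged sketch of answer B with the google-cloud-storage-transfer client, assuming the third-party provider exposes an S3-compatible endpoint; the project ID, bucket names, and credentials are placeholders:

```python
from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()

job = storage_transfer.TransferJob(
    project_id="my-project",
    status=storage_transfer.TransferJob.Status.ENABLED,
    transfer_spec=storage_transfer.TransferSpec(
        aws_s3_data_source=storage_transfer.AwsS3Data(
            bucket_name="third-party-source-bucket",
            aws_access_key=storage_transfer.AwsAccessKey(
                access_key_id="AKIA-PLACEHOLDER",
                secret_access_key="placeholder-secret",
            ),
        ),
        gcs_data_sink=storage_transfer.GcsData(
            bucket_name="my-destination-bucket",
        ),
    ),
)

client.create_transfer_job({"transfer_job": job})
```

The service performs the copy on Google's side, so no intermediate download through your own infrastructure is needed, which keeps both transfer time and cost down.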

Your company and one of its partners each have a Google Cloud project in separate organizations.

Your company's project (prj-a) runs in Virtual Private Cloud vpc-a. The partner's project (prj-b) runs in vpc-b. There are two instances running in vpc-a and one instance running in vpc-b. Subnets defined in the two VPCs do not overlap.

You need to ensure that all instances communicate with each other via internal IPs, minimizing latency and maximizing throughput. What should you do?

A. Set up network peering between vpc-a and vpc-b.

B. Set up a VPN between vpc-a and vpc-b using Cloud VPN.

C. Configure IAP TCP forwarding on the instance in vpc-b, and then launch the following gcloud command from one of the instances in vpc-a:

D. 1. Create an additional instance in vpc-a. 2. Create an additional instance in vpc-b. 3. Install OpenVPN on the newly created instances. 4. Configure a VPN tunnel between vpc-a and vpc-b with the help of OpenVPN.
Suggested answer: A
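
A hedged sketch of answer A with the google-cloud-compute client; the project and network names come from the question, and each organization's admin runs the call for their own side (the peering only becomes ACTIVE once both halves exist):

```python
from google.cloud import compute_v1

def add_peering(project: str, network: str, peer_network: str, name: str) -> None:
    client = compute_v1.NetworksClient()
    request = compute_v1.NetworksAddPeeringRequest(
        network_peering=compute_v1.NetworkPeering(
            name=name,
            network=peer_network,
            exchange_subnet_routes=True,
        )
    )
    client.add_peering(
        project=project,
        network=network,
        networks_add_peering_request_resource=request,
    )

# One call per side, by each organization's admin.
add_peering("prj-a", "vpc-a", "projects/prj-b/global/networks/vpc-b", "peer-a-to-b")
add_peering("prj-b", "vpc-b", "projects/prj-a/global/networks/vpc-a", "peer-b-to-a")
```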

Your company has an application running on Google Cloud that is collecting data from thousands of physical devices that are globally distributed. Data is published to Pub/Sub and streamed in real time into an SSD Cloud Bigtable cluster via a Dataflow pipeline. The operations team informs you that your Cloud Bigtable cluster has a hotspot, and queries are taking longer than expected. You need to resolve the problem and prevent it from happening in the future. What should you do?

A. Advise your clients to use HBase APIs instead of NodeJS APIs.

B. Review your RowKey strategy and ensure that keys are evenly spread across the alphabet.

C. Delete records older than 30 days.

D. Double the number of nodes you currently have.
Suggested answer: B

Your company recently acquired a company that has infrastructure in Google Cloud. Each company has its own Google Cloud organization. Each company is using a Shared Virtual Private Cloud (VPC) to provide network connectivity for its applications. Some of the subnets used by both companies overlap. In order for both businesses to integrate, the applications need to have private network connectivity. These applications are not on overlapping subnets. You want to provide connectivity with minimal re-engineering. What should you do?

A. Set up VPC peering and peer each Shared VPC together.

B. Configure SSH port forwarding on each application to provide connectivity between applications in the different Shared VPCs.

C. Migrate the projects from the acquired company into your company's Google Cloud organization. Relaunch the instances in your company's Shared VPC.

D. Set up a Cloud VPN gateway in each Shared VPC and peer the Cloud VPNs.
Suggested answer: D
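
A deliberately abbreviated sketch of answer D with the google-cloud-compute client: it creates only the HA VPN gateway on one side, and all names and the region are placeholders. A complete setup also needs a gateway in the other Shared VPC, Cloud Routers, VPN tunnels, and BGP sessions that advertise only the non-overlapping application subnets:

```python
from google.cloud import compute_v1

client = compute_v1.VpnGatewaysClient()

gateway = compute_v1.VpnGateway(
    name="shared-vpc-a-gw",  # placeholder name
    network="projects/host-project-a/global/networks/shared-vpc-a",
)

client.insert(
    project="host-project-a",  # placeholder host project
    region="us-central1",      # placeholder region
    vpn_gateway_resource=gateway,
)
```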

Your operations team currently stores 10 TB of data in an object storage service from a third-party provider. They want to move this data to a Cloud Storage bucket as quickly as possible, following Google-recommended practices. They want to minimize the cost of this data migration. Which approach should they use?

A. Use the gsutil mv command to move the data.

B. Use the Storage Transfer Service to move the data.

C. Download the data to a Transfer Appliance, and ship it to Google.

D. Download the data to the on-premises data center, and upload it to the Cloud Storage bucket.
Answers
Suggested answer: B

Explanation:

https://cloud.google.com/architecture/migration-to-google-cloud-transferring-your-large-datasets#transfer-options

https://cloud.google.com/storage-transfer-service

Total 285 questions