Google Professional Cloud Architect Practice Test - Questions Answers, Page 28

Your agricultural division is experimenting with fully autonomous vehicles. You want your architecture to promote strong security during vehicle operation. Which two architectures should you consider? (Choose two.)

A. Treat every microservice call between modules on the vehicle as untrusted.
B. Require IPv6 for connectivity to ensure a secure address space.
C. Use a trusted platform module (TPM) and verify firmware and binaries on boot.
D. Use a functional programming language to isolate code execution cycles.
E. Use multiple connectivity subsystems for redundancy.
F. Enclose the vehicle's drive electronics in a Faraday cage to isolate chips.
Suggested answer: A, C
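
Option C describes a verified-boot pattern: each firmware image ships with a vendor signature, and the boot process refuses to run anything that fails verification. Below is a minimal sketch of that check using Python's cryptography library; the key and file paths are hypothetical, and on real hardware the root-of-trust key would be sealed inside the TPM rather than read from disk.

```python
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# Hypothetical path; on a real vehicle the vendor's public key would be
# anchored in the TPM, not loaded from the filesystem.
with open("vendor_public_key.pem", "rb") as f:
    public_key = serialization.load_pem_public_key(f.read())

def verify_firmware(image_path: str, signature_path: str) -> None:
    """Raises cryptography.exceptions.InvalidSignature if the image was tampered with."""
    with open(image_path, "rb") as img, open(signature_path, "rb") as sig:
        image, signature = img.read(), sig.read()
    # Verify an RSA/SHA-256 signature over the raw firmware bytes.
    public_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())

verify_firmware("firmware.bin", "firmware.bin.sig")  # boot continues only if this passes
```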

Operational parameters such as oil pressure are adjustable on each of TerramEarth's vehicles to increase their efficiency, depending on their environmental conditions. Your primary goal is to increase the operating efficiency of all 20 million cellular and unconnected vehicles in the field.

How can you accomplish this goal?

A. Have your engineers inspect the data for patterns, and then create an algorithm with rules that make operational adjustments automatically.
B. Capture all operating data, train machine learning models that identify ideal operations, and run the models locally to make operational adjustments automatically.
C. Implement a Google Cloud Dataflow streaming job with a sliding window, and use Google Cloud Messaging (GCM) to make operational adjustments automatically.
D. Capture all operating data, train machine learning models that identify ideal operations, and host them on Google Cloud Machine Learning (ML) Platform to make operational adjustments automatically.
Suggested answer: B
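
Answer B separates where the model is trained from where it runs: training happens centrally on the captured fleet data, and the resulting model is shipped to every vehicle, including unconnected ones, for offline inference. A hedged sketch of the export step using TensorFlow Lite; the model directory and file names are hypothetical.

```python
import tensorflow as tf

# Convert a centrally trained model into a compact format that can run
# on the vehicle's embedded hardware without connectivity.
converter = tf.lite.TFLiteConverter.from_saved_model("ideal_ops_model/")  # hypothetical path
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize to shrink the binary
tflite_model = converter.convert()

with open("ideal_ops.tflite", "wb") as f:
    f.write(tflite_model)
# The .tflite file is distributed to vehicles, which run inference locally
# and make operational adjustments automatically.
```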

For this question, refer to the TerramEarth case study. To be compliant with European GDPR regulation, TerramEarth is required to delete data generated from its European customers after a period of 36 months when it contains personal data. In the new architecture, this data will be stored in both Cloud Storage and BigQuery. What should you do?

A. Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.
B. Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.
C. Create a BigQuery time-partitioned table for the European data, and set the partition expiration period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.
D. Create a BigQuery time-partitioned table for the European data, and set the partition period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.
Suggested answer: C
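
A sketch of answer C using the BigQuery and Cloud Storage Python client libraries instead of the Console and gsutil; the project, dataset, table, and bucket names are hypothetical, and 36 months is approximated in days.

```python
from google.cloud import bigquery, storage

THIRTY_SIX_MONTHS_DAYS = 36 * 30  # rough approximation of 36 months

# BigQuery: time-partitioned table whose partitions expire after ~36 months.
bq = bigquery.Client()
table = bigquery.Table(
    "my-project.eu_dataset.vehicle_telemetry",  # hypothetical table
    schema=[bigquery.SchemaField("vehicle_id", "STRING")],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    expiration_ms=THIRTY_SIX_MONTHS_DAYS * 24 * 60 * 60 * 1000,
)
bq.create_table(table)

# Cloud Storage: lifecycle DELETE rule with an Age condition of ~36 months.
gcs = storage.Client()
bucket = gcs.get_bucket("terramearth-eu-data")  # hypothetical bucket
bucket.add_lifecycle_delete_rule(age=THIRTY_SIX_MONTHS_DAYS)
bucket.patch()
```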

For this question, refer to the TerramEarth case study. TerramEarth has decided to store data files in Cloud Storage. You need to configure a Cloud Storage lifecycle rule to store 1 year of data and minimize file storage cost. Which two actions should you take?

A. Create a Cloud Storage lifecycle rule with Age: "30", Storage Class: "Standard", and Action: "Set to Coldline", and create a second GCS lifecycle rule with Age: "365", Storage Class: "Coldline", and Action: "Delete".
B. Create a Cloud Storage lifecycle rule with Age: "30", Storage Class: "Coldline", and Action: "Set to Nearline", and create a second GCS lifecycle rule with Age: "91", Storage Class: "Coldline", and Action: "Set to Nearline".
C. Create a Cloud Storage lifecycle rule with Age: "90", Storage Class: "Standard", and Action: "Set to Nearline", and create a second GCS lifecycle rule with Age: "91", Storage Class: "Nearline", and Action: "Set to Coldline".
D. Create a Cloud Storage lifecycle rule with Age: "30", Storage Class: "Standard", and Action: "Set to Coldline", and create a second GCS lifecycle rule with Age: "365", Storage Class: "Nearline", and Action: "Delete".
Suggested answer: A
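
The two rules from answer A, expressed with the google-cloud-storage client library as an alternative to a gsutil lifecycle JSON; the bucket name is hypothetical.

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("terramearth-data-files")  # hypothetical bucket

# Rule 1: after 30 days, move Standard objects to the cheaper Coldline class.
bucket.add_lifecycle_set_storage_class_rule(
    "COLDLINE", age=30, matches_storage_class=["STANDARD"]
)
# Rule 2: after 365 days, delete objects so only one year of data is kept.
bucket.add_lifecycle_delete_rule(age=365)
bucket.patch()
```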

For this question, refer to the TerramEarth case study. You need to implement a reliable, scalable GCP solution for the data warehouse for your company, TerramEarth. Considering the TerramEarth business and technical requirements, what should you do?

A. Replace the existing data warehouse with BigQuery. Use table partitioning.
B. Replace the existing data warehouse with a Compute Engine instance with 96 CPUs.
C. Replace the existing data warehouse with BigQuery. Use federated data sources.
D. Replace the existing data warehouse with a Compute Engine instance with 96 CPUs. Add an additional Compute Engine preemptible instance with 32 CPUs.
Suggested answer: A
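
Partitioning pays off at query time: filtering on the partition column limits the bytes BigQuery scans. A small sketch against a hypothetical ingestion-time-partitioned table.

```python
from google.cloud import bigquery

bq = bigquery.Client()
sql = """
SELECT vehicle_id, AVG(oil_pressure) AS avg_oil_pressure
FROM `my-project.dw.vehicle_telemetry`  -- hypothetical table
WHERE _PARTITIONTIME BETWEEN TIMESTAMP('2024-01-01') AND TIMESTAMP('2024-01-31')
GROUP BY vehicle_id
"""
# Only the January 2024 partitions are scanned, which cuts both cost and latency.
for row in bq.query(sql).result():
    print(row.vehicle_id, row.avg_oil_pressure)
```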

For this question, refer to the TerramEarth case study. A new architecture that writes all incoming data to BigQuery has been introduced. You notice that the data is dirty, and want to ensure data quality on an automated daily basis while managing cost.

What should you do?

A. Set up a streaming Cloud Dataflow job that receives data from the ingestion process, and clean the data in the Cloud Dataflow pipeline.
B. Create a Cloud Function that reads data from BigQuery and cleans it. Trigger the Cloud Function from a Compute Engine instance.
C. Create a SQL statement on the data in BigQuery, and save it as a view. Run the view daily, and save the result to a new table.
D. Use Cloud Dataprep and configure the BigQuery tables as the source. Schedule a daily job to clean the data.
Suggested answer: D

For this question, refer to the TerramEarth case study. Considering the technical requirements, how should you reduce the unplanned vehicle downtime in GCP?

A. Use BigQuery as the data warehouse. Connect all vehicles to the network and stream data into BigQuery using Cloud Pub/Sub and Cloud Dataflow. Use Google Data Studio for analysis and reporting.
B. Use BigQuery as the data warehouse. Connect all vehicles to the network and upload gzip files to a Multi-Regional Cloud Storage bucket using gcloud. Use Google Data Studio for analysis and reporting.
C. Use Cloud Dataproc Hive as the data warehouse. Upload gzip files to a Multi-Regional Cloud Storage bucket. Upload this data into BigQuery using gcloud. Use Google Data Studio for analysis and reporting.
D. Use Cloud Dataproc Hive as the data warehouse. Directly stream data into partitioned Hive tables. Use Pig scripts to analyze data.
Suggested answer: A
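
On the ingestion side of answer A, each connected vehicle (or a gateway in front of it) publishes telemetry to Cloud Pub/Sub, and a streaming Cloud Dataflow job subscribed to the topic writes rows into BigQuery. A minimal publisher sketch; the project, topic, and message fields are hypothetical.

```python
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "vehicle-telemetry")  # hypothetical

# One telemetry reading; a Dataflow streaming job subscribed to this topic
# would parse the JSON and stream the row into BigQuery.
payload = json.dumps({"vehicle_id": "TE-1234", "oil_pressure": 42.5}).encode("utf-8")
future = publisher.publish(topic_path, payload)
print("published message", future.result())  # blocks until the server acks
```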

For this question, refer to the TerramEarth case study. You are asked to design a new architecture for ingesting the data from the 200,000 vehicles that are connected to a cellular network. You want to follow Google-recommended practices.

Considering the technical requirements, which components should you use to ingest the data?

A. Google Kubernetes Engine with an SSL Ingress
B. Cloud IoT Core with public/private key pairs
C. Compute Engine with project-wide SSH keys
D. Compute Engine with specific SSH keys
Suggested answer: B
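
With Cloud IoT Core, each device keeps a private key and authenticates by signing a short-lived JWT with it; the matching public key is registered with the device in the registry. A device-side sketch using PyJWT, assuming an RS256 key pair; the project ID and key path are hypothetical.

```python
import datetime
import jwt  # PyJWT

project_id = "my-project"  # hypothetical; IoT Core JWTs use the project ID as audience
with open("rsa_private.pem", "r") as f:
    private_key = f.read()

now = datetime.datetime.now(datetime.timezone.utc)
token = jwt.encode(
    {"iat": now, "exp": now + datetime.timedelta(minutes=60), "aud": project_id},
    private_key,
    algorithm="RS256",
)
# The device presents this token as the MQTT password when it connects;
# IoT Core verifies it against the public key registered for the device.
```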

For this question, refer to the TerramEarth case study. You start to build a new application that uses a few Cloud Functions for the backend. One use case requires a Cloud Function func_display to invoke another Cloud Function func_query. You want func_query only to accept invocations from func_display. You also want to follow Google's recommended best practices. What should you do?

A. Create a token and pass it in as an environment variable to func_display. When invoking func_query, include the token in the request. Pass the same token to func_query, and reject the invocation if the tokens are different.
B. Make func_query 'Require authentication.' Create a unique service account, associate it with func_display, and grant the service account the invoker role for func_query. Create an ID token in func_display and include the token in the request when invoking func_query.
C. Make func_query 'Require authentication' and only accept internal traffic. Create both functions in the same VPC. Create an ingress firewall rule for func_query to only allow traffic from func_display.
D. Create both functions in the same project and VPC, and make func_query only accept internal traffic. Create an ingress firewall rule for func_query to only allow traffic from func_display. Also make sure both functions use the same service account.
Suggested answer: B
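
A sketch of the caller side of answer B as it might look inside func_display: mint an ID token whose audience is func_query's URL (the google-auth library obtains it from the function's runtime service account) and attach it as a bearer token. The URL is hypothetical.

```python
import requests
import google.auth.transport.requests
import google.oauth2.id_token

FUNC_QUERY_URL = "https://us-central1-my-project.cloudfunctions.net/func_query"  # hypothetical

def call_func_query(payload: dict) -> dict:
    # Mint an ID token for func_query's URL using func_display's service account.
    auth_req = google.auth.transport.requests.Request()
    token = google.oauth2.id_token.fetch_id_token(auth_req, FUNC_QUERY_URL)
    resp = requests.post(
        FUNC_QUERY_URL,
        json=payload,
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()  # callers without the invoker role are rejected upstream
    return resp.json()
```

Because func_query has 'Require authentication' enabled, the Cloud Functions front end validates the token and checks the invoker role itself, so func_query needs no token-comparison code of its own.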