Google Professional Cloud Architect Practice Test - Questions Answers, Page 27

Your development team has created a mobile game app. You want to test the new mobile app on Android and iOS devices with a variety of configurations. You need to ensure that testing is efficient and cost-effective. What should you do?

A. Upload your mobile app to the Firebase Test Lab, and test the mobile app on Android and iOS devices.
B. Create Android and iOS VMs on Google Cloud, install the mobile app on the VMs, and test the mobile app.
C. Create Android and iOS containers on Google Kubernetes Engine (GKE), install the mobile app on the containers, and test the mobile app.
D. Upload your mobile app with different configurations to Firebase Hosting and test each configuration.
Suggested answer: A

Explanation:

Firebase Test Lab provides cloud-hosted physical and virtual Android and iOS devices in a wide variety of configurations, so the app can be tested efficiently without maintaining a device farm. Compute Engine VMs and GKE containers cannot run Android or iOS images, and Firebase Hosting serves web content rather than running device tests.

TerramEarth's CTO wants to use the raw data from connected vehicles to help identify approximately when a vehicle in the field will have a catastrophic failure. You want to allow analysts to centrally query the vehicle data. Which architecture should you recommend?

(The answer options for this question are architecture diagrams that were not captured in this text export.)
Suggested answer: A

Explanation:

The push endpoint can be a load balancer. A container cluster can be used.

Cloud Pub/Sub for Stream Analytics

References:

https://cloud.google.com/pubsub/

https://cloud.google.com/solutions/iot/

https://cloud.google.com/solutions/designing-connected-vehicle-platform

https://cloud.google.com/solutions/designing-connected-vehicle-platform#data_ingestion

http://www.eweek.com/big-data-and-analytics/google-touts-value-of-cloud-iot-core-for-analyzing-connected-car-data
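As an illustration of the ingestion path the references describe, a vehicle-side record can be serialized into a compact fixed-size payload before being published to Cloud Pub/Sub. The field layout below is a hypothetical example, not TerramEarth's actual schema.

```python
# Hypothetical fixed-size telemetry record a vehicle might publish to a
# Cloud Pub/Sub topic (the field layout is an assumption for illustration).
import struct

# Big-endian: vehicle_id (8 bytes), timestamp (8), odometer_m (4), engine_temp_c (4)
RECORD = struct.Struct(">QQIf")

def encode_record(vehicle_id: int, ts: int, odometer_m: int, engine_temp_c: float) -> bytes:
    """Pack one telemetry sample into a compact binary payload."""
    return RECORD.pack(vehicle_id, ts, odometer_m, engine_temp_c)

payload = encode_record(42, 1_700_000_000, 160_934, 89.5)
# With the google-cloud-pubsub client this payload would then be sent with
# publisher.publish(topic_path, payload); the network call is omitted here.

vid, ts, odo, temp = RECORD.unpack(payload)  # round-trip for verification
```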

The TerramEarth development team wants to create an API to meet the company's business requirements. You want the development team to focus their development effort on business value versus creating a custom framework.

Which method should they use?

A. Use Google App Engine with Google Cloud Endpoints. Focus on an API for dealers and partners
B. Use Google App Engine with a JAX-RS Jersey Java-based framework. Focus on an API for the public
C. Use Google App Engine with the Swagger (Open API Specification) framework. Focus on an API for the public
D. Use Google Container Engine with a Django Python container. Focus on an API for the public
E. Use Google Container Engine with a Tomcat container with the Swagger (Open API Specification) framework. Focus on an API for dealers and partners
Suggested answer: A

Explanation:

Develop, deploy, protect and monitor your APIs with Google Cloud Endpoints. Using an Open API Specification or one of our API frameworks, Cloud Endpoints gives you the tools you need for every phase of API development.

From scenario:

Business Requirements

Decrease unplanned vehicle downtime to less than 1 week, without increasing the cost of carrying surplus inventory

Support the dealer network with more data on how their customers use their equipment to better position new products and services.

Have the ability to partner with different companies - especially with seed and fertilizer suppliers in the fast-growing agricultural business - to create compelling joint offerings for their customers.

Reference: https://cloud.google.com/certification/guides/cloud-architect/casestudy-terramearth
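To make the Cloud Endpoints approach concrete, here is a minimal sketch of an OpenAPI 2.0 definition that Cloud Endpoints can serve from App Engine. The service host, project name, and path are hypothetical placeholders, not part of the case study.

```yaml
# Minimal OpenAPI 2.0 spec for Cloud Endpoints (hypothetical service and paths).
swagger: "2.0"
info:
  title: TerramEarth Dealer API
  version: "1.0.0"
# Cloud Endpoints identifies the service by the host field.
host: "dealer-api.endpoints.example-project.cloud.goog"
schemes:
  - https
paths:
  /vehicles/{vehicleId}/events:
    get:
      summary: List telemetry events for a vehicle (hypothetical endpoint).
      operationId: listVehicleEvents
      parameters:
        - name: vehicleId
          in: path
          required: true
          type: string
      responses:
        "200":
          description: A list of vehicle events.
```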

Your development team has created a structured API to retrieve vehicle data. They want to allow third parties to develop tools for dealerships that use this vehicle event data. You want to support delegated authorization against this data.

What should you do?

A. Build or leverage an OAuth-compatible access control system
B. Build SAML 2.0 SSO compatibility into your authentication system
C. Restrict data access based on the source IP address of the partner systems
D. Create secondary credentials for each dealer that can be given to the trusted third party
Suggested answer: A

Explanation:

Delegate application authorization with OAuth2

Cloud Platform APIs support OAuth 2.0, and scopes provide granular authorization over the methods that are supported. Cloud Platform supports both service-account and user-account OAuth, also called three-legged OAuth.

References: https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations#delegate_application_authorization_with_oauth2 https://cloud.google.com/appengine/docs/flexible/go/authorizing-apps
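The first leg of such a three-legged OAuth 2.0 flow is the authorization URL a third-party tool sends a dealer to. A minimal stdlib-only sketch, where the client ID, redirect URI, and scope are hypothetical placeholders:

```python
# Sketch of building a Google OAuth 2.0 authorization URL for delegated
# access. The client_id, redirect URI, and scope are hypothetical.
from urllib.parse import urlencode

def build_auth_url(client_id: str, redirect_uri: str, scope: str, state: str) -> str:
    """Return an authorization URL for the authorization-code grant."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",   # authorization-code grant
        "scope": scope,            # limits what the third party may call
        "state": state,            # CSRF protection token
        "access_type": "offline",  # also request a refresh token
    }
    return "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)

url = build_auth_url(
    client_id="example-client-id",
    redirect_uri="https://dealer-tool.example.com/oauth/callback",
    scope="https://www.googleapis.com/auth/userinfo.email",
    state="xyz123",
)
```

After the dealer consents, the tool exchanges the returned code for an access token scoped only to the methods the scope permits, which is the delegation the question asks for.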

TerramEarth plans to connect all 20 million vehicles in the field to the cloud. This increases the volume to 20 million 600-byte records per second, roughly 40 TB per hour.

How should you design the data ingestion?

A. Vehicles write data directly to GCS
B. Vehicles write data directly to Google Cloud Pub/Sub
C. Vehicles stream data directly to Google BigQuery
D. Vehicles continue to write data using the existing system (FTP)
Suggested answer: C
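The volume stated in the question can be sanity-checked with quick arithmetic:

```python
# 20 million vehicles each sending one 600-byte record per second.
records_per_sec = 20_000_000
record_bytes = 600

bytes_per_sec = records_per_sec * record_bytes  # 12 GB/s
bytes_per_hour = bytes_per_sec * 3600
tb_per_hour = bytes_per_hour / 1e12             # decimal terabytes

print(bytes_per_sec)  # 12000000000
print(tb_per_hour)    # 43.2 -- roughly the "40 TB an hour" in the question
```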

You analyzed TerramEarth's business requirement to reduce downtime, and found that they can achieve a majority of time saving by reducing customer's wait time for parts. You decided to focus on reduction of the 3 weeks aggregate reporting time.

Which modifications to the company's processes should you recommend?

A. Migrate from CSV to binary format, migrate from FTP to SFTP transport, and develop machine learning analysis of metrics
B. Migrate from FTP to streaming transport, migrate from CSV to binary format, and develop machine learning analysis of metrics
C. Increase fleet cellular connectivity to 80%, migrate from FTP to streaming transport, and develop machine learning analysis of metrics
D. Migrate from FTP to SFTP transport, develop machine learning analysis of metrics, and increase dealer local inventory by a fixed factor
Suggested answer: C

Explanation:

The Avro binary format is the preferred format for loading compressed data. Avro data is faster to load because the data can be read in parallel, even when the data blocks are compressed.

Cloud Storage supports streaming transfers with the gsutil tool or boto library, based on HTTP chunked transfer encoding. Streaming data lets you stream data to and from your Cloud Storage account as soon as it becomes available without requiring that the data be first saved to a separate file. Streaming transfers are useful if you have a process that generates data and you do not want to buffer it locally before uploading it, or if you want to send the result from a computational pipeline directly into Cloud Storage.

References:

https://cloud.google.com/storage/docs/streaming

https://cloud.google.com/bigquery/docs/loading-data

Which of TerramEarth's legacy enterprise processes will experience significant change as a result of increased Google Cloud Platform adoption?

A. Opex/capex allocation, LAN changes, capacity planning
B. Capacity planning, TCO calculations, opex/capex allocation
C. Capacity planning, utilization measurement, data center expansion
D. Data Center expansion, TCO calculations, utilization measurement
Suggested answer: B

To speed up data retrieval, more vehicles will be upgraded to cellular connections and be able to transmit data to the ETL process. The current FTP process is error-prone and restarts the data transfer from the start of the file when connections fail, which happens often. You want to improve the reliability of the solution and minimize data transfer time on the cellular connections.

What should you do?

A. Use one Google Container Engine cluster of FTP servers. Save the data to a Multi-Regional bucket. Run the ETL process using data in the bucket
B. Use multiple Google Container Engine clusters running FTP servers located in different regions. Save the data to Multi-Regional buckets in US, EU, and Asia. Run the ETL process using the data in the bucket
C. Directly transfer the files to different Google Cloud Multi-Regional Storage bucket locations in US, EU, and Asia using Google APIs over HTTP(S). Run the ETL process using the data in the bucket
D. Directly transfer the files to a different Google Cloud Regional Storage bucket location in US, EU, and Asia using Google APIs over HTTP(S). Run the ETL process to retrieve the data from each Regional bucket
Suggested answer: D

TerramEarth's 20 million vehicles are scattered around the world. Based on the vehicle's location, its telemetry data is stored in a Google Cloud Storage (GCS) regional bucket (US, Europe, or Asia). The CTO has asked you to run a report on the raw telemetry data to determine why vehicles are breaking down after 100 K miles. You want to run this job on all the data.

What is the most cost-effective way to run this job?

A. Move all the data into 1 zone, then launch a Cloud Dataproc cluster to run the job
B. Move all the data into 1 region, then launch a Google Cloud Dataproc cluster to run the job
C. Launch a cluster in each region to preprocess and compress the raw data, then move the data into a multi-region bucket and use a Dataproc cluster to finish the job
D. Launch a cluster in each region to preprocess and compress the raw data, then move the data into a region bucket and use a Cloud Dataproc cluster to finish the job
Suggested answer: D

TerramEarth has equipped all connected trucks with servers and sensors to collect telemetry data. Next year they want to use the data to train machine learning models. They want to store this data in the cloud while reducing costs.

What should they do?

A. Have the vehicle's computer compress the data in hourly snapshots, and store it in a Google Cloud Storage (GCS) Nearline bucket
B. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Google BigQuery
C. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and store it in Cloud Bigtable
D. Have the vehicle's computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket
Suggested answer: D

Explanation:

Coldline Storage is the best choice for data that you plan to access at most once a year, due to its slightly lower availability, 90-day minimum storage duration, costs for data access, and higher per-operation costs. For example:

Cold Data Storage - Infrequently accessed data, such as data stored for legal or regulatory reasons, can be stored at low cost as Coldline Storage, and be available when you need it.

Disaster recovery - In the event of a disaster recovery event, recovery time is key. Cloud Storage provides low latency access to data stored as Coldline Storage. References: https://cloud.google.com/storage/docs/storage-classes
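The on-vehicle step in option D can be sketched with stdlib gzip: compress an hourly telemetry snapshot before uploading it to the Coldline bucket. The record format below is a hypothetical placeholder.

```python
# Sketch: compress an hourly telemetry snapshot on the vehicle before upload.
# The record format is a hypothetical placeholder for illustration.
import gzip

# One record per second for an hour of (made-up) sensor readings.
hourly_snapshot = b"\n".join(
    b"ts=%d;engine_temp=90;fuel=70" % ts for ts in range(3600)
)

compressed = gzip.compress(hourly_snapshot, compresslevel=9)

# Repetitive telemetry compresses well, cutting storage cost further.
ratio = len(compressed) / len(hourly_snapshot)
assert gzip.decompress(compressed) == hourly_snapshot  # lossless round trip
```

Compressing before transmission reduces both the cellular transfer time and the per-gigabyte storage billed at the Coldline rate.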
