Question 127 - Professional Cloud DevOps Engineer discussion
Your team is building a service that performs compute-heavy processing on batches of data. The data is processed faster based on the speed and number of CPUs on the machine. These batches of data vary in size and may arrive at any time from multiple third-party sources. You need to ensure that third parties are able to upload their data securely. You want to minimize costs while ensuring that the data is processed as quickly as possible. What should you do?
A.
* Provide a secure file transfer protocol (SFTP) server on a Compute Engine instance so that third parties can upload batches of data, and provide appropriate credentials to the server.
* Create a Cloud Function with a google.storage.object.finalize Cloud Storage trigger. Write code so that the function can scale up a Compute Engine autoscaling managed instance group.
* Use an image pre-loaded with the data processing software that terminates the instances when processing completes.
B.
* Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket.
* Use a standard Google Kubernetes Engine (GKE) cluster and maintain two services: one that processes the batches of data and one that monitors Cloud Storage for new batches of data.
* Stop the processing service when there are no batches of data to process.
C.
* Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket.
* Create a Cloud Function with a google.storage.object.finalize Cloud Storage trigger. Write code so that the function can scale up a Compute Engine autoscaling managed instance group.
* Use an image pre-loaded with the data processing software that terminates the instances when processing completes.
D.
* Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket.
* Use Cloud Monitoring to detect new batches of data in the bucket and trigger a Cloud Function that processes the data.
* Set the Cloud Function to use the largest CPU possible to minimize the runtime of the processing.
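Options A and C both describe a Cloud Function (fired by a google.storage.object.finalize trigger) that scales up a managed instance group when a new batch arrives. A minimal sketch of the sizing decision such a function might make is below; the function names, thresholds, and per-instance capacity are illustrative assumptions, not part of the question, and the actual resize call would go through the Compute Engine API (e.g. the google-cloud-compute client's InstanceGroupManagersClient.resize).

```python
# Hypothetical sizing logic for the Cloud Function in options A/C.
# Assumptions: each worker instance handles `per_instance` batches
# concurrently, and the group is capped at `max_instances`. The
# instances terminate themselves when processing completes, so a
# pending count of zero maps to a target size of zero.

def target_instances(pending_batches: int,
                     per_instance: int = 2,
                     max_instances: int = 10) -> int:
    """Return the managed-instance-group size needed to process
    `pending_batches` batches, capped at `max_instances`."""
    if pending_batches <= 0:
        return 0
    needed = -(-pending_batches // per_instance)  # ceiling division
    return min(needed, max_instances)
```

In the deployed function, this value would be passed as the `size` argument of the managed instance group resize request each time the finalize trigger fires.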