Google Professional Cloud Developer Practice Test - Questions Answers, Page 18

You are a developer working on an internal application for payroll processing. You are building a component of the application that allows an employee to submit a timesheet, which then initiates several steps:

* An email is sent to the employee and manager, notifying them that the timesheet was submitted.

* A timesheet is sent for payroll processing via the vendor's API.

* A timesheet is sent to the data warehouse for headcount planning.

These steps are not dependent on each other and can be completed in any order. New steps are being considered and will be implemented by different development teams. Each development team will implement the error handling specific to their step. What should you do?

A. Deploy a Cloud Function for each step that calls the corresponding downstream system to complete the required action.

B. Create a Pub/Sub topic for each step. Create a subscription for each downstream development team to subscribe to their step's topic.

C. Create a Pub/Sub topic for timesheet submissions. Create a subscription for each downstream development team to subscribe to the topic.

D. Create a timesheet microservice deployed to Google Kubernetes Engine. The microservice calls each downstream step and waits for a successful response before calling the next step.
Suggested answer: C
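
Example (illustrative): a minimal Python sketch of the fan-out pattern behind answer C, using the google-cloud-pubsub client library. The project, topic, and subscription names are hypothetical placeholders.

from google.cloud import pubsub_v1

project_id = "my-project"
publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

# One topic for all timesheet submissions.
topic_path = publisher.topic_path(project_id, "timesheet-submissions")
publisher.create_topic(request={"name": topic_path})

# Each downstream team attaches its own subscription to the same topic, so every
# team receives every submission and implements its own error handling.
for team in ("email-notifications", "payroll-vendor", "data-warehouse"):
    subscription_path = subscriber.subscription_path(project_id, f"{team}-sub")
    subscriber.create_subscription(
        request={"name": subscription_path, "topic": topic_path}
    )

# The timesheet component publishes once and does not need to know how many
# downstream steps exist, so new steps can be added without changing it.
future = publisher.publish(topic_path, data=b'{"employee": "jdoe", "hours": 8}')
print(future.result())  # message ID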

You are designing an application that uses a microservices architecture. You are planning to deploy the application in the cloud and on-premises. You want to make sure the application can scale up on demand and also use managed services as much as possible. What should you do?

A. Deploy open source Istio in a multi-cluster deployment on multiple Google Kubernetes Engine (GKE) clusters managed by Anthos.

B. Create a GKE cluster in each environment with Anthos, and use Cloud Run for Anthos to deploy your application to each cluster.

C. Install a GKE cluster in each environment with Anthos, and use Cloud Build to create a Deployment for your application in each cluster.

D. Create a GKE cluster in the cloud and install open source Kubernetes on-premises. Use an external load balancer service to distribute traffic across the two environments.
Suggested answer: B

Explanation:

https://cloud.google.com/anthos/run

Integrated with Anthos, Cloud Run for Anthos provides a flexible serverless development platform for hybrid and multicloud environments. Cloud Run for Anthos is Google's managed and fully supported Knative offering, an open source project that enables serverless workloads on Kubernetes.

You want to migrate an on-premises container running in Knative to Google Cloud. You need to make sure that the migration doesn't affect your application's deployment strategy, and you want to use a fully managed service. Which Google Cloud service should you use to deploy your container?

A. Cloud Run

B. Compute Engine

C. Google Kubernetes Engine

D. App Engine flexible environment
Suggested answer: A

Explanation:

https://cloud.google.com/blog/products/serverless/knative-based-cloud-run-services-are-ga

This architectural diagram depicts a system that streams data from thousands of devices. You want to ingest data into a pipeline, store the data, and analyze the data using SQL statements. Which Google Cloud services should you use for steps 1, 2, 3, and 4?

A. 1) App Engine 2) Pub/Sub 3) BigQuery 4) Firestore

B. 1) Dataflow 2) Pub/Sub 3) Firestore 4) BigQuery

C. 1) Pub/Sub 2) Dataflow 3) BigQuery 4) Firestore

D. 1) Pub/Sub 2) Dataflow 3) Firestore 4) BigQuery
Suggested answer: D
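
Example (illustrative): a minimal Apache Beam (Python) sketch of the ingest-and-analyze path behind answer D: messages are ingested from Pub/Sub (step 1), processed by a streaming pipeline that would run on Dataflow (step 2), and written to BigQuery for SQL analysis (step 4). The Firestore storage step is omitted for brevity, and the topic, table, and schema names are hypothetical.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        # 1) Ingest device events from a Pub/Sub topic.
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/device-events")
        # 2) Transform each message; with the DataflowRunner this runs on Dataflow.
        | "Parse" >> beam.Map(json.loads)
        # 4) Write rows to BigQuery so they can be analyzed with SQL.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:iot.device_events",
            schema="device_id:STRING,temperature:FLOAT,event_time:TIMESTAMP",
        )
    )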

You are developing an application that consists of several microservices running in a Google Kubernetes Engine cluster. One microservice needs to connect to a third-party database running on-premises. You need to store credentials to the database and ensure that these credentials can be rotated while following security best practices. What should you do?

A. Store the credentials in a sidecar container proxy, and use it to connect to the third-party database.

B. Configure a service mesh to allow or restrict traffic from the Pods in your microservice to the database.

C. Store the credentials in an encrypted volume mount, and associate a Persistent Volume Claim with the client Pod.

D. Store the credentials as a Kubernetes Secret, and use the Cloud Key Management Service plugin to handle encryption and decryption.
Suggested answer: D

Explanation:

https://cloud.google.com/kubernetes-engine/docs/how-to/encrypting-secrets

By default, Google Kubernetes Engine (GKE) encrypts customer content stored at rest, including Secrets. GKE handles and manages this default encryption for you without any additional action on your part.

Application-layer secrets encryption provides an additional layer of security for sensitive data, such as Secrets, stored in etcd. Using this functionality, you can use a key managed with Cloud KMS to encrypt data at the application layer. This encryption protects against attackers who gain access to an offline copy of etcd.
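
Example (illustrative): a minimal Python sketch of the application side of answer D, assuming the credentials are mounted into the Pod as files from the Kubernetes Secret (the mount path and file names are hypothetical). Re-reading the files at connection time lets the application pick up a rotated Secret without restarting.

from pathlib import Path

SECRET_DIR = Path("/etc/db-credentials")  # volume mount backed by the Secret

def load_db_credentials():
    # Kubernetes updates the mounted files when the Secret is rotated, so
    # reading them on each connection attempt picks up new credentials.
    username = (SECRET_DIR / "username").read_text().strip()
    password = (SECRET_DIR / "password").read_text().strip()
    return username, password

username, password = load_db_credentials()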

You are deploying a microservices application to Google Kubernetes Engine (GKE). The application will receive daily updates. You expect to deploy a large number of distinct containers that will run on the Linux operating system (OS). You want to be alerted to any known OS vulnerabilities in the new containers. You want to follow Google-recommended best practices. What should you do?

A. Use the gcloud CLI to call Container Analysis to scan new container images. Review the vulnerability results before each deployment.

B. Enable Container Analysis, and upload new container images to Artifact Registry. Review the vulnerability results before each deployment.

C. Enable Container Analysis, and upload new container images to Artifact Registry. Review the critical vulnerability results before each deployment.

D. Use the Container Analysis REST API to call Container Analysis to scan new container images. Review the vulnerability results before each deployment.
Suggested answer: D

Explanation:

https://cloud.google.com/container-analysis/docs/automated-scanning-howto

https://cloud.google.com/container-analysis/docs/os-overview says: The Container Scanning API allows you to automate OS vulnerability detection, scanning each time you push an image to Container Registry or Artifact Registry. Enabling this API also triggers language package scans for Go and Java vulnerabilities (Preview).
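
Example (illustrative): a minimal Python sketch of reviewing scan results programmatically through the Container Analysis API, as in answer D. The project ID and image URL are hypothetical, and the filter string follows the Container Analysis vulnerability-occurrence samples.

from google.cloud.devtools import containeranalysis_v1

def list_image_vulnerabilities(project_id, resource_url):
    client = containeranalysis_v1.ContainerAnalysisClient()
    grafeas_client = client.get_grafeas_client()
    occurrence_filter = f'kind = "VULNERABILITY" AND resourceUrl = "{resource_url}"'
    return list(
        grafeas_client.list_occurrences(
            request={"parent": f"projects/{project_id}", "filter": occurrence_filter}
        )
    )

# Review the results for a newly pushed image before deploying it.
for occurrence in list_image_vulnerabilities(
    "my-project",
    "https://us-docker.pkg.dev/my-project/my-repo/my-image@sha256:abc123",
):
    print(occurrence.vulnerability.severity, occurrence.note_name)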

You are a developer at a large organization. You have an application written in Go running in a production Google Kubernetes Engine (GKE) cluster. You need to add a new feature that requires access to BigQuery. You want to grant BigQuery access to your GKE cluster following Google-recommended best practices. What should you do?

A. Create a Google service account with BigQuery access. Add the JSON key to Secret Manager, and use the Go client library to access the JSON key.

B. Create a Google service account with BigQuery access. Add the Google service account JSON key as a Kubernetes secret, and configure the application to use this secret.

C. Create a Google service account with BigQuery access. Add the Google service account JSON key to Secret Manager, and use an init container to access the secret for the application to use.

D. Create a Google service account and a Kubernetes service account. Configure Workload Identity on the GKE cluster, and reference the Kubernetes service account on the application Deployment.
Suggested answer: D

Explanation:

https://cloud.google.com/kubernetes-engine/docs/concepts/workload-identity#what_is

Applications running on GKE might need access to Google Cloud APIs such as Compute Engine API, BigQuery Storage API, or Machine Learning APIs.

Workload Identity allows a Kubernetes service account in your GKE cluster to act as an IAM service account. Pods that use the configured Kubernetes service account automatically authenticate as the IAM service account when accessing Google Cloud APIs. Using Workload Identity allows you to assign distinct, fine-grained identities and authorization for each application in your cluster.
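
Example (illustrative): a minimal sketch of what application code looks like under answer D. With Workload Identity configured, the client library obtains credentials for the mapped IAM service account automatically, so no JSON key is created, mounted, or read. The question's application is written in Go and the same pattern applies there; Python is used here for brevity, and the query is a placeholder.

from google.cloud import bigquery

# No key file: credentials come from the Kubernetes service account that
# Workload Identity maps to the IAM service account with BigQuery access.
client = bigquery.Client()
query_job = client.query("SELECT CURRENT_DATE() AS today")
for row in query_job.result():
    print(row.today)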

You have an application written in Python running in production on Cloud Run. Your application needs to read/write data stored in a Cloud Storage bucket in the same project. You want to grant access to your application following the principle of least privilege. What should you do?

A. Create a user-managed service account with a custom Identity and Access Management (IAM) role.

B. Create a user-managed service account with the Storage Admin Identity and Access Management (IAM) role.

C. Create a user-managed service account with the Project Editor Identity and Access Management (IAM) role.

D. Use the default service account linked to the Cloud Run revision in production.
Suggested answer: A

Explanation:

https://cloud.google.com/iam/docs/understanding-roles#storage.admin
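
Example (illustrative): a minimal Python sketch of the application side of answer A. The Cloud Run revision runs as a user-managed service account that holds only a narrowly scoped custom role on the bucket, and the Storage client simply uses those attached credentials. Bucket and object names are hypothetical.

from google.cloud import storage

client = storage.Client()  # uses the service account attached to the Cloud Run revision
bucket = client.bucket("my-app-data")

# The custom role only needs object-level permissions on this bucket, for
# example storage.objects.create and storage.objects.get.
blob = bucket.blob("reports/latest.json")
blob.upload_from_string('{"status": "ok"}', content_type="application/json")
print(blob.download_as_text())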

Your team is developing unit tests for Cloud Function code. The code is stored in a Cloud Source Repositories repository. You are responsible for implementing the tests. Only a specific service account has the necessary permissions to deploy the code to Cloud Functions. You want to ensure that the code cannot be deployed without first passing the tests. How should you configure the unit testing process?

A. Configure Cloud Build to deploy the Cloud Function. If the code passes the tests, a deployment approval is sent to you.

B. Configure Cloud Build to deploy the Cloud Function, using the specific service account as the build agent. Run the unit tests after successful deployment.

C. Configure Cloud Build to run the unit tests. If the code passes the tests, the developer deploys the Cloud Function.

D. Configure Cloud Build to run the unit tests, using the specific service account as the build agent. If the code passes the tests, Cloud Build deploys the Cloud Function.
Suggested answer: D
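
Example (illustrative): a minimal, self-contained pytest sketch of the kind of unit test the Cloud Build test step in answer D could run before Cloud Build deploys the function. The handler below stands in for the real Cloud Function code; all names and payload fields are hypothetical.

def submit_timesheet(request_json):
    # Hypothetical handler logic: validate the payload and acknowledge it.
    if "employee" not in request_json or "hours" not in request_json:
        return "Bad Request", 400
    return "OK", 200

def test_submit_timesheet_accepts_valid_payload():
    assert submit_timesheet({"employee": "jdoe", "hours": 8}) == ("OK", 200)

def test_submit_timesheet_rejects_missing_fields():
    assert submit_timesheet({"employee": "jdoe"}) == ("Bad Request", 400)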

Your team detected a spike of errors in an application running on Cloud Run in your production project. The application is configured to read messages from Pub/Sub topic A, process the messages, and write the messages to topic B. You want to conduct tests to identify the cause of the errors. You can use a set of mock messages for testing. What should you do?

A. Deploy the Pub/Sub and Cloud Run emulators on your local machine. Deploy the application locally, and change the logging level in the application to DEBUG or INFO. Write mock messages to topic A, and then analyze the logs.

B. Use the gcloud CLI to write mock messages to topic A. Change the logging level in the application to DEBUG or INFO, and then analyze the logs.

C. Deploy the Pub/Sub emulator on your local machine. Point the production application to your local Pub/Sub topics. Write mock messages to topic A, and then analyze the logs.

D. Use the Google Cloud console to write mock messages to topic A. Change the logging level in the application to DEBUG or INFO, and then analyze the logs.
Suggested answer: B
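
Example (illustrative): a minimal Python sketch of the logging side of answer B. Raising the level to DEBUG makes the message-processing path emit detailed entries that can then be analyzed in Cloud Logging; the logger name and processing logic are hypothetical. The mock messages themselves would be written to topic A with gcloud pubsub topics publish, as the answer describes.

import logging

logging.basicConfig(level=logging.DEBUG)  # raise the level from INFO/WARNING to DEBUG
logger = logging.getLogger("message-processor")

def handle_message(message_data):
    logger.debug("Received raw message from topic A: %r", message_data)
    # ... process the message and publish the result to topic B ...
    logger.info("Finished processing message")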