Google Professional Cloud Developer Practice Test - Questions and Answers, Page 20

You are designing an application that consists of several microservices. Each microservice has its own RESTful API and will be deployed as a separate Kubernetes Service. You want to ensure that the consumers of these APIs aren't impacted when there is a change to your API, and also ensure that third-party systems aren't interrupted when new versions of the API are released. How should you configure the connection to the application following Google-recommended best practices?

A. Use an Ingress that uses the API's URL to route requests to the appropriate backend.
B. Leverage a Service Discovery system, and connect to the backend specified by the request.
C. Use multiple clusters, and use DNS entries to route requests to separate versioned backends.
D. Combine multiple versions in the same service, and then specify the API version in the POST request.
Suggested answer: D
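
As an illustration of the approach in the suggested answer, here is a minimal Go sketch of a single Kubernetes Service that serves several API versions and dispatches on a version field carried in the POST request. The endpoint path, field names, and handler bodies are hypothetical and not part of the question.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// request wraps the payload with the API version the caller expects, so one
// deployment keeps serving existing consumers while newer versions ship.
type request struct {
	APIVersion string          `json:"apiVersion"`
	Payload    json.RawMessage `json:"payload"`
}

func handle(w http.ResponseWriter, r *http.Request) {
	var req request
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		http.Error(w, "malformed request body", http.StatusBadRequest)
		return
	}
	switch req.APIVersion {
	case "", "v1": // default to the oldest supported version
		handleV1(w, req.Payload)
	case "v2":
		handleV2(w, req.Payload)
	default:
		http.Error(w, "unsupported API version", http.StatusBadRequest)
	}
}

// handleV1 and handleV2 stand in for the real, version-specific logic.
func handleV1(w http.ResponseWriter, payload json.RawMessage) {
	w.Write([]byte(`{"version":"v1"}`))
}

func handleV2(w http.ResponseWriter, payload json.RawMessage) {
	w.Write([]byte(`{"version":"v2"}`))
}

func main() {
	http.HandleFunc("/orders", handle) // hypothetical endpoint
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```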

Your team is building an application for a financial institution. The application's frontend runs on Compute Engine, and the data resides in Cloud SQL and one Cloud Storage bucket. The application will collect data containing PII, which will be stored in the Cloud SQL database and the Cloud Storage bucket. You need to secure the PII data. What should you do?

A. 1) Create the relevant firewall rules to allow only the frontend to communicate with the Cloud SQL database. 2) Using IAM, allow only the frontend service account to access the Cloud Storage bucket.
B. 1) Create the relevant firewall rules to allow only the frontend to communicate with the Cloud SQL database. 2) Enable private access to allow the frontend to access the Cloud Storage bucket privately.
C. 1) Configure a private IP address for Cloud SQL. 2) Use VPC-SC to create a service perimeter. 3) Add the Cloud SQL database and the Cloud Storage bucket to the same service perimeter.
D. 1) Configure a private IP address for Cloud SQL. 2) Use VPC-SC to create a service perimeter. 3) Add the Cloud SQL database and the Cloud Storage bucket to different service perimeters.
Suggested answer: C

You are designing a chat room application that will host multiple rooms and retain the message history for each room. You have selected Firestore as your database. How should you represent the data in Firestore?

A. Create a collection for the rooms. For each room, create a document that lists the contents of the messages.
B. Create a collection for the rooms. For each room, create a collection that contains a document for each message.
C. Create a collection for the rooms. For each room, create a document that contains a collection for documents, each of which contains a message.
D. Create a collection for the rooms, and create a document for each room. Create a separate collection for messages, with one document per message. Each room's document contains a list of references to the messages.
Suggested answer: C

Explanation:

https://firebase.google.com/docs/firestore/data-model#hierarchical-data
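
As a sketch of this hierarchical model with the Go client library for Firestore, the snippet below writes one message document into a messages subcollection nested under a room document (rooms/{room}/messages/{message}). The project ID, room name, and field names are placeholders.

```go
package main

import (
	"context"
	"log"
	"time"

	"cloud.google.com/go/firestore"
)

func main() {
	ctx := context.Background()
	client, err := firestore.NewClient(ctx, "my-project") // placeholder project ID
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Each room document owns a "messages" subcollection with one document
	// per message, so a room's history can grow without bloating the room doc.
	room := client.Collection("rooms").Doc("general")
	_, _, err = room.Collection("messages").Add(ctx, map[string]interface{}{
		"from": "alice",
		"text": "Hello, world!",
		"sent": time.Now(),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```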

You are developing an application that will handle requests from end users. You need to secure a Cloud Function called by the application to allow authorized end users to authenticate to the function via the application while restricting access to unauthorized users. You will integrate Google Sign-In as part of the solution and want to follow Google-recommended best practices. What should you do?

A. Deploy from a source code repository and grant users the roles/cloudfunctions.viewer role.
B. Deploy from a source code repository and grant users the roles/cloudfunctions.invoker role.
C. Deploy from your local machine using gcloud and grant users the roles/cloudfunctions.admin role.
D. Deploy from your local machine using gcloud and grant users the roles/cloudfunctions.developer role.
Suggested answer: B

Explanation:

https://cloud.google.com/functions/docs/securing/authenticating

Deploying from a source code repository is the recommended practice, and roles/cloudfunctions.invoker is the narrowest role that allows authenticated end users to invoke the function. roles/cloudfunctions.admin would grant end users far more than invocation rights, which conflicts with the principle of least privilege.
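
For illustration only, here is a hedged Go sketch of the calling pattern this setup assumes: the application forwards the end user's Google Sign-In ID token as a Bearer token when invoking the HTTP function, and IAM admits the call only if the user holds the invoker role. The function URL, token value, and request body are placeholders.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"strings"
)

// callFunction invokes an HTTP Cloud Function on behalf of a signed-in user.
// idToken is the Google-issued ID token obtained through Google Sign-In.
func callFunction(functionURL, idToken string) error {
	req, err := http.NewRequest(http.MethodPost, functionURL,
		strings.NewReader(`{"action":"report"}`)) // placeholder body
	if err != nil {
		return err
	}
	req.Header.Set("Authorization", "Bearer "+idToken)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("function returned %s", resp.Status)
	}
	return nil
}

func main() {
	// Placeholder values; the real token comes from the Sign-In flow.
	err := callFunction("https://REGION-PROJECT.cloudfunctions.net/my-function", "ID_TOKEN")
	if err != nil {
		log.Fatal(err)
	}
}
```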

You are running a web application on Google Kubernetes Engine that you inherited. You want to determine whether the application is using libraries with known vulnerabilities or is vulnerable to XSS attacks. Which service should you use?

A. Google Cloud Armor
B. Debugger
C. Web Security Scanner
D. Error Reporting
Suggested answer: C

Explanation:

https://cloud.google.com/security-command-center/docs/concepts-web-security-scanner-overview

Web Security Scanner identifies security vulnerabilities in your App Engine, Google Kubernetes Engine (GKE), and Compute Engine web applications. It crawls your application, following all links within the scope of your starting URLs, and attempts to exercise as many user inputs and event handlers as possible.

You are building a highly available and globally accessible application that will serve static content to users. You need to configure the storage and serving components. You want to minimize management overhead and latency while maximizing reliability for users. What should you do?

A. 1) Create a managed instance group. Replicate the static content across the virtual machines (VMs). 2) Create an external HTTP(S) load balancer. 3) Enable Cloud CDN, and send traffic to the managed instance group.
B. 1) Create an unmanaged instance group. Replicate the static content across the VMs. 2) Create an external HTTP(S) load balancer. 3) Enable Cloud CDN, and send traffic to the unmanaged instance group.
C. 1) Create a Standard storage class, regional Cloud Storage bucket. Put the static content in the bucket. 2) Reserve an external IP address, and create an external HTTP(S) load balancer. 3) Enable Cloud CDN, and send traffic to your backend bucket.
D. 1) Create a Standard storage class, multi-regional Cloud Storage bucket. Put the static content in the bucket. 2) Reserve an external IP address, and create an external HTTP(S) load balancer. 3) Enable Cloud CDN, and send traffic to your backend bucket.
Suggested answer: D
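
As a sketch of step 1 of the suggested answer, the Go snippet below creates a Standard storage class bucket in the US multi-region using the Cloud Storage client library; the project ID and bucket name are placeholders, and the load balancer, backend bucket, and Cloud CDN steps are configured separately.

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/storage"
)

func main() {
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// "US" is a multi-region location, so the static content is stored
	// redundantly across regions for higher availability.
	err = client.Bucket("my-static-assets").Create(ctx, "my-project", &storage.BucketAttrs{
		Location:     "US",
		StorageClass: "STANDARD",
	})
	if err != nil {
		log.Fatal(err)
	}
}
```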

You are writing from a Go application to a Cloud Spanner database. You want to optimize your application's performance using Google-recommended best practices. What should you do?

A. Write to Cloud Spanner using Cloud Client Libraries.
B. Write to Cloud Spanner using Google API Client Libraries.
C. Write to Cloud Spanner using a custom gRPC client library.
D. Write to Cloud Spanner using a third-party HTTP client library.
Suggested answer: A

Explanation:

https://cloud.google.com/apis/docs/cloud-client-libraries

"Cloud Client Libraries are the recommended option for accessing Cloud APIs programmatically, where available. Cloud Client Libraries use the latest client library models."

https://cloud.google.com/apis/docs/client-libraries-explained

https://cloud.google.com/go/docs/reference
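
A minimal Go sketch using the Cloud Client Library for Spanner (cloud.google.com/go/spanner); the database path, table, and column names are placeholders.

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/spanner"
)

func main() {
	ctx := context.Background()
	// Placeholder path: projects/PROJECT/instances/INSTANCE/databases/DATABASE.
	client, err := spanner.NewClient(ctx,
		"projects/my-project/instances/my-instance/databases/my-db")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The client library manages sessions, retries, and gRPC channels,
	// which is why it is the recommended way to talk to Spanner.
	_, err = client.Apply(ctx, []*spanner.Mutation{
		spanner.Insert("Accounts",
			[]string{"AccountId", "Balance"},
			[]interface{}{int64(1), int64(100)}),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```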

You have an application deployed in Google Kubernetes Engine (GKE). You need to update the application to make authorized requests to Google Cloud managed services. You want this to be a one-time setup, and you need to follow security best practices of auto-rotating your security keys and storing them in an encrypted store. You already created a service account with appropriate access to the Google Cloud service. What should you do next?

A. Assign the Google Cloud service account to your GKE Pod using Workload Identity.
B. Export the Google Cloud service account, and share it with the Pod as a Kubernetes Secret.
C. Export the Google Cloud service account, and embed it in the source code of the application.
D. Export the Google Cloud service account, and upload it to HashiCorp Vault to generate a dynamic service account for your application.
Suggested answer: A

Explanation:

https://cloud.google.com/kubernetes-engine/docs/concepts/workload-identity

Applications running on GKE might need access to Google Cloud APIs such as Compute Engine API, BigQuery Storage API, or Machine Learning APIs.

Workload Identity allows a Kubernetes service account in your GKE cluster to act as an IAM service account. Pods that use the configured Kubernetes service account automatically authenticate as the IAM service account when accessing Google Cloud APIs. Using Workload Identity allows you to assign distinct, fine-grained identities and authorization for each application in your cluster.
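
To show what this means for application code, here is a hedged Go sketch: with Workload Identity bound to the Pod's Kubernetes service account, the client library picks up Application Default Credentials from the GKE metadata server, so the application never handles or rotates a key file. The bucket name is a placeholder.

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/storage"
)

func main() {
	ctx := context.Background()
	// No key file is loaded: credentials for the bound IAM service account
	// are obtained automatically via Application Default Credentials.
	client, err := storage.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Placeholder bucket name, just to demonstrate an authenticated call.
	attrs, err := client.Bucket("my-app-data").Attrs(ctx)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("bucket %s is in %s", attrs.Name, attrs.Location)
}
```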

You are planning to deploy hundreds of microservices in your Google Kubernetes Engine (GKE) cluster. How should you secure communication between the microservices on GKE using a managed service?

A. Use global HTTP(S) Load Balancing with managed SSL certificates to protect your services.
B. Deploy open source Istio in your GKE cluster, and enable mTLS in your Service Mesh.
C. Install cert-manager on GKE to automatically renew the SSL certificates.
D. Install Anthos Service Mesh, and enable mTLS in your Service Mesh.
Suggested answer: D

Explanation:

https://cloud.google.com/service-mesh/docs/overview#security_benefits

- Ensures encryption in transit. Using mTLS for authentication also ensures that all TCP communications are encrypted in transit.

You are developing an application that will store and access sensitive unstructured data objects in a Cloud Storage bucket. To comply with regulatory requirements, you need to ensure that all data objects are available for at least 7 years after their initial creation. Objects created more than 3 years ago are accessed very infrequently (less than once a year). You need to configure object storage while ensuring that storage cost is optimized. What should you do? (Choose two.)

A. Set a retention policy on the bucket with a period of 7 years.
B. Use IAM Conditions to provide access to objects 7 years after the object creation date.
C. Enable Object Versioning to prevent objects from being accidentally deleted for 7 years after object creation.
D. Create an object lifecycle policy on the bucket that moves objects from Standard Storage to Archive Storage after 3 years.
E. Implement a Cloud Function that checks the age of each object in the bucket and moves the objects older than 3 years to a second bucket with the Archive Storage class. Use Cloud Scheduler to trigger the Cloud Function on a daily schedule.
Suggested answer: A, D

Explanation:

https://cloud.google.com/storage/docs/bucket-lock

This page discusses the Bucket Lock feature, which allows you to configure a data retention policy for a Cloud Storage bucket that governs how long objects in the bucket must be retained. The feature also allows you to lock the data retention policy, permanently preventing the policy from being reduced or removed.

https://cloud.google.com/storage/docs/storage-classes#archive

Archive storage is the lowest-cost, highly durable storage service for data archiving, online backup, and disaster recovery. Unlike the 'coldest' storage services offered by other Cloud providers, your data is available within milliseconds, not hours or days.

Archive storage is the best choice for data that you plan to access less than once a year.
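
As a hedged sketch of combining both selected options with the Go Cloud Storage client library, the snippet below sets a 7-year retention policy and a lifecycle rule that transitions objects to Archive storage once they are 3 years old; the bucket name is a placeholder.

```go
package main

import (
	"context"
	"log"
	"time"

	"cloud.google.com/go/storage"
)

func main() {
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	bucket := client.Bucket("my-pii-archive") // placeholder bucket name

	_, err = bucket.Update(ctx, storage.BucketAttrsToUpdate{
		// Objects cannot be deleted or overwritten for 7 years after creation.
		RetentionPolicy: &storage.RetentionPolicy{
			RetentionPeriod: 7 * 365 * 24 * time.Hour,
		},
		// Move objects to Archive storage once they are 3 years old.
		Lifecycle: &storage.Lifecycle{
			Rules: []storage.LifecycleRule{{
				Action: storage.LifecycleAction{
					Type:         storage.SetStorageClassAction,
					StorageClass: "ARCHIVE",
				},
				Condition: storage.LifecycleCondition{AgeInDays: 3 * 365},
			}},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
}
```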
