Google Professional Cloud Developer Practice Test - Questions Answers, Page 16

You have been tasked with planning the migration of your company's application from on-premises to Google Cloud. Your company's monolithic application is an ecommerce website. The application will be migrated to microservices deployed on Google Cloud in stages. The majority of your company's revenue is generated through online sales, so it is important to minimize risk during the migration. You need to prioritize features and select the first functionality to migrate. What should you do?

A. Migrate the Product catalog, which has integrations to the frontend and product database.
B. Migrate Payment processing, which has integrations to the frontend, order database, and third-party payment vendor.
C. Migrate Order fulfillment, which has integrations to the order database, inventory system, and third-party shipping vendor.
D. Migrate the Shopping cart, which has integrations to the frontend, cart database, inventory system, and payment processing system.
Suggested answer: A

Your team develops services that run on Google Kubernetes Engine. Your team's code is stored in Cloud Source Repositories. You need to quickly identify bugs in the code before it is deployed to production. You want to invest in automation to improve developer feedback and make the process as efficient as possible. What should you do?

A. Use Spinnaker to automate building container images from code based on Git tags.
B. Use Cloud Build to automate building container images from code based on Git tags.
C. Use Spinnaker to automate deploying container images to the production environment.
D. Use Cloud Build to automate building container images from code based on forked versions.
Suggested answer: B

Explanation:

Cloud Build is the managed continuous integration service on Google Cloud and can be triggered by Git tag pushes to build container images automatically, giving developers fast feedback; Spinnaker is a continuous delivery tool and does not build images.
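For context, a tag-triggered Cloud Build configuration can be as small as the sketch below; the image name is an illustrative placeholder, while $PROJECT_ID and $TAG_NAME are default substitutions that Cloud Build populates for tag-based triggers.

# cloudbuild.yaml (sketch): build and push an image when a Git tag is pushed
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service:$TAG_NAME', '.']
# Unit tests can run as an additional step here to surface bugs before deployment.
images:
- 'gcr.io/$PROJECT_ID/my-service:$TAG_NAME'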

You developed a JavaScript web application that needs to access Google Drive's API and obtain permission from users to store files in their Google Drives. You need to select an authorization approach for your application. What should you do?

A. Create an API key.
B. Create a SAML token.
C. Create a service account.
D. Create an OAuth Client ID.
Suggested answer: D

You manage an ecommerce application that processes purchases from customers who can subsequently cancel or change those purchases. You discover that order volumes are highly variable and the backend order-processing system can only process one request at a time. You want to ensure seamless performance for customers regardless of usage volume. It is crucial that customers' order update requests are performed in the sequence in which they were generated. What should you do?

A. Send the purchase and change requests over WebSockets to the backend.
B. Send the purchase and change requests as REST requests to the backend.
C. Use a Pub/Sub subscriber in pull mode and use a data store to manage ordering.
D. Use a Pub/Sub subscriber in push mode and use a data store to manage ordering.
Suggested answer: C

Explanation:

https://cloud.google.com/pubsub/docs/pull
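As an illustration of the pull-mode pattern, the sketch below uses the google-cloud-pubsub client; the project and subscription names are placeholders, and process_order_update stands in for a hypothetical handler that applies updates through a data store keyed by order ID and sequence number so they are performed in the order generated.

# Sketch of a pull subscriber feeding the single-threaded backend (names are placeholders).
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "order-updates-sub")

def callback(message):
    # process_order_update is a hypothetical handler that records the update in a
    # data store and applies it to the backend in sequence.
    process_order_update(message.data)
    message.ack()

# Limit outstanding messages so the backend only ever sees one request at a time.
flow_control = pubsub_v1.types.FlowControl(max_messages=1)
future = subscriber.subscribe(subscription_path, callback=callback, flow_control=flow_control)

with subscriber:
    try:
        future.result()
    except KeyboardInterrupt:
        future.cancel()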

Your company needs a database solution that stores customer purchase history and meets the following requirements:

Customers can query their purchase immediately after submission.

Purchases can be sorted on a variety of fields.

Distinct record formats can be stored at the same time.

Which storage option satisfies these requirements?

A. Firestore in Native mode
B. Cloud Storage using an object read
C. Cloud SQL using a SQL SELECT statement
D. Firestore in Datastore mode using a global query
Suggested answer: A
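To make the fit concrete, a Firestore in Native mode sketch is shown below; the collection and field names are illustrative, and combining an equality filter with a sort on another field may require a composite index.

# Sketch: strongly consistent writes, flexible document schemas, and sorting (names illustrative).
from google.cloud import firestore

db = firestore.Client()

# A purchase is queryable immediately after submission (strongly consistent writes).
db.collection("purchases").add({"customer_id": "c123", "total": 42.50, "status": "submitted"})

# Documents in one collection can have different fields (distinct record formats).
query = (
    db.collection("purchases")
    .where("customer_id", "==", "c123")
    .order_by("total", direction=firestore.Query.DESCENDING)  # may need a composite index
)
for doc in query.stream():
    print(doc.id, doc.to_dict())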

You recently developed a new service on Cloud Run. The new service authenticates using a custom service and then writes transactional information to a Cloud Spanner database. You need to verify that your application can support up to 5,000 read and 1,000 write transactions per second while identifying any bottlenecks that occur. Your test infrastructure must be able to autoscale. What should you do?

A. Build a test harness to generate requests and deploy it to Cloud Run. Analyze the VPC Flow Logs using Cloud Logging.
B. Create a Google Kubernetes Engine cluster running the Locust or JMeter images to dynamically generate load tests. Analyze the results using Cloud Trace.
C. Create a Cloud Task to generate a test load. Use Cloud Scheduler to run 60,000 Cloud Task transactions per minute for 10 minutes. Analyze the results using Cloud Monitoring.
D. Create a Compute Engine instance that uses a LAMP stack image from the Marketplace, and use Apache Bench to generate load tests against the service. Analyze the results using Cloud Trace.
Suggested answer: B

Explanation:

https://cloud.google.com/architecture/distributed-load-testing-using-gke
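For reference, a Locust test definition for this scenario could look like the sketch below; the endpoint paths and payload are placeholders, and the GKE deployment from the linked architecture guide scales the number of worker Pods to reach the target request rates.

# locustfile.py (sketch): mixed read/write load against the Cloud Run service (paths are placeholders).
from locust import HttpUser, task, between

class TransactionUser(HttpUser):
    wait_time = between(1, 2)

    @task(5)  # weight reads roughly 5x writes to approximate 5,000 reads vs. 1,000 writes per second
    def read_transaction(self):
        self.client.get("/transactions/recent")

    @task(1)
    def write_transaction(self):
        self.client.post("/transactions", json={"amount": 10})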

You are using Cloud Build for your CI/CD pipeline to complete several tasks, including copying certain files to Compute Engine virtual machines. Your pipeline requires a flat file that is generated in one builder in the pipeline to be accessible by subsequent builders in the same pipeline. How should you store the file so that all the builders in the pipeline can access it?

A. Store and retrieve the file contents using Compute Engine instance metadata.
B. Output the file contents to a file in /workspace. Read from the same /workspace file in the subsequent build step.
C. Use gsutil to output the file contents to a Cloud Storage object. Read from the same object in the subsequent build step.
D. Add a build argument that runs an HTTP POST via curl to a separate web server to persist the value in one builder. Use an HTTP GET via curl from the subsequent build step to read the value.
Suggested answer: B

Explanation:

https://cloud.google.com/build/docs/build-config-file-schema
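A minimal illustration of the /workspace pattern is sketched below; the file name, VM name, and zone are placeholders.

# cloudbuild.yaml (sketch): one step writes a file into /workspace, a later step reads it.
steps:
- name: 'ubuntu'
  args: ['bash', '-c', 'echo "release-candidate" > /workspace/build_info.txt']
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args: ['-c', 'gcloud compute scp /workspace/build_info.txt my-vm:/tmp/build_info.txt --zone=us-central1-a']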

Your company's development teams want to use various open source operating systems in their Docker builds. When container images are created and published in your company's environment, you need to scan them for Common Vulnerabilities and Exposures (CVEs). The scanning process must not impact software development agility. You want to use managed services where possible. What should you do?

A. Enable the Vulnerability scanning setting in the Container Registry.
B. Create a Cloud Function that is triggered on a code check-in and scan the code for CVEs.
C. Disallow the use of non-commercially supported base images in your development environment.
D. Use Cloud Monitoring to review the output of Cloud Build to determine whether a vulnerable version has been used.
Suggested answer: A

Explanation:

https://cloud.google.com/container-analysis/docs/os-overview
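Operationally, automatic scanning is turned on by enabling the Container Scanning API for the project, for example with the command below; project selection is assumed to already be configured.

# Enable automatic vulnerability scanning for images pushed to Container Registry.
gcloud services enable containerscanning.googleapis.com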

You are configuring a continuous integration pipeline using Cloud Build to automate the deployment of new container images to Google Kubernetes Engine (GKE). The pipeline builds the application from its source code, runs unit and integration tests in separate steps, and pushes the container to Container Registry. The application runs on a Python web server.

The Dockerfile is as follows:

FROM python:3.7-alpine
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
CMD ["gunicorn", "-w", "4", "main:app"]

You notice that Cloud Build runs are taking longer than expected to complete. You want to decrease the build time. What should you do? (Choose two.)

A. Select a virtual machine (VM) size with higher CPU for Cloud Build runs.
B. Deploy a Container Registry on a Compute Engine VM in a VPC, and use it to store the final images.
C. Cache the Docker image for subsequent builds using the --cache-from argument in your build config file.
D. Change the base image in the Dockerfile to ubuntu:latest, and install Python 3.7 using a package manager utility.
E. Store application source code on Cloud Storage, and configure the pipeline to use gsutil to download the source code.
Suggested answer: A, C

Explanation:

https://cloud.google.com/build/docs/optimize-builds/increase-vcpu-for-builds

By default, Cloud Build runs your builds on a standard virtual machine (VM). In addition to the standard VM, Cloud Build provides several high-CPU VM types to run builds. To increase the speed of your build, select a machine with a higher vCPU to run builds. Keep in mind that although selecting a high vCPU machine increases your build speed, it may also increase the startup time of your build as Cloud Build only starts non-standard machines on demand.

https://cloud.google.com/build/docs/optimize-builds/speeding-up-builds#using_a_cached_docker_image

The easiest way to increase the speed of your Docker image build is by specifying a cached image that can be used for subsequent builds. You can specify the cached image by adding the --cache-from argument in your build config file, which will instruct Docker to build using that image as a cache source.
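Both optimizations can be expressed in the build config, as in the sketch below; the image name is illustrative, and E2_HIGHCPU_8 is one of several documented high-CPU machine types.

# cloudbuild.yaml (sketch): seed the Docker layer cache and run on a high-CPU worker.
steps:
# Pull the previous image so it can serve as a cache source (ignore failure on the first build).
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args: ['-c', 'docker pull gcr.io/$PROJECT_ID/web-app:latest || exit 0']
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/web-app:latest',
         '--cache-from', 'gcr.io/$PROJECT_ID/web-app:latest', '.']
images: ['gcr.io/$PROJECT_ID/web-app:latest']
options:
  machineType: 'E2_HIGHCPU_8'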

You are building a CI/CD pipeline that consists of a version control system, Cloud Build, and Container Registry. Each time a new tag is pushed to the repository, a Cloud Build job is triggered, which runs unit tests on the new code, builds a new Docker container image, and pushes it into Container Registry. The last step of your pipeline should deploy the new container to your production Google Kubernetes Engine (GKE) cluster. You need to select a tool and deployment strategy that meets the following requirements:

* Zero downtime is incurred

* Testing is fully automated

* Allows for testing before being rolled out to users

* Can quickly rollback if needed

What should you do?

A. Trigger a Spinnaker pipeline configured as an A/B test of your new code and, if it is successful, deploy the container to production.
B. Trigger a Spinnaker pipeline configured as a canary test of your new code and, if it is successful, deploy the container to production.
C. Trigger another Cloud Build job that uses the Kubernetes CLI tools to deploy your new container to your GKE cluster, where you can perform a canary test.
D. Trigger another Cloud Build job that uses the Kubernetes CLI tools to deploy your new container to your GKE cluster, where you can perform a shadow test.
Suggested answer: D

Explanation:

https://cloud.google.com/architecture/implementing-deployment-and-testing-strategies-on-gke#perform_a_shadow_test

With a shadow test, you test the new version of your application by mirroring user traffic from the current application version without impacting the user requests.
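The referenced guide implements shadow testing with Istio traffic mirroring; a sketch of the relevant VirtualService is below, with the service and subset names as placeholders.

# Istio VirtualService (sketch): all user traffic goes to v1, while v2 receives a mirrored copy.
apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: app
spec:
  hosts:
  - app
  http:
  - route:
    - destination:
        host: app
        subset: v1
      weight: 100
    mirror:
      host: app
      subset: v2    # responses from the mirrored (shadow) version are discarded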
