Google Associate Cloud Engineer Practice Test - Questions Answers, Page 21


Your company is moving its entire workload to Compute Engine. Some servers should be accessible through the Internet, and other servers should only be accessible over the internal network. All servers need to be able to talk to each other over specific ports and protocols. The current on-premises network relies on a demilitarized zone (DMZ) for the public servers and a Local Area Network (LAN) for the private servers. You need to design the networking infrastructure on Google Cloud to match these requirements. What should you do?

A.
1. Create a single VPC with a subnet for the DMZ and a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public ingress traffic for the DMZ.
B.
1. Create a single VPC with a subnet for the DMZ and a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public egress traffic for the DMZ.
C.
1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public ingress traffic for the DMZ.
D.
1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public egress traffic for the DMZ.
Suggested answer: C

Explanation:

https://cloud.google.com/vpc/docs/vpc-peering
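
For illustration only, a minimal gcloud sketch of the approach in the suggested answer, assuming hypothetical names (dmz-vpc, lan-vpc, us-central1) and an example application port of 8080; adjust regions, ranges, and ports to the actual requirements:

# Create one VPC per tier, each with a single custom subnet.
gcloud compute networks create dmz-vpc --subnet-mode=custom
gcloud compute networks subnets create dmz-subnet --network=dmz-vpc --region=us-central1 --range=10.10.0.0/24
gcloud compute networks create lan-vpc --subnet-mode=custom
gcloud compute networks subnets create lan-subnet --network=lan-vpc --region=us-central1 --range=10.20.0.0/24

# Peer the two VPCs so DMZ and LAN servers can reach each other over internal IPs.
gcloud compute networks peerings create dmz-to-lan --network=dmz-vpc --peer-network=lan-vpc
gcloud compute networks peerings create lan-to-dmz --network=lan-vpc --peer-network=dmz-vpc

# Open only the required ports between the subnets, and allow public ingress to the DMZ.
gcloud compute firewall-rules create allow-lan-from-dmz --network=lan-vpc --allow=tcp:8080 --source-ranges=10.10.0.0/24
gcloud compute firewall-rules create allow-dmz-from-lan --network=dmz-vpc --allow=tcp:8080 --source-ranges=10.20.0.0/24
gcloud compute firewall-rules create allow-public-ingress-dmz --network=dmz-vpc --allow=tcp:80,tcp:443 --source-ranges=0.0.0.0/0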

You have created a new project in Google Cloud through the gcloud command line interface (CLI) and linked a billing account. You need to create a new Compute Engine instance using the CLI. You need to perform the prerequisite steps. What should you do?

A.
Create a Cloud Monitoring Workspace.
B.
Create a VPC network in the project.
C.
Enable the compute.googleapis.com API.
D.
Grant yourself the IAM role of Compute Admin.
Suggested answer: D
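
For reference, a hedged sketch of how the steps named in options C and D look from the CLI; the project ID and e-mail address are placeholders:

# Enable the Compute Engine API in the new project (option C).
gcloud services enable compute.googleapis.com --project=my-new-project

# Grant the Compute Admin role to your own account (option D).
gcloud projects add-iam-policy-binding my-new-project --member=user:you@example.com --role=roles/compute.admin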

Your company has developed a new application that consists of multiple microservices. You want to deploy the application to Google Kubernetes Engine (GKE), and you want to ensure that the cluster can scale as more applications are deployed in the future. You want to avoid manual intervention when each new application is deployed. What should you do?

A.
Deploy the application on GKE, and add a HorizontalPodAutoscaler to the deployment.
B.
Deploy the application on GKE, and add a VerticalPodAutoscaler to the deployment.
C.
Create a GKE cluster with autoscaling enabled on the node pool. Set a minimum and maximum for the size of the node pool.
D.
Create a separate node pool for each application, and deploy each application to its dedicated node pool.
Suggested answer: C

Explanation:

https://cloud.google.com/kubernetes-engine/docs/how-to/cluster-autoscaler#adding_a_node_pool_with_autoscaling
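
As a rough sketch (cluster name, zone, node counts, and limits are placeholders), autoscaling can be enabled on the node pool at cluster creation time:

# Create a cluster whose default node pool autoscales between 1 and 10 nodes.
gcloud container clusters create my-cluster --zone=us-central1-a --num-nodes=3 --enable-autoscaling --min-nodes=1 --max-nodes=10

# Autoscaling can also be turned on later for an existing node pool.
gcloud container clusters update my-cluster --zone=us-central1-a --node-pool=default-pool --enable-autoscaling --min-nodes=1 --max-nodes=10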

Your coworker has helped you set up several configurations for gcloud. You've noticed that you're running commands against the wrong project. Being new to the company, you haven't yet memorized any of the projects. With the fewest steps possible, what's the fastest way to switch to the correct configuration?

A.
Run gcloud configurations list followed by gcloud configurations activate.
B.
Run gcloud config list followed by gcloud config activate.
C.
Run gcloud config configurations list followed by gcloud config configurations activate.
D.
Re-authenticate with the gcloud auth login command and select the correct configurations on login.
Suggested answer: C

Explanation:

gcloud config configurations list shows the existing configurations, and gcloud config configurations activate switches to the chosen configuration, so these two commands are the fastest way to start working against the correct project.

gcloud config configurations list lists existing named configurations.

gcloud config configurations activate activates an existing named configuration.

gcloud auth login (option D) only obtains access credentials for your user account via a web-based authorization flow. When this command completes successfully, it sets the active account in the current configuration to the account specified; if no configuration exists, it creates a configuration named default. It does not switch between existing configurations.
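
A quick sketch of the two commands; the configuration name shown is a placeholder:

# Show all named configurations and which one is currently active.
gcloud config configurations list

# Switch to the configuration that points at the correct project.
gcloud config configurations activate my-correct-config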

The storage costs for your application logs have far exceeded the project budget. The logs are currently being retained indefinitely in the Cloud Storage bucket myapp-gcp-ace-logs. You have been asked to remove logs older than 90 days from your Cloud Storage bucket. You want to optimize ongoing Cloud Storage spend. What should you do?

A.
Write a script that runs gsutil ls -l gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days. Schedule the script with cron.
B.
Write a lifecycle management rule in JSON and push it to the bucket with gsutil lifecycle set config-json-file.
C.
Write a lifecycle management rule in XML and push it to the bucket with gsutil lifecycle set config-xml-file.
D.
Write a script that runs gsutil ls -lr gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days. Repeat this process every morning.
Suggested answer: B

Explanation:

Writing a lifecycle management rule in XML and pushing it to the bucket with gsutil lifecycle set config-xml-file is not right.

gsutil lifecycle set enables you to set the lifecycle configuration on one or more buckets based on the configuration file provided. However, XML is not a valid supported type for the configuration file.

Ref:https://cloud.google.com/storage/docs/gsutil/commands/lifecycle

Writing a script that runs gsutil ls -lr gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days and repeating this process every morning is not right.

This manual approach is error-prone, time-consuming and expensive. GCP Cloud Storage provides lifecycle management rules that let you achieve this with minimal effort.

Writing a script that runs gsutil ls -l gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days and scheduling the script with cron is not right.

This manual approach is error-prone, time-consuming and expensive. GCP Cloud Storage provides lifecycle management rules that let you achieve this with minimal effort.

Writing a lifecycle management rule in JSON and pushing it to the bucket with gsutil lifecycle set config-json-file is the right answer.

You can assign a lifecycle management configuration to a bucket. The configuration contains a set of rules which apply to current and future objects in the bucket. When an object meets the criteria of one of the rules, Cloud Storage automatically performs a specified action on the object. One of the supported actions is to delete objects, so you can set up a lifecycle rule to delete objects older than 90 days. gsutil lifecycle set enables you to set the lifecycle configuration on the bucket based on the configuration file. JSON is the only supported type for the configuration file. The config-json-file specified on the command line should be a path to a local file containing the lifecycle configuration JSON document.

Ref:https://cloud.google.com/storage/docs/gsutil/commands/lifecycle

Ref:https://cloud.google.com/storage/docs/lifecycle
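
For illustration, a minimal sketch of such a configuration, assuming a local file named lifecycle.json; the 90-day age and the bucket name come from the question:

# Write a rule that deletes objects once they are older than 90 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 90}}
  ]
}
EOF

# Apply the configuration to the bucket.
gsutil lifecycle set lifecycle.json gs://myapp-gcp-ace-logs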

Users of your application are complaining of slowness when loading the application. You realize the slowness is because the App Engine deployment serving the application is deployed in us-central whereas all users of this application are closest to europe-west3. You want to change the region of the App Engine application to europe-west3 to minimize latency. What's the best way to change the App Engine region?

A.
Create a new project and create an App Engine instance in europe-west3
B.
Use the gcloud app region set command and supply the name of the new region.
C.
From the console, under the App Engine page, click edit, and change the region drop-down.
D.
Contact Google Cloud Support and request the change.
Suggested answer: A

Explanation:

App Engine is a regional service, which means the infrastructure that runs your app(s) is located in a specific region and is managed by Google to be redundantly available across all the zones within that region. Once an App Engine deployment is created in a region, it can't be changed. The only way is to create a new project, create an App Engine instance in europe-west3, send all user traffic to this instance, and delete the App Engine instance in us-central.

Ref:https://cloud.google.com/appengine/docs/locations
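
A hedged sketch of that workaround, with a placeholder project ID; the region of an App Engine app is chosen once, when the app is created:

# Create a new project and an App Engine application in the desired region.
gcloud projects create my-new-project
gcloud app create --project=my-new-project --region=europe-west3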

A company wants to build an application that stores images in a Cloud Storage bucket and wants to generate thumbnails as well as resize the images. They want to use a Google-managed service that can scale up and scale down to zero automatically with minimal effort. You have been asked to recommend a service. Which GCP service would you suggest?

A.
Google Compute Engine
B.
Google App Engine
C.
Cloud Functions
D.
Google Kubernetes Engine
Suggested answer: C

Explanation:

Cloud Functions is Google Cloud's event-driven serverless compute platform. It automatically scales based on the load and requires no additional configuration. You pay only for the resources used.

Ref:https://cloud.google.com/functions

While the other options (Google Compute Engine, Google Kubernetes Engine, and Google App Engine) also support autoscaling, it needs to be configured explicitly based on the load and is not as trivial as the automatic scale up and scale down offered by Cloud Functions.
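
As an illustrative sketch only (function name, entry point, runtime, region, and bucket are placeholders, and the image-resizing code itself is not shown), a function can be wired to run on every upload to the bucket:

# Deploy a function that is triggered whenever an object is finalized in the bucket.
gcloud functions deploy generate-thumbnail --runtime=python310 --region=europe-west1 --trigger-bucket=my-photos-bucket --entry-point=generate_thumbnail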

You are designing an application that lets users upload and share photos. You expect your application to grow really fast and you are targeting a worldwide audience. You want to delete uploaded photos after 30 days. You want to minimize costs while ensuring your application is highly available. Which GCP storage solution should you choose?

A.
Persistent SSD on VM instances.
B.
Cloud Filestore.
C.
Multiregional Cloud Storage bucket.
D.
Cloud Datastore database.
Suggested answer: C

Explanation:

Cloud Storage allows worldwide storage and retrieval of any amount of data at any time. We don't need to set up autoscaling ourselves; Cloud Storage scaling is managed by GCP. Cloud Storage is an object store, so it is suitable for storing photos, and its worldwide storage and retrieval caters well to our worldwide audience. Cloud Storage also provides lifecycle rules that can be configured to automatically delete objects older than 30 days, which fits our requirements. Finally, Google Cloud Storage offers several storage classes such as Nearline Storage ($0.01 per GB per month), Coldline Storage ($0.007 per GB per month), and Archive Storage ($0.004 per GB per month), which are significantly cheaper than the storage in any of the other options.

Ref:https://cloud.google.com/storage/docs

Ref:https://cloud.google.com/storage/pricing
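
A minimal sketch with a placeholder bucket name; the lifecycle file would look like the 90-day example earlier on this page, with the age condition set to 30:

# Create a bucket in a multi-region (EU is used here as an example location).
gsutil mb -c standard -l EU gs://my-photo-sharing-bucket

# Automatically delete uploaded photos after 30 days.
gsutil lifecycle set lifecycle-30d.json gs://my-photo-sharing-bucket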

You are designing an application that uses WebSockets and HTTP sessions that are not distributed across the web servers. You want to ensure the application runs properly on Google Cloud Platform. What should you do?

A.
Meet with the cloud enablement team to discuss load balancer options.
B.
Redesign the application to use a distributed user session service that does not rely on WebSockets and HTTP sessions.
C.
Review the encryption requirements for WebSocket connections with the security team.
D.
Convert the WebSocket code to use HTTP streaming.
Suggested answer: A

Explanation:

Google HTTP(S) Load Balancing has native support for the WebSocket protocol when you use HTTP or HTTPS, not HTTP/2, as the protocol to the backend.

Ref:https://cloud.google.com/load-balancing/docs/https#websocket_proxy_support

So the best next step is to meet with the cloud enablement team to discuss load balancer options.

We don't need to convert the WebSocket code to use HTTP streaming or redesign the application, as WebSocket support is offered by Google HTTP(S) Load Balancing. Reviewing the encryption requirements is a good idea, but it has nothing to do with WebSockets.
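
For example, one likely outcome of that discussion, sketched with a placeholder backend service name, is to enable session affinity (because the HTTP sessions are not distributed) and raise the backend timeout for long-lived WebSocket connections:

# Route each client to the same backend instance and allow long-lived connections (timeout in seconds).
gcloud compute backend-services update my-backend-service --global --session-affinity=GENERATED_COOKIE --timeout=3600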

You have a number of compute instances belonging to an unmanaged instances group. You need to SSH to one of the Compute Engine instances to run an ad hoc script. You've already authenticated gcloud, however, you don't have an SSH key deployed yet. In the fewest steps possible, what's the easiest way to SSH to the instance?

A.
Run gcloud compute instances list to get the IP address of the instance, then use the ssh command.
B.
Use the gcloud compute ssh command.
C.
Create a key with the ssh-keygen command. Then use the gcloud compute ssh command.
D.
Create a key with the ssh-keygen command. Upload the key to the instance. Run gcloud compute instances list to get the IP address of the instance, then use the ssh command.
Suggested answer: B

Explanation:

gcloud compute ssh ensures that the user's public SSH key is present in the project's metadata. If the user does not have a public SSH key, one is generated using ssh-keygen and added to the project's metadata. This is similar to option D, where we copy the key explicitly to the project's metadata, but here it is done automatically for us. There are also security benefits with this approach: when we use gcloud compute ssh to connect to Linux instances, we add a layer of security by storing host keys as guest attributes. Storing SSH host keys as guest attributes improves the security of your connections by helping to protect against vulnerabilities such as man-in-the-middle (MITM) attacks. On the initial boot of a VM instance, if guest attributes are enabled, Compute Engine stores your generated host keys as guest attributes.

Compute Engine then uses these host keys that were stored during the initial boot to verify all subsequent connections to the VM instance.

Ref:https://cloud.google.com/compute/docs/instances/connecting-to-instance

Ref:https://cloud.google.com/sdk/gcloud/reference/compute/ssh
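
A one-line sketch with placeholder instance, zone, and script names:

# SSH to the instance (a key pair is generated and propagated automatically if needed) and run the script.
gcloud compute ssh my-instance --zone=us-central1-a --command="bash ./adhoc-script.sh"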
