ExamGecko

Google Associate Cloud Engineer Practice Test - Questions & Answers, Page 2

You need a dynamic way of provisioning VMs on Compute Engine. The exact specifications will be in a dedicated configuration file. You want to follow Google's recommended practices. Which method should you use?

A. Deployment Manager
B. Cloud Composer
C. Managed Instance Group
D. Unmanaged Instance Group
Suggested answer: A

Explanation:

https://cloud.google.com/deployment-manager/docs/configuration/create-basic-configuration
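
A minimal sketch of the recommended approach: the VM's specifications live in a dedicated configuration file, and Deployment Manager provisions from it. The file name, deployment name, zone, and [PROJECT_ID] placeholder are illustrative, not from the source; check the linked documentation for the exact resource schema.

```shell
# Write the dedicated configuration file describing the VM (illustrative values)
cat > vm.yaml <<'EOF'
resources:
- name: quickstart-vm
  type: compute.v1.instance
  properties:
    zone: us-central1-a
    machineType: https://www.googleapis.com/compute/v1/projects/[PROJECT_ID]/zones/us-central1-a/machineTypes/e2-micro
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      autoDelete: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-11
    networkInterfaces:
    - network: global/networks/default
EOF

# Provision the deployment from the configuration file
gcloud deployment-manager deployments create my-first-deployment --config vm.yaml
```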

You have a Dockerfile that you need to deploy on Kubernetes Engine. What should you do?

A. Use kubectl app deploy <dockerfilename>.
B. Use gcloud app deploy <dockerfilename>.
C. Create a Docker image from the Dockerfile and upload it to Container Registry. Create a Deployment YAML file that points to that image. Use kubectl to create the deployment with that file.
D. Create a Docker image from the Dockerfile and upload it to Cloud Storage. Create a Deployment YAML file that points to that image. Use kubectl to create the deployment with that file.
Suggested answer: C
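
The build-push-deploy flow in answer C can be sketched as below. PROJECT_ID, the image name, and the Deployment name are placeholders; this assumes docker, gcloud, and kubectl are installed and authenticated against an existing GKE cluster.

```shell
# Build an image from the Dockerfile and upload it to Container Registry
docker build -t gcr.io/PROJECT_ID/my-app:v1 .
docker push gcr.io/PROJECT_ID/my-app:v1

# Create a Deployment YAML file that points to that image
cat > deployment.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app
        image: gcr.io/PROJECT_ID/my-app:v1
EOF

# Use kubectl to create the deployment with that file
kubectl apply -f deployment.yaml
```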

You need to update a deployment in Deployment Manager without any resource downtime in the deployment. Which command should you use?

A. gcloud deployment-manager deployments create --config <deployment-config-path>
B. gcloud deployment-manager deployments update --config <deployment-config-path>
C. gcloud deployment-manager resources create --config <deployment-config-path>
D. gcloud deployment-manager resources update --config <deployment-config-path>
Suggested answer: B

You need to run an important query in BigQuery but expect it to return a lot of records. You want to find out how much it will cost to run the query. You are using on-demand pricing. What should you do?

A. Arrange to switch to Flat-Rate pricing for this query, then move back to on-demand.
B. Use the command line to run a dry run query to estimate the number of bytes read. Then convert that bytes estimate to dollars using the Pricing Calculator.
C. Use the command line to run a dry run query to estimate the number of bytes returned. Then convert that bytes estimate to dollars using the Pricing Calculator.
D. Run a SELECT COUNT(*) to get an idea of how many records your query will look through. Then convert that number of rows to dollars using the Pricing Calculator.
Suggested answer: B

Explanation:

Under on-demand pricing, BigQuery charges for queries by using one metric: the number of bytes processed (also referred to as bytes read). You are charged for the number of bytes processed whether the data is stored in BigQuery or in an external data source such as Cloud Storage, Drive, or Cloud Bigtable. On-demand pricing is based solely on usage. https://cloud.google.com/bigquery/pricing#on_demand_pricing
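
As a sketch, a dry run reports the bytes a query would process without executing it (and without charge), and the dollar conversion is simple arithmetic. The table name, the byte count, and the $5-per-TiB rate below are assumptions for illustration; check current BigQuery on-demand pricing.

```shell
# Dry run (commented out: requires BigQuery access). It prints something like:
#   "... running this query will process 3500000000000 bytes of data."
# bq query --use_legacy_sql=false --dry_run 'SELECT * FROM `mydataset.videos`'

BYTES=3500000000000   # taken from the dry-run output above (illustrative)
PRICE_PER_TIB=5       # assumed on-demand rate in $/TiB; verify current pricing
awk -v b="$BYTES" -v p="$PRICE_PER_TIB" \
    'BEGIN { printf "%.2f\n", b / 1099511627776 * p }'   # prints 15.92
```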

You have a single binary application that you want to run on Google Cloud Platform. You decided to automatically scale the application based on underlying infrastructure CPU usage. Your organizational policies require you to use virtual machines directly. You need to ensure that the application scaling is operationally efficient and completed as quickly as possible. What should you do?

A. Create a Google Kubernetes Engine cluster, and use horizontal pod autoscaling to scale the application.
B. Create an instance template, and use the template in a managed instance group with autoscaling configured.
C. Create an instance template, and use the template in a managed instance group that scales up and down based on the time of day.
D. Use a set of third-party tools to build automation around scaling the application up and down, based on Stackdriver CPU usage monitoring.
Suggested answer: B

Explanation:

Managed instance groups offer autoscaling capabilities that let you automatically add or delete instances based on increases or decreases in load (CPU utilization in this case). Autoscaling helps your apps gracefully handle increases in traffic and reduces costs when the need for resources is lower. You define the autoscaling policy, and the autoscaler performs automatic scaling based on the measured load: it adds instances to the group when there is more load (scaling out) and deletes instances when the need for them drops (scaling in). Ref: https://cloud.google.com/compute/docs/autoscaler
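
A sketch of answer B: template, then managed instance group, then CPU-based autoscaling. All names, the zone, the image, and the 60% target are illustrative choices, not from the source.

```shell
# Create an instance template for the single binary application
gcloud compute instance-templates create my-app-template \
    --machine-type=e2-medium \
    --image-family=debian-11 --image-project=debian-cloud

# Create a managed instance group from the template
gcloud compute instance-groups managed create my-app-mig \
    --template=my-app-template --size=2 --zone=us-central1-a

# Autoscale on average CPU utilization across the group
gcloud compute instance-groups managed set-autoscaling my-app-mig \
    --zone=us-central1-a \
    --min-num-replicas=2 --max-num-replicas=10 \
    --target-cpu-utilization=0.60
```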

You are analyzing Google Cloud Platform service costs from three separate projects. You want to use this information to create service cost estimates by service type, daily and monthly, for the next six months using standard query syntax. What should you do?

A. Export your bill to a Cloud Storage bucket, and then import into Cloud Bigtable for analysis.
B. Export your bill to a Cloud Storage bucket, and then import into Google Sheets for analysis.
C. Export your transactions to a local file, and perform analysis with a desktop tool.
D. Export your bill to a BigQuery dataset, and then write time window-based SQL queries for analysis.
Suggested answer: D

Explanation:

'...we recommend that you enable Cloud Billing data export to BigQuery at the same time that you create a Cloud Billing account.' https://cloud.google.com/billing/docs/how-to/export-data-bigquery

https://medium.com/google-cloud/analyzing-google-cloud-billing-data-with-big-query-30bae1c2aae4
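
Once billing export to BigQuery is enabled, per-service daily costs can be aggregated with standard SQL, as a sketch. The project, dataset, and export table names follow the export's naming convention but are placeholders here.

```shell
# Aggregate exported billing data by service and day (standard query syntax)
bq query --use_legacy_sql=false '
SELECT
  service.description AS service,
  DATE(usage_start_time) AS usage_day,
  SUM(cost) AS daily_cost
FROM `my_project.billing_export.gcp_billing_export_v1_XXXXXX`
GROUP BY service, usage_day
ORDER BY usage_day, daily_cost DESC'
```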

You need to set up a policy so that videos stored in a specific Cloud Storage Regional bucket are moved to Coldline after 90 days, and then deleted after one year from their creation. How should you set up the policy?

A. Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 275 days (365 - 90).
B. Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 365 days.
C. Use gsutil rewrite and set the Delete action to 275 days (365 - 90).
D. Use gsutil rewrite and set the Delete action to 365 days.
Suggested answer: A

Explanation:

https://cloud.google.com/storage/docs/lifecycle#setstorageclass-cost

Note: the object's time spent at the original storage class counts toward any minimum storage duration that applies for the new storage class.
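
A lifecycle policy combining the two actions looks roughly like the following; the bucket name is a placeholder. The ages shown are illustrative: the documentation describes the Age condition as measured from object creation time, so deletion one year after creation corresponds to age 365; verify the age values against the current lifecycle documentation before applying.

```shell
# Illustrative Object Lifecycle Management policy: move to Coldline, then delete
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

# Apply the policy to the Regional bucket holding the videos
gsutil lifecycle set lifecycle.json gs://my-video-bucket
```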

You have a Linux VM that must connect to Cloud SQL. You created a service account with the appropriate access rights. You want to make sure that the VM uses this service account instead of the default Compute Engine service account. What should you do?

A. When creating the VM via the web console, specify the service account under the 'Identity and API Access' section.
B. Download a JSON private key for the service account. On the Project Metadata, add that JSON as the value for the key compute-engine-service-account.
C. Download a JSON private key for the service account. On the Custom Metadata of the VM, add that JSON as the value for the key compute-engine-service-account.
D. Download a JSON private key for the service account. After creating the VM, ssh into the VM and save the JSON under ~/.gcloud/compute-engine-service-account.json.
Suggested answer: A

Explanation:

https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances

Changing the service account and access scopes for an instance: if you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created instead of the Compute Engine default service account. However, Google recommends that you use fine-grained IAM policies instead of relying on access scopes to control resource access for the service account. To change an instance's service account and access scopes, the instance must be temporarily stopped; after making the change, remember to restart the instance.
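
The CLI equivalent of answer A can be sketched as follows, both at creation time and for an existing (stopped) VM. The instance name, zone, and service account address are placeholders.

```shell
# At creation time: attach the user-managed service account directly
gcloud compute instances create my-vm \
    --zone=us-central1-a \
    --service-account=sql-client@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform

# For an existing VM: it must be stopped before the identity can change
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances set-service-account my-vm \
    --zone=us-central1-a \
    --service-account=sql-client@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform
gcloud compute instances start my-vm --zone=us-central1-a
```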

You created an instance of SQL Server 2017 on Compute Engine to test features in the new version. You want to connect to this instance using the fewest number of steps. What should you do?

A. Install an RDP client on your desktop. Verify that a firewall rule for port 3389 exists.
B. Install an RDP client on your desktop. Set a Windows username and password in the GCP Console. Use the credentials to log in to the instance.
C. Set a Windows password in the GCP Console. Verify that a firewall rule for port 22 exists. Click the RDP button in the GCP Console and supply the credentials to log in.
D. Set a Windows username and password in the GCP Console. Verify that a firewall rule for port 3389 exists. Click the RDP button in the GCP Console, and supply the credentials to log in.
Suggested answer: D

Explanation:

https://cloud.google.com/compute/docs/instances/connecting-to-windows#remote-desktop-connection-app

https://cloud.google.com/compute/docs/instances/windows/generating-credentials

https://cloud.google.com/compute/docs/instances/connecting-to-windows#before-you-begin
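
The same steps can also be done from the CLI, as a sketch; the instance name, zone, and username are placeholders.

```shell
# Generate (or reset) Windows credentials for the instance
gcloud compute reset-windows-password sql-server-vm \
    --zone=us-central1-a --user=admin

# Verify a firewall rule allowing RDP exists (look for tcp:3389 in the output)
gcloud compute firewall-rules list
```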

Total 289 questions