Google Associate Cloud Engineer Practice Test - Questions Answers, Page 3

You have one GCP account running in your default region and zone and another account running in a non-default region and zone. You want to start a new Compute Engine instance in these two Google Cloud Platform accounts using the command line interface. What should you do?

A. Create two configurations using gcloud config configurations create [NAME]. Run gcloud config configurations activate [NAME] to switch between accounts when running the commands to start the Compute Engine instances.
B. Create two configurations using gcloud config configurations create [NAME]. Run gcloud configurations list to start the Compute Engine instances.
C. Activate two configurations using gcloud configurations activate [NAME]. Run gcloud config list to start the Compute Engine instances.
D. Activate two configurations using gcloud configurations activate [NAME]. Run gcloud configurations list to start the Compute Engine instances.
Suggested answer: A

Explanation:

Ref:https://cloud.google.com/sdk/gcloud/reference/config/configurations/activate

Finally, while each configuration is active, you can run gcloud compute instances start [NAME] to start the instance in that configuration's region. https://cloud.google.com/sdk/gcloud/reference/compute/instances/start
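
For illustration, a minimal sketch of the workflow (account, zone, and instance names are hypothetical):

$ gcloud config configurations create account-one   # creating a configuration also activates it
$ gcloud config set account user-one@example.com
$ gcloud config set compute/zone us-central1-a
$ gcloud compute instances create instance-one

$ gcloud config configurations create account-two
$ gcloud config set account user-two@example.com
$ gcloud config set compute/zone europe-west1-b
$ gcloud compute instances create instance-two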

You significantly changed a complex Deployment Manager template and want to confirm that the dependencies of all defined resources are properly met before committing it to the project. You want the most rapid feedback on your changes. What should you do?

A. Use granular logging statements within a Deployment Manager template authored in Python.
B. Monitor activity of the Deployment Manager execution on the Stackdriver Logging page of the GCP Console.
C. Execute the Deployment Manager template against a separate project with the same configuration, and monitor for failures.
D. Execute the Deployment Manager template using the --preview option in the same project, and observe the state of interdependent resources.
Suggested answer: D
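
A quick sketch of the preview workflow (deployment and config file names are hypothetical):

$ gcloud deployment-manager deployments create my-deployment --config config.yaml --preview
# Inspect the previewed resources and their dependency state
$ gcloud deployment-manager deployments describe my-deployment
# Commit the previewed changes once they look correct
$ gcloud deployment-manager deployments update my-deployment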

You are building a pipeline to process time-series data. Which Google Cloud Platform services should you put in boxes 1, 2, 3, and 4?


A. Cloud Pub/Sub, Cloud Dataflow, Cloud Datastore, BigQuery
B. Firebase Messages, Cloud Pub/Sub, Cloud Spanner, BigQuery
C. Cloud Pub/Sub, Cloud Storage, BigQuery, Cloud Bigtable
D. Cloud Pub/Sub, Cloud Dataflow, Cloud Bigtable, BigQuery
Suggested answer: D

Explanation:

https://cloud.google.com/blog/products/data-analytics/handling-duplicate-data-in-streaming-pipeline-using-pubsub-dataflow

https://cloud.google.com/bigtable/docs/schema-design-time-series
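
As a reference, the backing resources for such a pipeline might be provisioned along these lines (resource names, zone, and node count are hypothetical; the flags reflect a recent gcloud release, and the Dataflow job itself would be deployed separately):

$ gcloud pubsub topics create sensor-events
$ gcloud bigtable instances create ts-instance --display-name="Time series" --cluster-config=id=ts-cluster,zone=us-central1-b,nodes=3
$ bq mk --dataset analytics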

You have a project for your App Engine application that serves a development environment. The required testing has succeeded and you want to create a new project to serve as your production environment. What should you do?

A. Use gcloud to create the new project, and then deploy your application to the new project.
B. Use gcloud to create the new project and to copy the deployed application to the new project.
C. Create a Deployment Manager configuration file that copies the current App Engine deployment into a new project.
D. Deploy your application again using gcloud and specify the project parameter with the new project name to create the new project.
Suggested answer: A

Explanation:

You can deploy to a different project by using --project flag.

By default, the service is deployed to the current project, configured via:

$ gcloud config set core/project PROJECT

To override this value for a single deployment, use the --project flag:

$ gcloud app deploy ~/my_app/app.yaml --project=PROJECT

Ref: https://cloud.google.com/sdk/gcloud/reference/app/deploy
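
Putting it together, a minimal sketch (project ID and region are hypothetical):

$ gcloud projects create my-app-prod
$ gcloud app create --project=my-app-prod --region=us-central
$ gcloud app deploy app.yaml --project=my-app-prod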

You need to configure IAM access audit logging in BigQuery for external auditors. You want to follow Google-recommended practices. What should you do?

A. Add the auditors group to the 'logging.viewer' and 'bigQuery.dataViewer' predefined IAM roles.
B. Add the auditors group to two new custom IAM roles.
C. Add the auditor user accounts to the 'logging.viewer' and 'bigQuery.dataViewer' predefined IAM roles.
D. Add the auditor user accounts to two new custom IAM roles.
Suggested answer: A

Explanation:

https://cloud.google.com/iam/docs/job-functions/auditing#scenario_external_auditors

If you add individual users directly to IAM roles, then when someone leaves the organization you must track down and revoke their access in multiple places. If you instead grant the roles to a group, you only need to remove the departing user from the group, and all of their access is revoked at once.

The organization creates a Google group for these external auditors and adds the current auditor to the group. This group is monitored and is typically granted access to the dashboard application. During normal access, the auditors' Google group is only granted access to view the historic logs stored in BigQuery. If any anomalies are discovered, the group is granted permission to view the actual Cloud Logging Admin Activity logs via the dashboard's elevated access mode. At the end of each audit period, the group's access is then revoked. Data is redacted using Cloud DLP before being made accessible for viewing via the dashboard application.
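
For example, granting the two predefined roles to the auditors group could look like this (project ID and group address are hypothetical):

$ gcloud projects add-iam-policy-binding my-project --member='group:auditors@example.com' --role='roles/logging.viewer'
$ gcloud projects add-iam-policy-binding my-project --member='group:auditors@example.com' --role='roles/bigquery.dataViewer'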

You need to set up permissions for a set of Compute Engine instances to enable them to write data into a particular Cloud Storage bucket. You want to follow Google-recommended practices. What should you do?

A. Create a service account with an access scope. Use the access scope 'https://www.googleapis.com/auth/devstorage.write_only'.
B. Create a service account with an access scope. Use the access scope 'https://www.googleapis.com/auth/cloud-platform'.
C. Create a service account and add it to the IAM role 'storage.objectCreator' for that bucket.
D. Create a service account and add it to the IAM role 'storage.objectAdmin' for that bucket.
Suggested answer: C

Explanation:

https://cloud.google.com/iam/docs/understanding-service-accounts#using_service_accounts_with_compute_engine

https://cloud.google.com/storage/docs/access-control/iam-roles
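
A sketch of the setup (service account, project, bucket, and instance names are hypothetical):

# Create the service account
$ gcloud iam service-accounts create bucket-writer
# Grant it objectCreator on the specific bucket only
$ gsutil iam ch serviceAccount:bucket-writer@my-project.iam.gserviceaccount.com:roles/storage.objectCreator gs://my-bucket
# Attach it to the instances; the broad cloud-platform scope leaves IAM in control of access
$ gcloud compute instances create my-instance --service-account=bucket-writer@my-project.iam.gserviceaccount.com --scopes=cloud-platform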

You have sensitive data stored in three Cloud Storage buckets and have enabled data access logging. You want to verify activities for a particular user for these buckets, using the fewest possible steps. You need to verify the addition of metadata labels and which files have been viewed from those buckets. What should you do?

A. Using the GCP Console, filter the Activity log to view the information.
B. Using the GCP Console, filter the Stackdriver log to view the information.
C. View the bucket in the Storage section of the GCP Console.
D. Create a trace in Stackdriver to view the information.
Suggested answer: A

Explanation:

https://cloud.google.com/storage/docs/audit-logs

https://cloud.google.com/compute/docs/logging/audit-logging#audited_operations
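
The same information can also be pulled from the command line with a Data Access audit-log filter (the user email is hypothetical):

$ gcloud logging read 'logName:"cloudaudit.googleapis.com%2Fdata_access" AND resource.type="gcs_bucket" AND protoPayload.authenticationInfo.principalEmail="user@example.com"' --limit=20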

You are the project owner of a GCP project and want to delegate control to colleagues to manage buckets and files in Cloud Storage. You want to follow Google-recommended practices. Which IAM roles should you grant your colleagues?

A. Project Editor
B. Storage Admin
C. Storage Object Admin
D. Storage Object Creator
Suggested answer: B

Explanation:

Storage Admin (roles/storage.admin) Grants full control of buckets and objects.

When applied to an individual bucket, control applies only to the specified bucket and objects within the bucket.

firebase.projects.get

resourcemanager.projects.get

resourcemanager.projects.list

storage.buckets.*

storage.objects.*

https://cloud.google.com/storage/docs/access-control/iam-roles

Ref:https://cloud.google.com/iam/docs/understanding-roles#storage-roles
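
Granting the role could look like this (project, group, and bucket names are hypothetical); the second form scopes the grant to a single bucket:

$ gcloud projects add-iam-policy-binding my-project --member='group:storage-team@example.com' --role='roles/storage.admin'
$ gsutil iam ch group:storage-team@example.com:roles/storage.admin gs://my-bucket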

You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains sensitive data. You want access to the content to be removed after four hours. The external company does not have a Google account to which you can grant specific user-based access privileges. You want to use the most secure method that requires the fewest steps. What should you do?


A. Create a signed URL with a four-hour expiration and share the URL with the company.
B. Set object access to 'public' and use object lifecycle management to remove the object after four hours.
C. Configure the storage bucket as a static website and furnish the object's URL to the company. Delete the object from the storage bucket after four hours.
D. Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.
Suggested answer: A

Explanation:

Signed URLs are used to give time-limited resource access to anyone in possession of the URL, regardless of whether they have a Google account. https://cloud.google.com/storage/docs/access-control/signed-urls
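
A sketch using gsutil and a service-account private key (key file, bucket, and object names are hypothetical):

$ gsutil signurl -d 4h sa-key.json gs://my-bucket/sensitive-report.csv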

You are creating a Google Kubernetes Engine (GKE) cluster with a cluster autoscaler feature enabled. You need to make sure that each node of the cluster will run a monitoring pod that sends container metrics to a third-party monitoring solution. What should you do?

A. Deploy the monitoring pod in a StatefulSet object.
B. Deploy the monitoring pod in a DaemonSet object.
C. Reference the monitoring pod in a Deployment object.
D. Reference the monitoring pod in a cluster initializer at the GKE cluster creation time.
Suggested answer: B

Explanation:

https://cloud.google.com/kubernetes-engine/docs/concepts/daemonset

https://cloud.google.com/kubernetes-engine/docs/concepts/daemonset#usage_patterns

In GKE, DaemonSets manage groups of replicated Pods and adhere to a one-Pod-per-node model, either across the entire cluster or a subset of nodes. As you add nodes to a node pool, DaemonSets automatically add Pods to the new nodes as needed. So, this is a perfect fit for our monitoring pod.

DaemonSets are useful for deploying ongoing background tasks that you need to run on all or certain nodes, and which do not require user intervention. Examples of such tasks include storage daemons like ceph, log collection daemons like fluentd, and node monitoring daemons like collectd. For example, you could have DaemonSets for each type of daemon run on all of your nodes. Alternatively, you could run multiple DaemonSets for a single type of daemon, but have them use different configurations for different hardware types and resource needs.
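
A minimal sketch of such a DaemonSet, applied with kubectl (the agent image and labels are hypothetical):

$ cat <<EOF | kubectl apply -f -
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: monitoring-agent
spec:
  selector:
    matchLabels:
      app: monitoring-agent
  template:
    metadata:
      labels:
        app: monitoring-agent
    spec:
      containers:
      - name: agent
        image: example.com/monitoring-agent:latest  # hypothetical third-party agent
EOF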
