Google Associate Cloud Engineer Practice Test - Questions Answers, Page 12

Your management has asked an external auditor to review all the resources in a specific project. The security team has enabled the Organization Policy called Domain Restricted Sharing on the organization node by specifying only your Cloud Identity domain. You want the auditor to only be able to view, but not modify, the resources in that project. What should you do?

A. Ask the auditor for their Google account, and give them the Viewer role on the project.
B. Ask the auditor for their Google account, and give them the Security Reviewer role on the project.
C. Create a temporary account for the auditor in Cloud Identity, and give that account the Viewer role on the project.
D. Create a temporary account for the auditor in Cloud Identity, and give that account the Security Reviewer role on the project.
Suggested answer: C

Explanation:

The Domain Restricted Sharing organization policy limits IAM grants to identities from the allowed Cloud Identity domains, so the auditor's own external Google account cannot be granted a role on the project. Creating a temporary account in your Cloud Identity domain and granting it the Viewer role gives the auditor read-only access to the project's resources.

The Viewer role is one of the primitive roles that can be granted on a project. Avoid using primitive roles except when absolutely necessary; they are very powerful and include a large number of permissions across all Google Cloud services. For details on when you should use primitive roles, see the Identity and Access Management FAQ. IAM predefined roles are much more granular and allow you to carefully manage the set of permissions your users have access to; see Understanding Roles for a list of roles that can be granted at the project level. Creating custom roles can further increase the control you have over user permissions. https://cloud.google.com/resource-manager/docs/access-control-proj#using_primitive_roles

https://cloud.google.com/iam/docs/understanding-custom-roles
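As a minimal sketch, granting the Viewer role to the temporary account from the CLI could look like this (the project ID and account email are hypothetical placeholders):

# Grant the Viewer role to the temporary Cloud Identity account.
gcloud projects add-iam-policy-binding audit-project \
    --member="user:auditor@your-domain.com" \
    --role="roles/viewer"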

You have a workload running on Compute Engine that is critical to your business. You want to ensure that the data on the boot disk of this workload is backed up regularly. You need to be able to restore a backup as quickly as possible in case of disaster. You also want older backups to be cleaned automatically to save on cost. You want to follow Google-recommended practices. What should you do?

A. Create a Cloud Function to create an instance template.
B. Create a snapshot schedule for the disk using the desired interval.
C. Create a cron job to create a new disk from the disk using gcloud.
D. Create a Cloud Task to create an image and export it to Cloud Storage.
Suggested answer: B

Explanation:

Best practices for persistent disk snapshots

You can create persistent disk snapshots at any time, but you can create snapshots more quickly and with greater reliability if you use the following best practices.

Creating frequent snapshots efficiently

Use snapshots to manage your data efficiently.

Create a snapshot of your data on a regular schedule to minimize data loss due to unexpected failure.

Improve performance by eliminating excessive snapshot downloads and by creating an image and reusing it.

Set your snapshot schedule to off-peak hours to reduce snapshot time.

Snapshot frequency limits: you can snapshot each persistent disk at most once every 10 minutes, and if you want to issue a burst of requests to snapshot your disks, you can issue at most 6 requests in 60 minutes. If the limit is exceeded, the operation fails and returns an error.

https://cloud.google.com/compute/docs/disks/snapshot-best-practices
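A snapshot schedule also satisfies the cleanup requirement, because its retention policy deletes older snapshots automatically. A minimal gcloud sketch, with the schedule name, region, zone, disk name, and retention period as illustrative values:

# Create a snapshot schedule: daily at 04:00, keep snapshots for 14 days.
gcloud compute resource-policies create snapshot-schedule daily-backup \
    --region=us-central1 \
    --daily-schedule \
    --start-time=04:00 \
    --max-retention-days=14

# Attach the schedule to the workload's boot disk.
gcloud compute disks add-resource-policies my-boot-disk \
    --zone=us-central1-a \
    --resource-policies=daily-backup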

You need to assign a Cloud Identity and Access Management (Cloud IAM) role to an external auditor. The auditor needs to have permissions to review your Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs. What should you do?

A. Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.
B. Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy.
C. Assign the auditor's IAM user to a custom role that has logging.privateLogEntries.list permission. Perform the export of logs to Cloud Storage.
D. Assign the auditor's IAM user to a custom role that has logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.
Suggested answer: B

Explanation:

Google Cloud provides Cloud Audit Logs as an integral part of Cloud Logging. It consists of two log streams for each project, Admin Activity and Data Access, which are generated by Google Cloud services to help you answer the question of "who did what, where, and when?" within your Google Cloud projects. The roles/logging.privateLogViewer role includes the Logs Viewer permissions plus access to Data Access logs, and changes to Cloud IAM policy are recorded in the Admin Activity logs, which the auditor should review as well.

Ref:https://cloud.google.com/iam/docs/job-functions/auditing#scenario_external_auditors
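As an illustrative sketch, the auditor could find IAM policy changes with a Logs Explorer filter like the following (the project ID is a placeholder):

# Admin Activity audit log entries that record IAM policy changes.
logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName="SetIamPolicy"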

You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects. What should you do?

A. Navigate to Stackdriver Logging and select resource.labels.project_id='*'
B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
C. Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.
D. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.
Suggested answer: B

Explanation:

Ref:https://cloud.google.com/logging/docs/export/aggregated_sinks

With an aggregated export sink at the organization level, the logs from all projects land in a single BigQuery dataset. Querying a BigQuery dataset is easier and quicker than analyzing contents in a Cloud Storage bucket, and since the requirement is to quickly analyze the log contents, BigQuery is preferred over Cloud Storage.

Also, you can control storage costs and optimize storage usage by setting the default table expiration for newly created tables in a dataset. If you set the property when the dataset is created, any table created in the dataset is deleted after the expiration period. If you set the property after the dataset is created, only new tables are deleted after the expiration period. For example, if you set the default table expiration to 7 days, older data is automatically deleted after 1 week. Ref: https://cloud.google.com/bigquery/docs/best-practices-storage
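A rough sketch with gcloud and bq; the organization ID, sink name, project, and dataset names are hypothetical:

# Create an aggregated sink that exports logs from all projects
# in the organization to a single BigQuery dataset.
gcloud logging sinks create all-projects-logs \
    bigquery.googleapis.com/projects/log-project/datasets/all_logs \
    --organization=123456789012 \
    --include-children

# Expire tables after 60 days (the value is in seconds).
bq update --default_table_expiration 5184000 log-project:all_logs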

You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing GCP project. What should you do?

A. 1. Verify that you are assigned the Project Owners IAM role for this project. 2. Locate the project in the GCP console, click Shut down and then enter the project ID.
B. 1. Verify that you are assigned the Project Owners IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.
C. 1. Verify that you are assigned the Organizational Administrator IAM role for this project. 2. Locate the project in the GCP console, enter the project ID and then click Shut down.
D. 1. Verify that you are assigned the Organizational Administrators IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.
Suggested answer: A

Explanation:


https://cloud.google.com/resource-manager/docs/creating-managing-projects

https://cloud.google.com/iam/docs/understanding-roles#primitive_roles

You can shut down projects using the Cloud Console. When you shut down a project, the following happens immediately: all billing and traffic serving stops, and you lose access to the project. The owners of the project are notified and can stop the deletion within 30 days, after which the project is scheduled to be deleted; however, some resources may be deleted much earlier.
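For reference, the CLI equivalent of the console's Shut down action is a single command (the project ID is a placeholder):

# Shut down the project; owners can restore it within the 30-day window.
gcloud projects delete my-division-project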

You are configuring service accounts for an application that spans multiple projects. Virtual machines (VMs) running in the web-applications project need access to BigQuery datasets in crm-databases-proj. You want to follow Google-recommended practices to give access to the service account in the web-applications project. What should you do?

A. Give "project owner" for web-applications appropriate roles to crm-databases-proj
B. Give "project owner" role to crm-databases-proj and the web-applications project.
C. Give "project owner" role to crm-databases-proj and bigquery.dataViewer role to web-applications.
D. Give bigquery.dataViewer role to crm-databases-proj and appropriate roles to web-applications.
Suggested answer: C

Explanation:

The bigquery.dataViewer role provides permissions to read a dataset's metadata, list the tables in the dataset, and read data and metadata from the dataset's tables. Granting it on crm-databases-proj to the service account used by the VMs in web-applications is exactly what is needed to fulfil this requirement, and it follows the principle of least privilege.

Ref:https://cloud.google.com/iam/docs/understanding-roles#bigquery-roles
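A minimal sketch of the cross-project grant (the service account email is a hypothetical placeholder):

# In crm-databases-proj, grant BigQuery Data Viewer to the service
# account used by the VMs in web-applications.
gcloud projects add-iam-policy-binding crm-databases-proj \
    --member="serviceAccount:web-app@web-applications.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"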

An employee was terminated, but their access to Google Cloud Platform (GCP) was not removed until 2 weeks later. You need to find out whether this employee accessed any sensitive customer information after their termination. What should you do?

A. View System Event Logs in Stackdriver. Search for the user's email as the principal.
B. View System Event Logs in Stackdriver. Search for the service account associated with the user.
C. View Data Access audit logs in Stackdriver. Search for the user's email as the principal.
D. View the Admin Activity log in Stackdriver. Search for the service account associated with the user.
Suggested answer: C

Explanation:

https://cloud.google.com/logging/docs/audit

Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data.

https://cloud.google.com/logging/docs/audit#data-access
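As a sketch, a Logs Explorer filter scoped to the former employee's activity could look like this (the project ID and email are placeholders):

# Data Access audit log entries recorded for a specific principal.
logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access"
protoPayload.authenticationInfo.principalEmail="former-employee@example.com"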

You need to create a custom IAM role for use with a GCP service. All permissions in the role must be suitable for production use. You also want to clearly share with your organization the status of the custom role. This will be the first version of the custom role. What should you do?

A. Use permissions in your role that use the 'supported' support level for role permissions. Set the role stage to ALPHA while testing the role permissions.
B. Use permissions in your role that use the 'supported' support level for role permissions. Set the role stage to BETA while testing the role permissions.
C. Use permissions in your role that use the 'testing' support level for role permissions. Set the role stage to ALPHA while testing the role permissions.
D. Use permissions in your role that use the 'testing' support level for role permissions. Set the role stage to BETA while testing the role permissions.
Suggested answer: A

Explanation:

When setting support levels for permissions in custom roles, each permission is at one of three levels: SUPPORTED, TESTING, or NOT_SUPPORTED. Permissions at the TESTING level are not suitable for production use, so a production-ready role should only include SUPPORTED permissions. Custom roles also have a launch stage (such as ALPHA, BETA, or GA) that communicates the role's status to your organization; a first version that is still being tested is typically set to ALPHA.

Ref:https://cloud.google.com/iam/docs/custom-roles-permissions-support
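A minimal sketch of creating the first version of such a role (the role ID, project, and permission list are illustrative):

# Create a first-version custom role at launch stage ALPHA.
gcloud iam roles create customAuditor \
    --project=my-project \
    --title="Custom Auditor" \
    --permissions=logging.logEntries.list,storage.buckets.get \
    --stage=ALPHA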

Your company has a large quantity of unstructured data in different file formats. You want to perform ETL transformations on the data. You need to make the data accessible on Google Cloud so it can be processed by a Dataflow job. What should you do?


A. Upload the data to BigQuery using the bq command line tool.
B. Upload the data to Cloud Storage using the gsutil command line tool.
C. Upload the data into Cloud SQL using the import function in the console.
D. Upload the data into Cloud Spanner using the import function in the console.
Suggested answer: B

Explanation:

The data is a large quantity of unstructured files in different formats, which rules out BigQuery, Cloud SQL, and Cloud Spanner, since those expect structured data. Cloud Storage stores files of any format as objects, and a Dataflow job can read its input directly from a Cloud Storage bucket, so uploading with gsutil makes the data accessible for processing.
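A one-line sketch of the upload (the local path and bucket name are placeholders):

# Recursively upload the files in parallel to a Cloud Storage bucket.
gsutil -m cp -r ./raw-data gs://my-etl-bucket/raw/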

You need to manage multiple Google Cloud Platform (GCP) projects in the fewest steps possible. You want to configure the Google Cloud SDK command line interface (CLI) so that you can easily manage multiple GCP projects. What should you do?

A. 1. Create a configuration for each project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.
B. 1. Create a configuration for each project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.
C. 1. Use the default configuration for one project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.
D. 1. Use the default configuration for one project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.
Suggested answer: A

Explanation:

Named gcloud configurations let you keep a separate set of properties (account, project, region, and so on) for each project and switch between them with a single command, which takes fewer steps than re-running gcloud init for every switch.

https://cloud.google.com/sdk/gcloud

https://cloud.google.com/sdk/docs/configurations#multiple_configurations
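A minimal sketch of the workflow (configuration names and project IDs are placeholders):

# Create one named configuration per project; creating a configuration
# also activates it, so the following set applies to it.
gcloud config configurations create project-a
gcloud config set project project-a-id

gcloud config configurations create project-b
gcloud config set project project-b-id

# Later, switch projects with a single command.
gcloud config configurations activate project-a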
