Google Associate Cloud Engineer Practice Test - Questions Answers, Page 16


You are working with a user to set up an application in a new VPC behind a firewall. The user is concerned about data egress. You want to configure the fewest open egress ports. What should you do?

A. Set up a low-priority (65534) rule that blocks all egress and a high-priority rule (1000) that allows only the appropriate ports.
B. Set up a high-priority (1000) rule that pairs both ingress and egress ports.
C. Set up a high-priority (1000) rule that blocks all egress and a low-priority (65534) rule that allows only the appropriate ports.
D. Set up a high-priority (1000) rule to allow the appropriate ports.
Suggested answer: A

Explanation:

Implied rules: every VPC network has two implied firewall rules. These rules exist but are not shown in the Cloud Console:

Implied allow egress rule. An egress rule whose action is allow, destination is 0.0.0.0/0, and priority is the lowest possible (65535) lets any instance send traffic to any destination, except for traffic blocked by Google Cloud. A higher-priority firewall rule may restrict outbound access. Internet access is allowed if no other firewall rules deny outbound traffic and if the instance has an external IP address or uses a Cloud NAT instance. For more information, see Internet access requirements.

Implied deny ingress rule. An ingress rule whose action is deny, source is 0.0.0.0/0, and priority is the lowest possible (65535) protects all instances by blocking incoming connections to them. A higher-priority rule might allow incoming access. The default network includes some additional rules that override this one, allowing certain types of incoming connections.

https://cloud.google.com/vpc/docs/firewalls#default_firewall_rules
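The deny-all-plus-allow-list pattern from answer A can be sketched with two gcloud firewall rules. The network name and allowed port below are placeholders chosen for illustration:

```shell
# Block all egress at the lowest user-configurable priority (65534).
gcloud compute firewall-rules create deny-all-egress \
    --network=my-vpc \
    --direction=EGRESS \
    --action=DENY \
    --rules=all \
    --destination-ranges=0.0.0.0/0 \
    --priority=65534

# Allow only the ports the application actually needs, at a higher priority (1000).
gcloud compute firewall-rules create allow-app-egress \
    --network=my-vpc \
    --direction=EGRESS \
    --action=ALLOW \
    --rules=tcp:443 \
    --destination-ranges=0.0.0.0/0 \
    --priority=1000
```

Because lower priority numbers win, the allow rule at 1000 is evaluated before the deny rule at 65534, so only the listed ports stay open.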

Your company runs its Linux workloads on Compute Engine instances. Your company will be working with a new operations partner that does not use Google Accounts. You need to grant access to the instances to your operations partner so they can maintain the installed tooling. What should you do?

A. Enable Cloud IAP for the Compute Engine instances, and add the operations partner as a Cloud IAP Tunnel User.
B. Tag all the instances with the same network tag. Create a firewall rule in the VPC to grant TCP access on port 22 for traffic from the operations partner to instances with the network tag.
C. Set up Cloud VPN between your Google Cloud VPC and the internal network of the operations partner.
D. Ask the operations partner to generate SSH key pairs, and add the public keys to the VM instances.
Suggested answer: D

Explanation:

IAP controls access to your App Engine apps and Compute Engine VMs running on Google Cloud. It leverages user identity and the context of a request to determine if a user should be allowed access. IAP is a building block toward BeyondCorp, an enterprise security model that enables employees to work from untrusted networks without using a VPN.

By default, IAP uses Google identities and IAM. By leveraging Identity Platform instead, you can authenticate users with a wide range of external identity providers, such as:

Email/password

OAuth (Google, Facebook, Twitter, GitHub, Microsoft, etc.)

SAML

OIDC

Phone number

Custom

Anonymous

This is useful if your application is already using an external authentication system, and migrating your users to Google accounts is impractical.

https://cloud.google.com/iap/docs/using-tcp-forwarding#grant-permission
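Answer D can be sketched with instance metadata. The instance name, zone, and username below are placeholders; note that `add-metadata` replaces any existing instance-level `ssh-keys` value, so in practice you would merge keys rather than overwrite them:

```shell
# The partner generates a key pair locally; the private key never leaves their machine.
ssh-keygen -t rsa -f ./partner_key -C partner

# The ssh-keys metadata format is USERNAME:PUBLIC_KEY, where USERNAME becomes
# the Linux login account on the VM.
echo "partner:$(cat ./partner_key.pub)" > ssh_keys.txt
gcloud compute instances add-metadata my-instance \
    --zone=us-central1-a \
    --metadata-from-file ssh-keys=ssh_keys.txt
```

The partner can then connect with `ssh -i ./partner_key partner@INSTANCE_IP`, with no Google Account involved.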

You have created a code snippet that should be triggered whenever a new file is uploaded to a Cloud Storage bucket. You want to deploy this code snippet. What should you do?

A. Use App Engine and configure Cloud Scheduler to trigger the application using Pub/Sub.
B. Use Cloud Functions and configure the bucket as a trigger resource.
C. Use Google Kubernetes Engine and configure a CronJob to trigger the application using Pub/Sub.
D. Use Dataflow as a batch job, and configure the bucket as a data source.
Suggested answer: B

Explanation:

Google Cloud Storage Triggers

Cloud Functions can respond to change notifications emerging from Google Cloud Storage. These notifications can be configured to trigger in response to various events inside a bucket: object creation, deletion, archiving, and metadata updates.

Note: Cloud Functions can only be triggered by Cloud Storage buckets in the same Google Cloud Platform project.

Event types

Cloud Storage events used by Cloud Functions are based on Cloud Pub/Sub Notifications for Google Cloud Storage and can be configured in a similar way.

Supported trigger type values are:

google.storage.object.finalize

google.storage.object.delete

google.storage.object.archive

google.storage.object.metadataUpdate

Object Finalize

Trigger type value: google.storage.object.finalize

This event is sent when a new object is created (or an existing object is overwritten, and a new generation of that object is created) in the bucket.

https://cloud.google.com/functions/docs/calling/storage#event_types
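A deployment of answer B can be sketched as follows. The function name, runtime, entry point, and bucket are placeholders, and the function source is assumed to be in the current directory:

```shell
# Deploy a (1st gen) Cloud Function that runs on every new or overwritten
# object in the bucket, via the google.storage.object.finalize event.
gcloud functions deploy process-upload \
    --runtime=python310 \
    --entry-point=process_upload \
    --trigger-event=google.storage.object.finalize \
    --trigger-resource=my-upload-bucket
```

Remember the note above: the bucket must live in the same project as the function.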

You have been asked to set up Object Lifecycle Management for objects stored in storage buckets. The objects are written once and accessed frequently for 30 days. After 30 days, the objects are not read again unless there is a special need. The object should be kept for three years, and you need to minimize cost. What should you do?

A. Set up a policy that uses Nearline storage for 30 days and then moves to Archive storage for three years.
B. Set up a policy that uses Standard storage for 30 days and then moves to Archive storage for three years.
C. Set up a policy that uses Nearline storage for 30 days, then moves to Coldline for one year, and then moves to Archive storage for two years.
D. Set up a policy that uses Standard storage for 30 days, then moves to Coldline for one year, and then moves to Archive storage for two years.
Suggested answer: B

Explanation:

The key to understanding the requirement is: 'The objects are written once and accessed frequently for 30 days.'

Standard Storage

Standard Storage is best for data that is frequently accessed ('hot' data) and/or stored for only brief periods of time.

Archive Storage

Archive Storage is the lowest-cost, highly durable storage service for data archiving, online backup, and disaster recovery. Unlike the 'coldest' storage services offered by other Cloud providers, your data is available within milliseconds, not hours or days. Archive Storage is the best choice for data that you plan to access less than once a year.

https://cloud.google.com/storage/docs/storage-classes#standard
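With the bucket's default class set to Standard, answer B maps onto a lifecycle configuration like the sketch below. The bucket name is a placeholder, and the Delete rule at 1,095 days (three years) is an assumption about what should happen once the retention period ends:

```shell
# lifecycle.json: objects start in Standard (the bucket default),
# move to Archive after 30 days, and are deleted after three years.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 1095}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://my-bucket
```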

You are storing sensitive information in a Cloud Storage bucket. For legal reasons, you need to be able to record all requests that read any of the stored data. You want to make sure you comply with these requirements. What should you do?

A. Enable the Identity Aware Proxy API on the project.
B. Scan the bucket using the Data Loss Prevention API.
C. Allow only a single Service Account access to read the data.
D. Enable Data Access audit logs for the Cloud Storage API.
Suggested answer: D

Explanation:

Logged information: within Cloud Audit Logs, there are two types of logs:

Admin Activity logs: Entries for operations that modify the configuration or metadata of a project, bucket, or object.

Data Access logs: Entries for operations that modify objects or read a project, bucket, or object. There are several sub-types of Data Access logs:

ADMIN_READ: Entries for operations that read the configuration or metadata of a project, bucket, or object.

DATA_READ: Entries for operations that read an object.

DATA_WRITE: Entries for operations that create or modify an object.

https://cloud.google.com/storage/docs/audit-logs#types
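Enabling Data Access logs for Cloud Storage is done through the project's IAM audit configuration. A sketch of the relevant `auditConfigs` fragment is below; in practice you would fetch the current policy with `gcloud projects get-iam-policy`, merge this entry in, and re-apply it with `gcloud projects set-iam-policy`:

```shell
# auditConfigs fragment enabling Data Access logs for Cloud Storage;
# DATA_READ is what records every read of object data.
cat <<'EOF'
"auditConfigs": [
  {
    "service": "storage.googleapis.com",
    "auditLogConfigs": [
      {"logType": "ADMIN_READ"},
      {"logType": "DATA_READ"},
      {"logType": "DATA_WRITE"}
    ]
  }
]
EOF
```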

You are the team lead of a group of 10 developers. You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment. What should you do?

A. Create a single budget for all projects and configure budget alerts on this budget.
B. Create a separate billing account per sandbox project and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per billing account.
C. Create a budget per project and configure budget alerts on all of these budgets.
D. Create a single billing account for all sandbox projects and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per project.
Suggested answer: C

Explanation:

Set budgets and budget alerts.

Overview: avoid surprises on your bill by creating Cloud Billing budgets to monitor all of your Google Cloud charges in one place. A budget enables you to track your actual Google Cloud spend against your planned spend. After you've set a budget amount, you set budget alert threshold rules that are used to trigger email notifications. Budget alert emails help you stay informed about how your spend is tracking against your budget.

Set budget scope: set the budget Scope and then click Next. In the Projects field, select one or more projects that you want to apply the budget alert to. To apply the budget alert to all the projects in the Cloud Billing account, choose Select all.

https://cloud.google.com/billing/docs/how-to/budgets#budget-scop
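Budgets can also be created per project from the command line. The sketch below assumes the `gcloud billing budgets` command group; the billing account ID, project, and display name are placeholders, and you would repeat this once per sandbox project:

```shell
# One $500 budget scoped to a single sandbox project, with email alerts
# at 90% and 100% of the budget amount.
gcloud billing budgets create \
    --billing-account=0X0X0X-0X0X0X-0X0X0X \
    --display-name="sandbox-alice" \
    --budget-amount=500USD \
    --filter-projects=projects/123456789 \
    --threshold-rule=percent=0.9 \
    --threshold-rule=percent=1.0
```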

You are deploying a production application on Compute Engine. You want to prevent anyone from accidentally destroying the instance by clicking the wrong button. What should you do?

A. Disable the flag "Delete boot disk when instance is deleted."
B. Enable delete protection on the instance.
C. Disable Automatic restart on the instance.
D. Enable Preemptibility on the instance.
Suggested answer: B

Explanation:


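Deletion protection blocks any attempt to delete the instance, from the console or the API, until the flag is explicitly cleared; a preemptible instance, by contrast, can be terminated at any time, which is the opposite of what is wanted here. Toggling the flag can be sketched with gcloud (instance name and zone are placeholders):

```shell
# Enable deletion protection; delete requests will fail until it is turned off.
gcloud compute instances update my-prod-instance \
    --zone=us-central1-a \
    --deletion-protection

# Later, to allow deletion again:
gcloud compute instances update my-prod-instance \
    --zone=us-central1-a \
    --no-deletion-protection
```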
Your company uses a large number of Google Cloud services centralized in a single project. All teams have specific projects for testing and development. The DevOps team needs access to all of the production services in order to perform their job. You want to prevent Google Cloud product changes from broadening their permissions in the future. You want to follow Google-recommended practices. What should you do?

A. Grant all members of the DevOps team the role of Project Editor on the organization level.
B. Grant all members of the DevOps team the role of Project Editor on the production project.
C. Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the production project.
D. Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the organization level.
Suggested answer: C

Explanation:


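A custom role pins the grant to an explicit permission list, so later changes to Google's predefined roles cannot broaden it. A sketch of answer C, with role ID, permissions, project, and group all as placeholders:

```shell
# Create a custom role from an explicit permission list.
gcloud iam roles create devOpsProd \
    --project=prod-project \
    --title="DevOps Production" \
    --permissions=compute.instances.get,compute.instances.start,compute.instances.stop

# Grant it to the DevOps team on the production project only.
gcloud projects add-iam-policy-binding prod-project \
    --member="group:devops@example.com" \
    --role="projects/prod-project/roles/devOpsProd"
```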
You are building an application that processes data files uploaded from thousands of suppliers. Your primary goals for the application are data security and the expiration of aged data. You need to design the application to:

* Restrict access so that suppliers can access only their own data.

* Give suppliers write access to data only for 30 minutes.

* Delete data that is over 45 days old.

You have a very short development cycle, and you need to make sure that the application requires minimal maintenance. Which two strategies should you use? (Choose two.)


A. Build a lifecycle policy to delete Cloud Storage objects after 45 days.
B. Use signed URLs to allow suppliers limited time access to store their objects.
C. Set up an SFTP server for your application, and create a separate user for each supplier.
D. Build a Cloud function that triggers a timer of 45 days to delete objects that have expired.
E. Develop a script that loops through all Cloud Storage buckets and deletes any buckets that are older than 45 days.
Suggested answer: A, B

Explanation:

(A) Object Lifecycle Management

Delete

The Delete action deletes an object when the object meets all conditions specified in the lifecycle rule.

Exception: In buckets with Object Versioning enabled, deleting the live version of an object causes it to become a noncurrent version, while deleting a noncurrent version deletes that version permanently.

https://cloud.google.com/storage/docs/lifecycle#delete
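The 45-day expiration from answer A needs no code of your own, only a lifecycle rule. The bucket name below is a placeholder:

```shell
# lifecycle.json: delete any object older than 45 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 45}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://my-supplier-bucket
```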

(B) Signed URLs

This page provides an overview of signed URLs, which you use to give time-limited resource access to anyone in possession of the URL, regardless of whether they have a Google account

https://cloud.google.com/storage/docs/access-control/signed-urls
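Answer B's 30-minute write window can be sketched with `gsutil signurl`. The service account key file, bucket, and per-supplier object path are placeholders; each supplier gets a URL scoped to their own prefix, which also covers the "only their own data" requirement:

```shell
# Create a signed URL valid for 30 minutes that lets the holder upload
# (HTTP PUT) exactly one object, without needing a Google account.
gsutil signurl -m PUT -d 30m sa-key.json \
    gs://my-supplier-bucket/supplier-123/data.csv
```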

Your auditor wants to view your organization's use of data in Google Cloud. The auditor is most interested in auditing who accessed data in Cloud Storage buckets. You need to help the auditor access the data they need. What should you do?

A. Assign the appropriate permissions, and then use Cloud Monitoring to review metrics.
B. Use the export logs API to provide the Admin Activity Audit Logs in the format they want.
C. Turn on Data Access Logs for the buckets they want to audit, and then build a query in the log viewer that filters on Cloud Storage.
D. Assign the appropriate permissions, and then create a Data Studio report on Admin Activity Audit Logs.
Suggested answer: C

Explanation:

Types of audit logs: Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization: Admin Activity audit logs, Data Access audit logs, System Event audit logs, and Policy Denied audit logs. Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data.

https://cloud.google.com/logging/docs/audit#types

https://cloud.google.com/logging/docs/audit#data-access Cloud Storage: When Cloud Storage usage logs are enabled, Cloud Storage writes usage data to the Cloud Storage bucket, which generates Data Access audit logs for the bucket. The generated Data Access audit log has its caller identity redacted.
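Once Data Access logs are on, the auditor's query can be run from the log viewer or the CLI. A sketch of the CLI form, with the filter terms as reasonable assumptions about the log and service names:

```shell
# Read recent Data Access audit log entries generated by Cloud Storage.
gcloud logging read \
  'logName:"cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.serviceName="storage.googleapis.com"' \
  --limit=10 \
  --format=json
```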

Total 289 questions