
Google Associate Data Practitioner Practice Test - Questions and Answers, Page 8


List of questions

Question 71


You are constructing a data pipeline to process sensitive customer data stored in a Cloud Storage bucket. You need to ensure that this data remains accessible, even in the event of a single-zone outage. What should you do?
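For context on zone redundancy, here is a minimal sketch (google-cloud-storage Python client; the bucket name is a placeholder) that inspects a bucket's location type, which governs how the data is replicated across zones and regions:

```python
# Minimal sketch: inspect a bucket's location type with the Cloud Storage
# Python client. "my-sensitive-data-bucket" is a placeholder name.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-sensitive-data-bucket")

# location_type is "region", "dual-region", or "multi-region"; all of them
# store objects redundantly across multiple zones within each region.
print(bucket.name, bucket.location, bucket.location_type)
```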


Question 72


Your retail company collects customer data from various sources:

You are designing a data pipeline to extract this data. Which Google Cloud storage system(s) should you select for further analysis and ML model training?


Question 73


You are storing data in Cloud Storage for a machine learning project. The data is frequently accessed during the model training phase, minimally accessed after 30 days, and unlikely to be accessed after 90 days. You need to choose the appropriate storage class for the different stages of the project to minimize cost. What should you do?
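Once storage classes are chosen for each stage, one way to automate the transitions is Object Lifecycle Management. Below is a minimal sketch with the google-cloud-storage Python client, using the 30- and 90-day thresholds from the scenario; the bucket name and target classes are illustrative assumptions, not the graded answer:

```python
# Minimal sketch: lifecycle rules that move objects to colder storage
# classes as access drops off. "ml-training-data" is a placeholder bucket,
# and the target classes are illustrative choices.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("ml-training-data")

# After 30 days: transition objects out of Standard to a colder class.
bucket.add_lifecycle_set_storage_class_rule("NEARLINE", age=30)
# After 90 days: transition to an archival class for rarely read objects.
bucket.add_lifecycle_set_storage_class_rule("ARCHIVE", age=90)

bucket.patch()  # persist the updated lifecycle configuration
```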


Question 74


You need to design a data pipeline to process large volumes of raw server log data stored in Cloud Storage. The data needs to be cleaned, transformed, and aggregated before being loaded into BigQuery for analysis. The transformation involves complex data manipulation using Spark scripts that your team developed. You need to implement a solution that leverages your team's existing skillset, processes data at scale, and minimizes cost. What should you do?
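If the team's existing PySpark scripts were run on a serverless Spark service such as Dataproc Serverless (one plausible fit for a Spark-skilled team), a batch job could be submitted with the google-cloud-dataproc Python client. The project ID, region, and gs:// paths below are placeholders:

```python
# Minimal sketch: submit an existing PySpark script as a Dataproc Serverless
# batch. Project ID, region, and gs:// paths are placeholders.
from google.cloud import dataproc_v1

region = "us-central1"
client = dataproc_v1.BatchControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

batch = dataproc_v1.Batch(
    pyspark_batch=dataproc_v1.PySparkBatch(
        main_python_file_uri="gs://my-bucket/jobs/clean_logs.py",
        args=["--input=gs://my-bucket/raw-logs/*",
              "--output_table=analytics.server_logs"],
    )
)

operation = client.create_batch(
    parent=f"projects/my-project/locations/{region}", batch=batch
)
response = operation.result()  # blocks until the batch finishes
print(response.state)
```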


Question 75


Your organization is building a new application on Google Cloud. Several data files will need to be stored in Cloud Storage. Your organization has approved only two specific cloud regions where these data files can reside. You need to determine a Cloud Storage bucket strategy that includes automated high availability. What should you do?
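If a dual-region bucket were used (one way to keep data in exactly two regions with automated, geo-redundant availability), it can be created with the google-cloud-storage Python client. The bucket name and the predefined "NAM4" pairing (us-central1 + us-east1) are placeholders; a custom pair of approved regions would instead be set through the bucket's data_locations configuration:

```python
# Minimal sketch: create a dual-region bucket so each object is stored
# redundantly in exactly two regions. The bucket name and the predefined
# "NAM4" pairing are placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.create_bucket("approved-two-region-data", location="NAM4")
print(bucket.location, bucket.location_type)  # e.g. NAM4 dual-region
```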


Question 76


You are designing an application that will interact with several BigQuery datasets. You need to grant the application's service account permissions that allow it to query and update tables within the datasets, and to list all datasets in the project from within your application. You want to follow the principle of least privilege. Which predefined IAM role(s) should you apply to the service account?
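Whichever predefined roles you settle on, they must cover three concrete operations. Here is a sketch of those operations with the google-cloud-bigquery Python client (dataset and table names are placeholders):

```python
# Sketch of what the service account needs to do: list datasets, run a
# query job, and update a table. Dataset/table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# 1. List all datasets in the project (requires dataset-listing permission).
for dataset in client.list_datasets():
    print(dataset.dataset_id)

# 2. Run a query (requires permission to create query jobs and read table data).
rows = client.query("SELECT COUNT(*) AS n FROM `my_dataset.my_table`").result()

# 3. Update a table (requires table update permission).
table = client.get_table("my_dataset.my_table")
table.description = "Updated by the application"
client.update_table(table, ["description"])
```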


Question 77


Your company is setting up an enterprise business intelligence platform. You need to limit data access between many different teams while following the Google-recommended approach. What should you do first?
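Whatever the first organizational step is, per-team access is ultimately expressed as grants on individual resources. As one illustration, here is a sketch that gives a single team's Google group read access to one BigQuery dataset using the google-cloud-bigquery Python client (the group and dataset IDs are placeholders):

```python
# Minimal sketch: grant one team's Google group read access to a single
# dataset, leaving other teams without access by default.
# Group and dataset IDs are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("sales_reporting")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="sales-analysts@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```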


Question 78


Your company is adopting BigQuery as their data warehouse platform. Your team has experienced Python developers. You need to recommend a fully-managed tool to build batch ETL processes that extract data from various source systems, transform the data using a variety of Google Cloud services, and load the transformed data into BigQuery. You want this tool to leverage your team's Python skills. What should you do?
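If a managed orchestration service built on Python were recommended (for example Cloud Composer, which runs Apache Airflow), the batch ETL steps would be authored as a DAG. A minimal sketch follows; the task bodies, IDs, and schedule are placeholders:

```python
# Minimal sketch: an Airflow DAG expressing a batch extract/transform/load
# flow in Python. Task bodies and names are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():      # pull data from the various source systems
    ...

def transform():    # call other Google Cloud services as needed
    ...

def load():         # load the transformed data into BigQuery
    ...

with DAG(dag_id="batch_etl_to_bigquery",
         start_date=datetime(2024, 1, 1),
         schedule_interval="@daily",
         catchup=False) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```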


Question 79


You need to create a data pipeline for a new application. Your application will stream data that needs to be enriched and cleaned. Eventually, the data will be used to train machine learning models. You need to determine the appropriate data manipulation methodology and which Google Cloud services to use in this pipeline. What should you choose?
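One common shape for a streaming enrich-and-clean pipeline on Google Cloud is Pub/Sub feeding an Apache Beam job (for example on Dataflow) that writes to BigQuery. A minimal sketch; the subscription, destination table, and transform logic are assumptions, and the destination table is assumed to already exist:

```python
# Minimal sketch: a streaming Apache Beam pipeline that reads events from
# Pub/Sub, cleans/enriches them, and writes rows to an existing BigQuery
# table. Subscription, table, and parsing logic are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def clean_and_enrich(message: bytes) -> dict:
    record = json.loads(message.decode("utf-8"))
    record["source"] = "app-stream"  # example enrichment field
    return record

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "CleanEnrich" >> beam.Map(clean_and_enrich)
        | "Write" >> beam.io.WriteToBigQuery("my-project:ml_data.training_events")
    )
```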


Question 80


You need to transfer approximately 300 TB of data from your company's on-premises data center to Cloud Storage. You have 100 Mbps internet bandwidth, and the transfer needs to be completed as quickly as possible. What should you do?
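The key fact this question turns on is how long 300 TB takes over a 100 Mbps link. A quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope: how long would 300 TB take over a 100 Mbps link
# at 100% utilization (real-world throughput would be lower)?
data_bits = 300e12 * 8          # 300 TB expressed in bits
line_rate = 100e6               # 100 Mbps in bits per second

seconds = data_bits / line_rate
days = seconds / 86_400
print(f"{days:.0f} days at full line rate")   # roughly 278 days
```

Even at full line utilization, the online transfer would take roughly nine months, which is what makes offline options worth weighing against a network-based transfer.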

Total 107 questions