ExamGecko

Google Associate Data Practitioner Practice Test - Questions Answers, Page 10

List of questions

Question 91

Your company uses Looker as its primary business intelligence platform. You want to use LookML to visualize the profit margin for each of your company's products in your Looker Explores and dashboards. You need to implement a solution quickly and efficiently. What should you do?
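However the LookML is ultimately written (for example, as a measure built from existing revenue and cost measures), the arithmetic it encodes is a simple ratio. A minimal sketch with illustrative numbers:

```python
# Profit margin as a per-product LookML measure would compute it;
# the revenue and cost figures here are illustrative stand-ins.
revenue = 200.0
cost = 150.0

# Guard against division by zero, as a LookML sql: clause might with NULLIF.
margin = (revenue - cost) / revenue if revenue else None
print(f"{margin:.0%}")
```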

Question 92

Your company has an on-premises file server with 5 TB of data that needs to be migrated to Google Cloud. The network operations team has mandated that you can only use up to 250 Mbps of the total available bandwidth for the migration. You need to perform an online migration to Cloud Storage. What should you do?
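Whichever transfer tool is chosen (Storage Transfer Service agent pools, for instance, accept a bandwidth cap), it helps to sanity-check what a 250 Mbps ceiling means for 5 TB. A back-of-the-envelope sketch, assuming decimal units and 8 bits per byte:

```python
# Convert the 250 Mbps network cap into MB/s (the unit bandwidth caps
# are commonly expressed in) and estimate the minimum transfer time
# for 5 TB at that rate. Decimal megabytes/terabytes assumed.
MBPS_CAP = 250                  # megabits per second (network team's cap)
mb_per_sec = MBPS_CAP / 8       # 31.25 MB/s

data_mb = 5 * 1_000_000         # 5 TB expressed in MB
seconds = data_mb / mb_per_sec
hours = seconds / 3600

print(f"Cap: {mb_per_sec} MB/s, minimum transfer time: {hours:.1f} hours")
```

At roughly two days of wall-clock time, an online migration within the cap is feasible for 5 TB.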

Question 93

You work for a gaming company that collects real-time player activity data. This data is streamed into Pub/Sub and needs to be processed and loaded into BigQuery for analysis. The processing involves filtering, enriching, and aggregating the data before loading it into partitioned BigQuery tables. You need to design a pipeline that ensures low latency and high throughput while following a Google-recommended approach. What should you do?
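Google's recommended engine for this kind of streaming work is Dataflow (Apache Beam). Independent of the runner, the filter → enrich → aggregate stages the question describes can be sketched in plain Python; the event fields and the lookup table below are hypothetical:

```python
# Pure-Python sketch of the filter -> enrich -> aggregate steps a
# streaming pipeline would apply to player events before writing
# to partitioned BigQuery tables. Event fields are hypothetical.
from collections import defaultdict

events = [
    {"player": "p1", "action": "kill", "points": 10},
    {"player": "p2", "action": "kill", "points": 5},
    {"player": "p1", "action": None,   "points": 3},  # malformed: filtered out
]

# Filter: keep well-formed events only.
valid = [e for e in events if e["action"] is not None]

# Enrich: attach a region from a hypothetical reference lookup.
regions = {"p1": "eu", "p2": "us"}
for e in valid:
    e["region"] = regions.get(e["player"], "unknown")

# Aggregate: total points per player, the shape loaded into BigQuery.
totals = defaultdict(int)
for e in valid:
    totals[e["player"]] += e["points"]
print(dict(totals))
```

In a real pipeline each stage would be a Beam transform running on Dataflow rather than a loop over an in-memory list.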

Question 94

Your retail company wants to predict customer churn using historical purchase data stored in BigQuery. The dataset includes customer demographics, purchase history, and a label indicating whether the customer churned or not. You want to build a machine learning model to identify customers at risk of churning. You need to create and train a logistic regression model for predicting customer churn, using the customer_data table with the churned column as the target label. Which BigQuery ML query should you use?
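Without giving away the graded answer, the general shape of a BigQuery ML logistic-regression statement is well documented: `CREATE MODEL` with `model_type = 'logistic_reg'` and the target column named in `input_label_cols`. A sketch assembling that statement, with the dataset and model names as placeholders:

```python
# Sketch of the BigQuery ML statement shape for logistic regression;
# `mydataset` and `churn_model` are placeholder names.
dataset = "mydataset"     # hypothetical dataset name
table = "customer_data"
label = "churned"

query = f"""
CREATE OR REPLACE MODEL `{dataset}.churn_model`
OPTIONS(
  model_type = 'logistic_reg',    -- logistic regression
  input_label_cols = ['{label}']  -- target column
) AS
SELECT * FROM `{dataset}.{table}`;
"""
print(query)
```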

Question 95

You work for a healthcare company. You have a daily ETL pipeline that extracts patient data from a legacy system, transforms it, and loads it into BigQuery for analysis. The pipeline currently runs manually using a shell script. You want to automate this process and add monitoring to ensure pipeline observability and troubleshooting insights. You want one centralized solution, using open-source tooling, without rewriting the ETL code. What should you do?

Question 96

You manage data at an ecommerce company. You have a Dataflow pipeline that processes order data from Pub/Sub, enriches the data with product information from Bigtable, and writes the processed data to BigQuery for analysis. The pipeline runs continuously and processes thousands of orders every minute. You need to monitor the pipeline's performance and be alerted if errors occur. What should you do?

Question 97

Your team uses Google Sheets to track budget data that is updated daily. The team wants to compare budget data against actual cost data, which is stored in a BigQuery table. You need to create a solution that calculates the difference between each day's budget and actual costs. You want to ensure that your team has access to daily-updated results in Google Sheets. What should you do?
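Whichever integration is chosen (Connected Sheets, a scheduled query, or an external table over the Sheet), the core computation is a per-day difference. A minimal in-memory sketch with hypothetical figures:

```python
# Minimal sketch of the per-day budget-vs-actual calculation that a
# BigQuery join would perform; dates and amounts are hypothetical.
budget = {"2024-01-01": 100.0, "2024-01-02": 120.0}  # from Google Sheets
actual = {"2024-01-01": 90.5,  "2024-01-02": 130.0}  # from BigQuery

# Positive variance means spending came in under budget that day.
variance = {day: round(budget[day] - actual[day], 2)
            for day in budget if day in actual}
print(variance)
```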

Question 98

You have a Cloud SQL for PostgreSQL database that stores sensitive historical financial data. You need to ensure that the data is uncorrupted and recoverable in the event that the primary region is destroyed. The data is valuable, so you need to prioritize recovery point objective (RPO) over recovery time objective (RTO). You want to recommend a solution that minimizes latency for primary read and write operations. What should you do?

Question 99

You are building a batch data pipeline to process 100 GB of structured data from multiple sources for daily reporting. You need to transform and standardize the data prior to loading the data to ensure that it is stored in a single dataset. You want to use a low-code solution that can be easily built and managed. What should you do?

Question 100

You are working on a project that requires analyzing daily social media data. You have 100 GB of JSON-formatted data stored in Cloud Storage that keeps growing. You need to transform and load this data into BigQuery for analysis. You want to follow the Google-recommended approach. What should you do?
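One detail worth remembering regardless of the pipeline chosen: BigQuery batch loads expect newline-delimited JSON, one object per line, not a single JSON array. A minimal sketch of that shape with illustrative records:

```python
import json

# Convert an in-memory list of records into the newline-delimited JSON
# (NDJSON) layout that BigQuery batch loads expect; records are illustrative.
records = [
    {"post_id": 1, "likes": 10},
    {"post_id": 2, "likes": 3},
]
ndjson = "\n".join(json.dumps(r, sort_keys=True) for r in records)
print(ndjson)
```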

Total 107 questions