
Google Professional Data Engineer Practice Test - Questions Answers, Page 15


Question 141


You launched a new gaming app almost three years ago. You have been uploading log files from the previous day to a separate Google BigQuery table with the table name format LOGS_yyyymmdd. You have been using table wildcard functions to generate daily and monthly reports for all time ranges.

Recently, you discovered that some queries that cover long date ranges are exceeding the limit of 1,000 tables and failing. How can you resolve this issue?
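For context, the date-sharded LOGS_yyyymmdd layout described here is queried in standard SQL with a table wildcard plus a _TABLE_SUFFIX filter; the minimal sketch below illustrates that pattern only (project and dataset names are hypothetical placeholders), not the graded answer to the question.

    # Sketch: querying date-sharded tables through a wildcard and _TABLE_SUFFIX.
    # Project and dataset names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")

    sql = """
        SELECT COUNT(*) AS events
        FROM `my-project.game_logs.LOGS_*`
        WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
    """

    for row in client.query(sql).result():
        print(row.events)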


Question 142


Your analytics team wants to build a simple statistical model to determine which customers are most likely to work with your company again, based on a few different metrics. They want to run the model on Apache Spark, using data housed in Google Cloud Storage, and you have recommended using Google Cloud Dataproc to execute this job. Testing has shown that this workload can run in approximately 30 minutes on a 15-node cluster, outputting the results into Google BigQuery. The plan is to run this workload weekly. How should you optimize the cluster for cost?
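As background for the scenario, a short-lived cluster created for the weekly run and deleted afterwards can be scripted with the Dataproc client; machine types, names, and sizes below are hypothetical, and this sketch does not claim to be the cost-optimal configuration the question asks for.

    # Sketch: create an ephemeral Dataproc cluster for a weekly Spark job.
    # Project, region, cluster name, and machine types are hypothetical.
    from google.cloud import dataproc_v1

    region = "us-central1"
    client = dataproc_v1.ClusterControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    cluster = {
        "project_id": "my-project",
        "cluster_name": "weekly-spark-model",
        "config": {
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
            "worker_config": {"num_instances": 14, "machine_type_uri": "n1-standard-4"},
        },
    }

    operation = client.create_cluster(
        request={"project_id": "my-project", "region": region, "cluster": cluster}
    )
    operation.result()  # block until the cluster is ready
    # ...submit the Spark job, then delete the cluster once the weekly run finishes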


Question 143


Your company receives both batch- and stream-based event data. You want to process the data using Google Cloud Dataflow over a predictable time period.

However, you realize that in some instances data can arrive late or out of order. How should you design your Cloud Dataflow pipeline to handle data that is late or out of order?
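For reference, late and out-of-order data in a Beam/Dataflow pipeline is governed by event-time windows, watermarks, triggers, and allowed lateness; this is a minimal sketch of such a windowing transform, with durations chosen purely for illustration.

    # Sketch: an event-time window that tolerates late or out-of-order elements.
    # The window size and lateness values are illustrative only.
    import apache_beam as beam
    from apache_beam.transforms import window
    from apache_beam.transforms.trigger import (
        AccumulationMode,
        AfterCount,
        AfterWatermark,
    )

    late_tolerant_window = beam.WindowInto(
        window.FixedWindows(60),                     # 1-minute event-time windows
        trigger=AfterWatermark(late=AfterCount(1)),  # re-fire for each late element
        allowed_lateness=3600,                       # accept elements up to 1 hour late
        accumulation_mode=AccumulationMode.ACCUMULATING,
    )
    # Applied in a pipeline as: events | "Window" >> late_tolerant_window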


Question 144


You have some data, which is shown in the graphic below. The two dimensions are X and Y, and the shade of each dot represents what class it is. You want to classify this data accurately using a linear algorithm.

[Image for Question 144: scatter plot of the data in dimensions X and Y, each dot shaded by class]

To do this, you need to add a synthetic feature. What should the value of that feature be?
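Whichever feature the plot calls for, adding one is mechanically just an extra column before fitting the linear model; the sketch below uses a hypothetical x1 * x2 cross feature chosen purely for illustration, not as the graded answer.

    # Sketch: appending a synthetic feature column before fitting a linear model.
    # The choice of x1 * x2 here is purely illustrative.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    X = np.array([[0.2, 0.7], [0.9, 0.1], [0.4, 0.5]])  # toy (x1, x2) points
    y = np.array([0, 1, 0])                             # toy class labels

    synthetic = (X[:, 0] * X[:, 1]).reshape(-1, 1)      # new feature: x1 * x2
    X_aug = np.hstack([X, synthetic])

    model = LogisticRegression().fit(X_aug, y)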


Question 145


You are integrating one of your internal IT applications with Google BigQuery, so users can query BigQuery from the application's interface. You do not want individual users to authenticate to BigQuery and you do not want to give them access to the dataset. You need to securely access BigQuery from your IT application.

What should you do?
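One common pattern for this kind of application-level access is a dedicated service account whose credentials the application uses instead of end-user credentials; a minimal sketch follows, with a hypothetical key path, and no claim that it matches the exam's expected option.

    # Sketch: the application authenticates to BigQuery with its own service
    # account rather than end-user credentials. The key file path is hypothetical.
    from google.cloud import bigquery
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_file(
        "/secrets/bq-app-sa.json",
        scopes=["https://www.googleapis.com/auth/bigquery"],
    )
    client = bigquery.Client(credentials=credentials, project=credentials.project_id)

    rows = client.query("SELECT 1 AS ok").result()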


Question 146


You set up a streaming data insert into a Redis cluster via a Kafka cluster. Both clusters are running on Compute Engine instances. You need to encrypt data at rest with encryption keys that you can create, rotate, and destroy as needed.

What should you do?
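For context, keys that you create, rotate, and destroy yourself are typically modeled as customer-managed keys in Cloud KMS; this is a minimal sketch of creating such a key, with hypothetical project, location, and key names.

    # Sketch: creating a customer-managed encryption key (CMEK) in Cloud KMS.
    # Project, location, key ring, and key names are hypothetical.
    from google.cloud import kms

    client = kms.KeyManagementServiceClient()
    parent = client.key_ring_path("my-project", "us-central1", "data-keys")

    key = client.create_crypto_key(
        request={
            "parent": parent,
            "crypto_key_id": "disk-key",
            "crypto_key": {
                "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
            },
        }
    )
    print(key.name)
    # Rotation and destruction are handled separately, e.g. with
    # update_crypto_key (rotation_period) and destroy_crypto_key_version.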


Question 147


You are developing an application that uses a recommendation engine on Google Cloud. Your solution should display new videos to customers based on past views. Your solution needs to generate labels for the entities in videos that the customer has viewed. Your design must be able to provide very fast filtering suggestions based on data from other customer preferences on several TB of data. What should you do?
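As background, the "generate labels for the entities in videos" part of the scenario maps naturally to label detection in the Video Intelligence API; a minimal sketch with a hypothetical input URI (the storage and fast-filtering side of the design is not addressed here).

    # Sketch: generating entity labels for a viewed video with the
    # Video Intelligence API. The input URI is a hypothetical placeholder.
    from google.cloud import videointelligence

    client = videointelligence.VideoIntelligenceServiceClient()
    operation = client.annotate_video(
        request={
            "features": [videointelligence.Feature.LABEL_DETECTION],
            "input_uri": "gs://my-bucket/videos/clip-001.mp4",
        }
    )
    result = operation.result(timeout=300)

    for label in result.annotation_results[0].segment_label_annotations:
        print(label.entity.description)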


Question 148


You are selecting services to write and transform JSON messages from Cloud Pub/Sub to BigQuery for a data pipeline on Google Cloud. You want to minimize service costs. You also want to monitor and accommodate input data volume that will vary in size with minimal manual intervention. What should you do?
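For reference, a streaming Pub/Sub-to-BigQuery pipeline in Beam looks roughly like the sketch below; the subscription, table, and schema names are hypothetical, and no claim is made that this is the cost-minimal option the question asks about.

    # Sketch: a streaming pipeline that parses JSON from Pub/Sub and writes it
    # to BigQuery. Subscription, table, and schema names are hypothetical.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(json.loads)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )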


Question 149


Your infrastructure includes a set of YouTube channels. You have been tasked with creating a process for sending the YouTube channel data to Google Cloud for analysis. You want to design a solution that allows your worldwide marketing teams to perform ANSI SQL and other types of analysis on up-to-date YouTube channel log data. How should you set up the log data transfer into Google Cloud?
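As context, YouTube channel reports can be pulled into BigQuery on a schedule with the BigQuery Data Transfer Service; a rough sketch follows, with hypothetical names, and the connector's required OAuth authorization parameters omitted for brevity.

    # Sketch: scheduling a recurring transfer of YouTube channel reports into
    # BigQuery with the Data Transfer Service. All names are hypothetical.
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()

    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id="youtube_analytics",
        display_name="YouTube channel reports",
        data_source_id="youtube_channel",
        schedule="every 24 hours",
    )

    transfer_config = client.create_transfer_config(
        parent=client.common_project_path("my-project"),
        transfer_config=transfer_config,
    )
    print(transfer_config.name)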


Question 150


You are designing storage for very large text files for a data pipeline on Google Cloud. You want to support ANSI SQL queries. You also want to support compression and parallel load from the input locations using Google recommended practices. What should you do?
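For reference, compressed text shards in Cloud Storage can be loaded into BigQuery from a wildcard URI, after which the data is queryable with standard (ANSI-compliant) SQL; a minimal sketch with hypothetical bucket, dataset, and schema names.

    # Sketch: loading compressed text shards from Cloud Storage into BigQuery
    # via a wildcard URI. Bucket, dataset, and schema are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        field_delimiter="\t",
        schema=[
            bigquery.SchemaField("line_id", "INT64"),
            bigquery.SchemaField("payload", "STRING"),
        ],
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/text/*.csv.gz",          # gzip-compressed shards
        "my-project.pipeline_data.text_lines",
        job_config=job_config,
    )
    load_job.result()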
