Question 17 - Professional Data Engineer discussion


Your company is migrating its 30-node Apache Hadoop cluster to the cloud. It wants to reuse the Hadoop jobs it has already created and minimize management of the cluster as much as possible. It also wants to be able to persist data beyond the life of the cluster. What should you do?

A. Create a Google Cloud Dataflow job to process the data.
B. Create a Google Cloud Dataproc cluster that uses persistent disks for HDFS.
C. Create a Hadoop cluster on Google Compute Engine that uses persistent disks.
D. Create a Cloud Dataproc cluster that uses the Google Cloud Storage connector.
E. Create a Hadoop cluster on Google Compute Engine that uses Local SSD disks.
Suggested answer: D

Dataproc is a managed Hadoop/Spark service, so existing Hadoop jobs run with little or no modification and cluster management is minimal. The Cloud Storage connector (preinstalled on Dataproc clusters) lets those jobs read and write gs:// paths instead of HDFS, so the data lives in Cloud Storage and survives cluster deletion. Options B, C, and E keep data on disks tied to the cluster, and option A would require rewriting the jobs for Dataflow.
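
As a rough sketch of what this looks like in practice, the existing Hadoop jar can be submitted unchanged to a Dataproc cluster with gs:// paths swapped in for hdfs:// paths. The project ID, region, cluster name, bucket, and jar below are hypothetical placeholders:

from google.cloud import dataproc_v1

# Placeholder values; substitute your own project, region, cluster, and bucket.
project_id = "my-project"
region = "us-central1"

# The API endpoint must match the cluster's region.
job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

# The same Hadoop jar that ran on-premises; the Cloud Storage connector
# preinstalled on Dataproc resolves the gs:// URIs, so the job code does
# not need to change and the output outlives the cluster.
job = {
    "placement": {"cluster_name": "migrated-hadoop-cluster"},
    "hadoop_job": {
        "main_jar_file_uri": "gs://my-bucket/jars/my-hadoop-job.jar",
        "args": ["gs://my-bucket/input/", "gs://my-bucket/output/"],
    },
}

operation = job_client.submit_job_as_operation(
    request={"project_id": project_id, "region": region, "job": job}
)
print(operation.result().driver_output_resource_uri)

Because the cluster holds no irreplaceable state, it can even be deleted between job runs to cut costs, which also fits the "minimize management" requirement.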