Question 339 - Professional Data Engineer discussion
You have thousands of Apache Spark jobs running in your on-premises Apache Hadoop cluster. You want to migrate the jobs to Google Cloud. You want to use managed services to run your jobs instead of maintaining a long-lived Hadoop cluster yourself. You have a tight timeline and want to keep code changes to a minimum. What should you do?
A. Copy your data to Compute Engine disks. Manage and run your jobs directly on those instances.
B. Move your data to Cloud Storage. Run your jobs on Dataproc.
C. Move your data to BigQuery. Convert your Spark scripts to a SQL-based processing approach.
D. Rewrite your jobs in Apache Beam. Run your jobs in Dataflow.
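The lift-and-shift path described in option B can be sketched with standard Hadoop and gcloud commands. This is a minimal illustration, not a definitive runbook: the project, bucket, cluster, and jar names (`spark-migration-bucket`, `migration-cluster`, `my-spark-job.jar`) are hypothetical placeholders, and the commands assume an authenticated gcloud environment.

```shell
# Hedged sketch of option B. All resource names below are illustrative
# placeholders, not values from the question.

# 1. Copy HDFS data to Cloud Storage using DistCp, run from the
#    existing on-premises cluster.
hadoop distcp hdfs:///data/input gs://spark-migration-bucket/data/input

# 2. Create a Dataproc cluster that auto-deletes when idle, so no
#    long-lived Hadoop cluster needs to be maintained.
gcloud dataproc clusters create migration-cluster \
    --region=us-central1 \
    --max-idle=30m

# 3. Submit the existing Spark jar unchanged; typically only the
#    input/output paths change from hdfs:// to gs://, handled by the
#    Cloud Storage connector preinstalled on Dataproc.
gcloud dataproc jobs submit spark \
    --cluster=migration-cluster \
    --region=us-central1 \
    --jar=gs://spark-migration-bucket/jars/my-spark-job.jar \
    -- gs://spark-migration-bucket/data/input \
       gs://spark-migration-bucket/data/output
```

This is why option B fits the constraints: Dataproc runs Spark jobs as-is (minimal code changes), clusters are managed and can be ephemeral (no long-lived cluster), and the migration is mostly a data copy plus a job-submission change (tight timeline). Options C and D require rewriting the jobs, and option A is not a managed service.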