Question 184 - Professional Data Engineer discussion
You have developed three data processing jobs. The first executes a Cloud Dataflow pipeline that transforms data uploaded to Cloud Storage and writes the results to BigQuery. The second ingests data from on-premises servers and uploads it to Cloud Storage. The third is a Cloud Dataflow pipeline that gets information from third-party data providers and uploads it to Cloud Storage. You need to be able to schedule and monitor the execution of these three workflows and manually execute them when needed. What should you do?
A. Create a Directed Acyclic Graph (DAG) in Cloud Composer to schedule and monitor the jobs.
B. Use Stackdriver Monitoring and set up an alert with a Webhook notification to trigger the jobs.
C. Develop an App Engine application to schedule and request the status of the jobs using GCP API calls.
D. Set up cron jobs in a Compute Engine instance to schedule and monitor the pipelines using GCP API calls.
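Option A fits best: Cloud Composer (managed Apache Airflow) gives scheduling, dependency ordering, monitoring via the Airflow UI, and manual triggering of the same DAG on demand. A minimal sketch of such a DAG, assuming Airflow 2 on Cloud Composer with the Google provider installed; the project ID, bucket names, template paths, and the on-premises ingest command are placeholders, not part of the original question:

```python
# Hypothetical Cloud Composer (Airflow 2) DAG for the three jobs.
# Project IDs, buckets, and template paths below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)

with DAG(
    dag_id="three_job_workflow",
    schedule_interval="@daily",       # cron-style schedule
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Job 2: ingest data from on-premises servers into Cloud Storage
    # (placeholder gsutil rsync; in practice this might be a transfer
    # service run or a custom operator).
    ingest_onprem = BashOperator(
        task_id="ingest_onprem",
        bash_command="gsutil -m rsync -r /mnt/onprem gs://my-bucket/raw/",
    )

    # Job 3: Dataflow pipeline pulling third-party data into Cloud Storage.
    fetch_third_party = DataflowTemplatedJobStartOperator(
        task_id="fetch_third_party",
        template="gs://my-bucket/templates/third_party_ingest",
        project_id="my-project",
        location="us-central1",
    )

    # Job 1: Dataflow pipeline transforming Cloud Storage data into
    # BigQuery, run only after both upload jobs have landed their data.
    transform_to_bq = DataflowTemplatedJobStartOperator(
        task_id="transform_to_bq",
        template="gs://my-bucket/templates/gcs_to_bigquery",
        project_id="my-project",
        location="us-central1",
    )

    [ingest_onprem, fetch_third_party] >> transform_to_bq
```

The DAG can also be triggered manually from the Airflow UI (or with `gcloud composer environments run ... dags trigger`), which covers the "manually execute them when needed" requirement, and each task's run history and state are visible in the same UI for monitoring. The cron and App Engine options would require building this orchestration and monitoring by hand.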