Question 133 - Professional Data Engineer discussion
You are implementing security best practices on your data pipeline. Currently, you are manually executing jobs as the Project Owner. You want to automate these jobs by taking nightly batch files containing non-public information from Google Cloud Storage, processing them with a Spark Scala job on a Google Cloud Dataproc cluster, and depositing the results into Google BigQuery.
How should you securely run this workload?
A. Restrict the Google Cloud Storage bucket so only you can see the files
B. Grant the Project Owner role to a service account, and run the job with it
C. Use a service account with the ability to read the batch files and to write to BigQuery
D. Use a user account with the Project Viewer role on the Cloud Dataproc cluster to read the batch files and write to BigQuery
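
Option C describes a least-privilege setup: a dedicated service account attached to the Dataproc cluster, with read access to the input bucket and write access to the target BigQuery dataset, and nothing broader like Project Owner. As a rough illustration of the job side, here is a minimal Spark Scala sketch, assuming the spark-bigquery connector is available on the cluster; the bucket, dataset, and table names are hypothetical placeholders, and the credentials come from the cluster's attached service account rather than anything embedded in the code.

import org.apache.spark.sql.SparkSession

// Minimal sketch of the nightly batch job. All resource names below are
// hypothetical placeholders.
object NightlyBatchJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("nightly-batch-to-bigquery")
      .getOrCreate()

    // Read the nightly batch files. On Dataproc this authenticates as the
    // cluster's service account, which needs read access to the bucket
    // (e.g. roles/storage.objectViewer scoped to that bucket).
    val batch = spark.read
      .option("header", "true")
      .csv("gs://example-nightly-batch/incoming/*.csv")

    // ... Spark transformations on the non-public data would go here ...
    val results = batch

    // Write the results to BigQuery via the spark-bigquery connector. The
    // service account needs write access to the target dataset
    // (e.g. roles/bigquery.dataEditor) plus roles/bigquery.jobUser, and
    // write access to the temporary staging bucket.
    results.write
      .format("bigquery")
      .option("temporaryGcsBucket", "example-temp-bucket")
      .option("table", "example_dataset.nightly_results")
      .mode("append")
      .save()

    spark.stop()
  }
}

Because the permissions live on the service account rather than in the job, rotating or revoking access never requires touching the code, which is the point of the approach over running the pipeline as the Project Owner.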