Question 332 - Professional Data Engineer discussion
You have a data pipeline with a Dataflow job that aggregates and writes time series metrics to Bigtable. You notice that data is slow to update in Bigtable. This data feeds a dashboard used by thousands of users across the organization. You need to support additional concurrent users and reduce the amount of time required to write the data. What should you do?
Choose 2 answers
A. Configure your Dataflow pipeline to use local execution.
B. Modify your Dataflow pipeline to use the Flatten transform before writing to Bigtable.
C. Modify your Dataflow pipeline to use the CoGroupByKey transform before writing to Bigtable.
D. Increase the maximum number of Dataflow workers by setting maxNumWorkers in PipelineOptions.
E. Increase the number of nodes in the Bigtable cluster.
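For context on option D, here is a minimal sketch, assuming the Apache Beam Java SDK with the Dataflow runner, of how the maxNumWorkers ceiling can be raised through PipelineOptions. The class name ScalePipelineExample and the worker count of 20 are illustrative only, not values from the question.

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ScalePipelineExample {
  public static void main(String[] args) {
    // Parse standard pipeline arguments and view them as Dataflow-specific options.
    DataflowPipelineOptions options = PipelineOptionsFactory
        .fromArgs(args)
        .withValidation()
        .as(DataflowPipelineOptions.class);

    // Raise the autoscaling ceiling so the job can fan out across more workers
    // when writing to Bigtable (20 is an arbitrary example value).
    options.setMaxNumWorkers(20);

    // ... build and run the pipeline with these options ...
  }
}

Option E, by contrast, is handled outside the pipeline code: the Bigtable cluster's node count is changed through the Cloud console or the gcloud CLI, which increases the cluster's write and read throughput for the dashboard's concurrent users.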