Question 95 - Professional Machine Learning Engineer discussion
You need to run a batch prediction on 100 million records in a BigQuery table with a custom TensorFlow DNN regressor model, and then store the predicted results in a BigQuery table. You want to minimize the effort required to build this inference pipeline. What should you do?
A. Import the TensorFlow model with BigQuery ML, and run the ML.PREDICT function.
B. Use the TensorFlow BigQuery reader to load the data, and use the BigQuery API to write the results to BigQuery.
C. Create a Dataflow pipeline to convert the data in BigQuery to TFRecords. Run a batch inference on Vertex AI Prediction, and write the results to BigQuery.
D. Load the TensorFlow SavedModel in a Dataflow pipeline. Use the BigQuery I/O connector with a custom function to perform the inference within the pipeline, and write the results to BigQuery.
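For reference, the BigQuery ML approach in option A needs only two SQL statements: one to import the SavedModel from Cloud Storage, and one to run batch inference and materialize the predictions. This is a minimal sketch; the project, dataset, table, and Cloud Storage paths below are placeholders, not values from the question.

```sql
-- Import the custom TensorFlow SavedModel into BigQuery ML.
CREATE OR REPLACE MODEL `my_project.my_dataset.dnn_regressor`
  OPTIONS (MODEL_TYPE = 'TENSORFLOW',
           MODEL_PATH = 'gs://my_bucket/saved_model/*');

-- Run batch prediction over the source table and write the
-- results to a BigQuery table, entirely inside BigQuery.
CREATE OR REPLACE TABLE `my_project.my_dataset.predictions` AS
SELECT *
FROM ML.PREDICT(MODEL `my_project.my_dataset.dnn_regressor`,
                TABLE `my_project.my_dataset.input_records`);
```

Because the data never leaves BigQuery and no pipeline or serving infrastructure has to be built, this is the lowest-effort option for scoring 100 million rows already stored in a BigQuery table.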