Question 281 - Professional Machine Learning Engineer discussion


You are implementing a batch inference ML pipeline in Google Cloud. The model was developed by using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset that is stored in a BigQuery table. You want to perform inference with minimal effort. What should you do?

A.

Import the TensorFlow model by using the CREATE MODEL statement in BigQuery ML. Apply the historical data to the TensorFlow model.

B.

Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.

C.

Export the historical data to Cloud Storage in CSV format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.

D.

Configure and deploy a Vertex AI endpoint. Use the endpoint to get predictions from the historical data in BigQuery.

Suggested answer: A

Explanation:

BigQuery ML supports importing TensorFlow models stored in SavedModel format in Cloud Storage through the CREATE MODEL statement (with model_type = 'TENSORFLow'.upper() spelled as 'TENSORFLOW'). Because the historical data already lives in a BigQuery table, inference can then run entirely in SQL with ML.PREDICT: no data export, no additional infrastructure, and therefore minimal effort, which is exactly what the question asks for.

Options B and C would also work, but both require exporting the table to Cloud Storage and configuring a Vertex AI batch prediction job, which is more operational effort than querying the data in place. Between the two export formats, Avro is generally more efficient than CSV for large datasets, but neither option is the minimal-effort path here.

Option D deploys an online endpoint, which is designed for low-latency, request-by-request serving. Using it for a large historical dataset would require client code to iterate over the table and send requests, making it both more effort and more costly than batch inference.
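As a sketch of the minimal-effort path, the import and inference can be expressed in two BigQuery statements. The project, dataset, table, and bucket names below are hypothetical placeholders, and the SavedModel is assumed to accept the table's columns as input features:

```sql
-- Import the TensorFlow SavedModel from Cloud Storage into BigQuery ML.
-- (my_project, my_dataset, my_bucket, and historical_data are placeholders.)
CREATE OR REPLACE MODEL `my_project.my_dataset.imported_tf_model`
  OPTIONS (
    model_type = 'TENSORFLOW',
    model_path = 'gs://my_bucket/saved_model_dir/*'
  );

-- Run batch inference directly against the BigQuery table.
SELECT *
FROM ML.PREDICT(
  MODEL `my_project.my_dataset.imported_tf_model`,
  (SELECT * FROM `my_project.my_dataset.historical_data`)
);
```

The predictions are returned as query results (and can be written to a table with CREATE TABLE ... AS), so the entire workflow stays inside BigQuery.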


asked 07/11/2024
Stefano Humphries