Question 281 - Professional Machine Learning Engineer discussion
You are implementing a batch inference ML pipeline in Google Cloud. The model was developed by using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset that is stored in a BigQuery table. You want to perform inference with minimal effort. What should you do?
Import the TensorFlow model by using the CREATE MODEL statement in BigQuery ML. Apply the historical data to the TensorFlow model.
Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
Export the historical data to Cloud Storage in CSV format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
Configure and deploy a Vertex AI endpoint. Use the endpoint to get predictions from the historical data in BigQuery.
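For context on the first option: BigQuery ML can import a TensorFlow SavedModel directly from Cloud Storage with a CREATE MODEL statement and then score a table with ML.PREDICT, all without moving the data out of BigQuery. The sketch below shows what that flow can look like when driven from the BigQuery Python client; the project, dataset, table, and bucket names are placeholders for illustration, not values taken from the question.

```python
# Minimal sketch: import a TensorFlow SavedModel into BigQuery ML and run
# batch inference over a historical table. All resource names below are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# CREATE MODEL imports the SavedModel directly from Cloud Storage.
create_model_sql = """
CREATE OR REPLACE MODEL `my-project.my_dataset.imported_tf_model`
OPTIONS (
  MODEL_TYPE = 'TENSORFLOW',
  MODEL_PATH = 'gs://my-bucket/saved_model/*'
)
"""
client.query(create_model_sql).result()  # wait for the import to finish

# ML.PREDICT applies the imported model to the BigQuery table in place,
# with no data export or endpoint deployment required.
predict_sql = """
SELECT *
FROM ML.PREDICT(
  MODEL `my-project.my_dataset.imported_tf_model`,
  TABLE `my-project.my_dataset.historical_data`
)
"""
for row in client.query(predict_sql).result():
    print(dict(row.items()))
```

Because the data never leaves BigQuery, this approach avoids the Avro/CSV export step and the batch prediction job or endpoint deployment described in the other options.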