Question 227 - Professional Machine Learning Engineer discussion


You recently trained an XGBoost model on tabular data. You plan to expose the model for internal use as an HTTP microservice. After deployment, you expect a small number of incoming requests. You want to productionize the model with the least amount of effort and latency. What should you do?

A.
Deploy the model to BigQuery ML by using the CREATE MODEL statement with the BOOSTED_TREE_REGRESSOR model type, and invoke the BigQuery API from the microservice.
B.
Build a Flask-based app. Package the app in a custom container on Vertex AI, and deploy it to Vertex AI Endpoints.
C.
Build a Flask-based app. Package the app in a Docker image, and deploy it to Google Kubernetes Engine in Autopilot mode.
D.
Use a prebuilt XGBoost Vertex container to create a model, and deploy it to Vertex AI Endpoints.
Suggested answer: D

Explanation:

XGBoost is a popular open-source library that provides a scalable and efficient implementation of gradient-boosted trees, and it is commonly used to train classification or regression models on tabular data. Vertex AI lets you productionize such a model and expose it for internal use as an HTTP microservice.

A prebuilt Vertex container is a container image that already includes the dependencies and libraries needed to serve a specific ML framework, such as XGBoost. With it, you can register the trained model and deploy it to Vertex AI Endpoints without writing any serving code or building a custom container. Vertex AI Endpoints serves the model online behind an HTTPS endpoint and scales automatically; for a small number of incoming requests, a single small node is sufficient, which keeps both cost and latency low.

By contrast, option A would require re-creating the model in BigQuery ML, which is not designed for low-latency request/response serving, while options B and C require writing a Flask app and either building a custom container or managing a GKE deployment. By using a prebuilt XGBoost Vertex container with Vertex AI Endpoints (option D), you productionize the model with the least amount of effort and latency.
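For illustration, here is a minimal sketch of option D using the Vertex AI Python SDK (google-cloud-aiplatform). The project ID, Cloud Storage path, display name, machine type, and container tag are placeholder assumptions; the exact prebuilt XGBoost container URI depends on your XGBoost version, so check the prebuilt container documentation referenced below.

```python
# Minimal sketch of option D, assuming the booster was saved to Cloud Storage
# as model.bst. Project, bucket, display name, and container tag are
# placeholders -- confirm the prebuilt-container tag for your XGBoost version.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Register the trained model with a prebuilt XGBoost serving container.
# No Flask app or custom container image is needed.
model = aiplatform.Model.upload(
    display_name="xgb-tabular-model",
    artifact_uri="gs://my-bucket/xgb-model/",  # folder containing model.bst
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-7:latest"
    ),
)

# Deploy to a Vertex AI endpoint; one small node is enough for low traffic.
endpoint = model.deploy(
    machine_type="n1-standard-2",
    min_replica_count=1,
    max_replica_count=1,
)

# The internal microservice can then call the endpoint over HTTPS:
prediction = endpoint.predict(instances=[[0.2, 1.5, 3.1, 0.0]])
print(prediction.predictions)
```

Because the prebuilt container handles the serving logic, the only artifact you manage is the saved model file, which is what keeps the effort lower than options B and C.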

Reference:

XGBoost documentation

Vertex AI documentation

Prebuilt Vertex container documentation

Vertex AI Endpoints documentation

Preparing for Google Cloud Certification: Machine Learning Engineer Professional Certificate
