Question 165 - Professional Machine Learning Engineer discussion


You work for a retail company. You have been asked to develop a model to predict whether a customer will purchase a product on a given day. Your team has processed the company's sales data and created a table with the following columns:

* Customer_id

* Product_id

* Date

* Days_since_last_purchase (measured in days)

* Average_purchase_frequency (measured in 1/days)

* Purchase (binary class; whether the customer purchased the product on the Date)

You need to interpret your model's results for each individual prediction. What should you do?

A.
Create a BigQuery table. Use BigQuery ML to build a boosted tree classifier. Inspect the partition rules of the trees to understand how each prediction flows through the trees.

B.
Create a Vertex AI tabular dataset. Train an AutoML model to predict customer purchases. Deploy the model to a Vertex AI endpoint and enable feature attributions. Use the "explain" method to get feature attribution values for each individual prediction.

C.
Create a BigQuery table. Use BigQuery ML to build a logistic regression classification model. Use the values of the coefficients of the model to interpret the feature importance, with higher values corresponding to more importance.

D.
Create a Vertex AI tabular dataset. Train an AutoML model to predict customer purchases. Deploy the model to a Vertex AI endpoint. At each prediction, enable L1 regularization to detect non-informative features.
Suggested answer: B

Explanation:

According to the official exam guide [1], one of the skills assessed in the exam is to "explain the predictions of a trained model." Vertex AI provides feature attributions using Shapley values, a cooperative game theory algorithm that assigns credit to each feature in a model for a particular outcome [2]. Feature attributions can help you understand how the model calculates its predictions and debug or optimize the model accordingly. You can use AutoML for Tabular Data to generate and query local feature attributions [3]. The other options are not relevant or optimal for this scenario.

References:

1. Professional ML Engineer Exam Guide

2. Feature attributions for classification and regression

3. AutoML for Tabular Data
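
For illustration, here is a minimal sketch of the workflow in answer B using the Vertex AI Python SDK (google-cloud-aiplatform). The project ID, location, BigQuery source table, display names, and instance values are placeholders, and exact arguments may differ for your environment; this is a sketch of the approach, not a verified production setup.

```python
# Minimal sketch of option B with the Vertex AI Python SDK.
# Project, location, BigQuery source, and display names are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# 1. Create a Vertex AI tabular dataset from the processed sales table.
dataset = aiplatform.TabularDataset.create(
    display_name="customer-purchases",
    bq_source="bq://my-project.sales.customer_purchases",  # hypothetical table
)

# 2. Train an AutoML classification model with Purchase as the target column.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="purchase-classifier",
    optimization_prediction_type="classification",
)
model = job.run(
    dataset=dataset,
    target_column="Purchase",
    budget_milli_node_hours=1000,
)

# 3. Deploy the model to an endpoint. AutoML tabular models support
#    Vertex Explainable AI feature attributions.
endpoint = model.deploy(machine_type="n1-standard-4")

# 4. Request an explained prediction for a single instance and read
#    the per-feature attribution values (values passed as strings,
#    which AutoML tabular prediction typically expects).
instance = {
    "Customer_id": "12345",
    "Product_id": "67890",
    "Date": "2024-09-18",
    "Days_since_last_purchase": "7",
    "Average_purchase_frequency": "0.2",
}
response = endpoint.explain(instances=[instance])
for explanation in response.explanations:
    for attribution in explanation.attributions:
        print(attribution.feature_attributions)
```

The printed feature_attributions map each input column to its contribution to that single prediction's purchase probability, which is exactly the per-prediction interpretability the question asks for; the coefficient- or regularization-based approaches in the other options only describe the model globally.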
