Question 197 - Professional Machine Learning Engineer discussion


You work at a bank. You need to develop a credit risk model to support loan application decisions. You decide to implement the model by using a neural network in TensorFlow. Due to regulatory requirements, you need to be able to explain the model's predictions based on its features. When the model is deployed, you also want to monitor the model's performance over time. You decided to use Vertex AI for both model development and deployment. What should you do?

A. Use Vertex Explainable AI with the sampled Shapley method, and enable Vertex AI Model Monitoring to check for feature distribution drift.
B. Use Vertex Explainable AI with the sampled Shapley method, and enable Vertex AI Model Monitoring to check for feature distribution skew.
C. Use Vertex Explainable AI with the XRAI method, and enable Vertex AI Model Monitoring to check for feature distribution drift.
D. Use Vertex Explainable AI with the XRAI method, and enable Vertex AI Model Monitoring to check for feature distribution skew.
Suggested answer: A

Explanation:

To develop a credit risk model that meets the regulatory requirements and can be monitored over time, you should follow these steps:

Use Vertex Explainable AI with the sampled Shapley method. Vertex Explainable AI is a service that provides feature attributions for machine learning models, which can help you understand how each feature contributes to a prediction [1]. The sampled Shapley method estimates Shapley values for each feature, which are based on the marginal contribution of that feature to the prediction across sampled feature subsets [2]. It is suitable for neural networks and other complex, non-differentiable or opaque models, because it can capture non-linear and interaction effects among features [3].
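As an illustration, sampled Shapley attribution can be configured when the TensorFlow model is uploaded to Vertex AI with the Python SDK. The sketch below assumes a SavedModel exported to Cloud Storage; the project, bucket path, tensor names, and feature/output keys are placeholders that would need to match the actual serving signature.

```python
# Sketch: upload a TensorFlow model to Vertex AI with sampled Shapley
# explanations enabled. Bucket path, tensor names, and keys are hypothetical.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Sampled Shapley approximates Shapley values by sampling feature permutations;
# path_count trades off attribution accuracy against explanation latency.
explanation_parameters = aiplatform.explain.ExplanationParameters(
    {"sampled_shapley_attribution": {"path_count": 25}}
)

# Map the serving signature's input/output tensors so Vertex AI knows
# which features to attribute and which output to explain.
explanation_metadata = aiplatform.explain.ExplanationMetadata(
    inputs={"loan_features": {"input_tensor_name": "dense_input"}},
    outputs={"credit_risk_score": {"output_tensor_name": "dense_output"}},
)

model = aiplatform.Model.upload(
    display_name="credit-risk-nn",
    artifact_uri="gs://my-bucket/credit-risk/model/",  # exported SavedModel dir
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-12:latest"
    ),
    explanation_parameters=explanation_parameters,
    explanation_metadata=explanation_metadata,
)

endpoint = model.deploy(machine_type="n1-standard-4")
# endpoint.explain(instances=[...]) then returns predictions together with
# per-feature attributions that can be surfaced to regulators and loan officers.
```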

Enable Vertex AI Model Monitoring to check for feature distribution drift. Vertex AI Model Monitoring is a service that helps you track and manage the performance and quality of your deployed models over time [4]. Feature distribution drift occurs when the distribution of the input features in production changes significantly over time relative to a recent baseline of serving data, which can degrade the model's accuracy and reliability [5]. (Skew, by contrast, compares serving data against the original training data.) By checking for feature distribution drift, you can detect when your model needs to be retrained or updated with new data.
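Drift detection is configured as a model deployment monitoring job on the endpoint that serves the model. The following sketch uses the SDK's model_monitoring helpers; the feature names, thresholds, sampling rate, interval, and alert email are illustrative assumptions, and `endpoint` is the Endpoint created when the model was deployed.

```python
# Sketch: enable drift detection on the deployed endpoint with Vertex AI
# Model Monitoring. Feature names, thresholds, and the email are hypothetical.
from google.cloud import aiplatform
from google.cloud.aiplatform import model_monitoring

aiplatform.init(project="my-project", location="us-central1")

# Drift detection compares live feature distributions against a recent serving
# baseline; an alert fires when the distance statistic exceeds the threshold.
drift_config = model_monitoring.DriftDetectionConfig(
    drift_thresholds={
        "income": 0.3,
        "loan_amount": 0.3,
        "credit_history_length": 0.3,
    }
)
objective_config = model_monitoring.ObjectiveConfig(
    drift_detection_config=drift_config
)

monitoring_job = aiplatform.ModelDeploymentMonitoringJob.create(
    display_name="credit-risk-drift-monitoring",
    endpoint=endpoint,  # Endpoint returned by model.deploy(...)
    objective_configs=objective_config,
    logging_sampling_strategy=model_monitoring.RandomSampleConfig(sample_rate=0.8),
    schedule_config=model_monitoring.ScheduleConfig(monitor_interval=24),  # hours
    alert_config=model_monitoring.EmailAlertConfig(
        user_emails=["risk-team@example.com"]
    ),
)
```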

[1]: Introduction to Vertex Explainable AI | Vertex AI | Google Cloud

[2]: Shapley value | Wikipedia

[3]: Explainable AI: Interpreting, Explaining and Visualizing Deep Learning

[4]: Introduction to Vertex AI Model Monitoring | Vertex AI | Google Cloud

[5]: Monitor models for data drift | Vertex AI | Google Cloud
