Question 45 - Professional Machine Learning Engineer discussion


You have trained a deep neural network model on Google Cloud. The model has low loss on the training data, but is performing worse on the validation data. You want the model to be resilient to overfitting. Which strategy should you use when retraining the model?

A.
Apply a dropout parameter of 0.2, and decrease the learning rate by a factor of 10.
B.
Apply an L2 regularization parameter of 0.4, and decrease the learning rate by a factor of 10.
C.
Run a hyperparameter tuning job on AI Platform to optimize for the L2 regularization and dropout parameters.
D.
Run a hyperparameter tuning job on AI Platform to optimize for the learning rate, and increase the number of neurons by a factor of 2.
Suggested answer: C

Explanation:

Overfitting occurs when a model fits the training data so closely that it fails to generalize to new data. It is typically caused by a model that is too complex for the data, for example one with too many parameters or layers. Overfitting shows up as poor performance on the validation data, which reflects how the model will perform on unseen data.

To prevent overfitting, one strategy is to use regularization techniques that penalize the complexity of the model and encourage it to learn simpler patterns. Two common regularization techniques for deep neural networks are L2 regularization and dropout. L2 regularization adds a term to the loss function proportional to the squared magnitude of the model's weights; this penalizes large weights and pushes the model toward smaller ones. Dropout randomly drops out some units in the network during training, which prevents co-adaptation of features and reduces the effective number of parameters. Both L2 regularization and dropout have hyperparameters that control the strength of the regularization effect.
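The two techniques above can be sketched in a few lines of plain Python. This is a minimal illustration, not a framework implementation: `l2_penalty` is the extra loss term, and `inverted_dropout` zeroes each activation with probability `p` while scaling survivors by `1/(1-p)` so the expected activation is unchanged (the "inverted dropout" convention used by most modern libraries). The weight values and base loss are made up for demonstration.

```python
import random

random.seed(0)

def l2_penalty(weights, lam):
    """L2 regularization term: lam * sum of squared weights."""
    return lam * sum(w * w for w in weights)

def inverted_dropout(activations, p, rng):
    """Zero each activation with probability p; scale survivors by
    1/(1-p) so the expected activation is unchanged during training."""
    return [a / (1.0 - p) if rng.random() >= p else 0.0 for a in activations]

# Toy example: the penalty is added to the ordinary training loss.
weights = [0.5, -1.2, 0.8]
base_loss = 0.10                                   # pretend training loss
loss = base_loss + l2_penalty(weights, lam=0.01)   # strictly larger for nonzero weights

# Dropout with p = 0.2 keeps roughly 80% of units on each pass.
acts = [1.0] * 10000
dropped = inverted_dropout(acts, p=0.2, rng=random)
kept_fraction = sum(1 for a in dropped if a != 0.0) / len(dropped)
```

Because the regularized loss grows with the weight magnitudes, gradient descent on it trades a little training accuracy for smaller, simpler weights, which is exactly the generalization pressure described above.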

Another strategy to prevent overfitting is hyperparameter tuning: the process of searching for the hyperparameter values that yield the best model performance. Hyperparameter tuning can find the combination of hyperparameters that minimizes the validation loss and improves the generalization ability of the model. AI Platform provides a hyperparameter tuning service that can run multiple trials in parallel and use different search algorithms to find the best solution.
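The tuning process itself can be sketched as a simple random search. This is a stand-alone illustration of the idea, not the AI Platform service: `validation_loss` is a made-up stand-in for "train a model with these hyperparameters and measure validation loss" (in a real job, each call would be a separate training trial), and the search ranges are assumptions.

```python
import random

random.seed(0)

def validation_loss(l2, dropout):
    # Hypothetical stand-in for training a model and evaluating it;
    # the quadratic bowl just gives the search something to minimize.
    return (l2 - 0.01) ** 2 + (dropout - 0.3) ** 2 + 0.05

best = None
for _ in range(50):
    trial = {
        "l2": 10 ** random.uniform(-4, 0),     # log-uniform over [1e-4, 1]
        "dropout": random.uniform(0.0, 0.6),   # linear over [0, 0.6]
    }
    loss = validation_loss(trial["l2"], trial["dropout"])
    if best is None or loss < best[0]:
        best = (loss, trial)
```

Note the log-uniform sampling for the L2 coefficient: regularization strengths typically matter on a multiplicative scale, so a log scale explores small values as thoroughly as large ones. AI Platform's tuning service offers the same choice via its parameter scale types.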

Therefore, the best strategy when retraining the model is to run a hyperparameter tuning job on AI Platform to optimize the L2 regularization and dropout parameters. This lets the model find the optimal balance between fitting the training data and generalizing to new data. The other options are less effective: A and B apply fixed, arbitrary regularization values that may not be optimal, and D increases the number of neurons, which adds model capacity and tends to make overfitting worse rather than better.
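Concretely, such a job on the (legacy) AI Platform Training service is configured with a `HyperparameterSpec` in the job's YAML config. The sketch below assumes the trainer exposes `l2_reg` and `dropout` as command-line flags and reports a metric tagged `val_loss`; those names, and the ranges, are illustrative assumptions.

```yaml
# Hypothetical config passed via: gcloud ai-platform jobs submit training --config=...
trainingInput:
  hyperparameters:
    goal: MINIMIZE               # minimize the reported validation loss
    hyperparameterMetricTag: val_loss
    maxTrials: 20
    maxParallelTrials: 4
    params:
      - parameterName: l2_reg    # assumed trainer flag
        type: DOUBLE
        minValue: 0.0001
        maxValue: 0.1
        scaleType: UNIT_LOG_SCALE
      - parameterName: dropout   # assumed trainer flag
        type: DOUBLE
        minValue: 0.1
        maxValue: 0.5
        scaleType: UNIT_LINEAR_SCALE
```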

asked 18/09/2024
Paola Aguirre