Question 77 - MLS-C01 discussion


A web-based company wants to improve the conversion rate on its landing page. Using a large historical dataset of customer visits, the company has repeatedly trained a multi-class deep learning network algorithm on Amazon SageMaker. However, there is an overfitting problem: the training data shows 90% accuracy in predictions, while the test data shows only 70% accuracy.

The company needs to boost the generalization of its model before deploying it into production in order to maximize conversions of visits to purchases.

Which action is recommended to provide the HIGHEST accuracy model for the company's test and validation data?

A. Increase the randomization of training data in the mini-batches used in training.
B. Allocate a higher proportion of the overall data to the training dataset.
C. Apply L1 or L2 regularization and dropouts to the training.
D. Reduce the number of layers and units (or neurons) from the deep learning network.
Suggested answer: C

Explanation:

Regularization and dropouts are techniques that can help reduce overfitting in deep learning models. Overfitting occurs when the model learns too much from the training data and fails to generalize well to new data.

Regularization adds a penalty term to the loss function that penalizes the model for having large or complex weights. This prevents the model from memorizing the noise or irrelevant features in the training data. L1 and L2 are two types of regularization that differ in how they calculate the penalty term: L1 regularization uses the absolute value of the weights, while L2 regularization uses the square of the weights.

Dropouts are another technique that randomly drops out some units or neurons from the network during training. This creates a thinner network that is less prone to overfitting. Dropouts also act as a form of ensemble learning, where multiple sub-models are combined to produce a better prediction.

By applying regularization and dropouts to the training, the web-based company can improve the generalization and accuracy of its deep learning model on the test and validation data.

References:

Regularization: A video that explains the concept and benefits of regularization in deep learning.

Dropout: A video that demonstrates how dropout works and why it helps reduce overfitting.
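As a concrete illustration (not part of the original question), the penalty terms and dropout described in the explanation can be sketched in plain Python. The function names and the choice of inverted dropout are illustrative assumptions; real training code would use a framework such as TensorFlow or PyTorch, which SageMaker supports:

```python
import random

def l1_penalty(weights, lam):
    # L1 regularization: lambda times the sum of absolute weight values.
    return lam * sum(abs(w) for w in weights)

def l2_penalty(weights, lam):
    # L2 regularization: lambda times the sum of squared weight values.
    return lam * sum(w * w for w in weights)

def dropout(activations, p, training=True):
    # Inverted dropout: during training, zero each unit with probability p
    # and scale the survivors by 1/(1-p) so the expected activation is
    # unchanged; at inference time the layer is a no-op.
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0 for a in activations]

# Hypothetical example: the regularized loss is the data loss plus a penalty
# that grows with the magnitude of the weights.
weights = [0.5, -1.2, 3.0]
data_loss = 0.8  # assumed value for illustration
total_loss = data_loss + l2_penalty(weights, lam=0.01)
```

Note that the penalty term only discourages large weights; it is added to the loss before backpropagation, so the optimizer trades prediction error against weight magnitude.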

asked 16/09/2024 by Ntombifuthi Shabangu