Question 170 - MLS-C01 discussion


A health care company is planning to use neural networks to classify their X-ray images into normal and abnormal classes. The labeled data is divided into a training set of 1,000 images and a test set of 200 images. The initial training of a neural network model with 50 hidden layers yielded 99% accuracy on the training set, but only 55% accuracy on the test set.

What changes should the Specialist consider to solve this issue? (Choose three.)

A. Choose a higher number of layers
B. Choose a lower number of layers
C. Choose a smaller learning rate
D. Enable dropout
E. Include all the images from the test set in the training set
F. Enable early stopping
Suggested answer: B, D, F

Explanation:

The problem described in the question is a case of overfitting, where the neural network model performs well on the training data but poorly on the test data. This means that the model has learned the noise and specific patterns of the training data, but cannot generalize to new and unseen data. To solve this issue, the Specialist should consider the following changes:

Choose a lower number of layers: Reducing the number of layers can reduce the complexity and capacity of the neural network model, making it less prone to overfitting. A model with 50 hidden layers is likely too deep for the given data size and task. A simpler model with fewer layers can learn the essential features of the data without memorizing the noise.
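A rough parameter count makes the capacity mismatch concrete. The input and layer sizes below are illustrative assumptions (the question does not give image dimensions or layer widths), but they show how a 50-layer network dwarfs a 1,000-image training set:

```python
def mlp_param_count(layer_sizes):
    """Total weights + biases for a fully connected network with the
    given layer sizes (input, hidden..., output)."""
    return sum(
        in_dim * out_dim + out_dim  # weight matrix + bias vector
        for in_dim, out_dim in zip(layer_sizes, layer_sizes[1:])
    )

# Hypothetical 64x64 grayscale X-rays (4,096 inputs), 2 output classes.
deep = mlp_param_count([4096] + [256] * 50 + [2])    # 50 hidden layers
shallow = mlp_param_count([4096] + [256] * 3 + [2])  # 3 hidden layers

print(deep, shallow)  # millions of parameters vs. 1,000 training images
```

Even the shallow variant is dominated by its first layer, but the deep one carries tens of extra weight matrices that only help it memorize the training set.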

Enable dropout: Dropout is a regularization technique that randomly drops out some units in the neural network during training. This prevents the units from co-adapting too much and forces the model to learn more robust features. Dropout can improve the generalization and test performance of the model by reducing overfitting.
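A minimal NumPy sketch of inverted dropout, the variant most frameworks implement (the 0.5 rate is an assumption, not from the question):

```python
import numpy as np

def dropout(activations, rate=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability `rate` during
    training and rescale survivors so the expected activation is
    unchanged; at inference, pass activations through untouched."""
    if not training or rate == 0.0:
        return activations
    rng = rng or np.random.default_rng()
    mask = rng.random(activations.shape) >= rate  # keep with prob 1-rate
    return activations * mask / (1.0 - rate)

x = np.ones((4, 8))
out = dropout(x, rate=0.5, training=True, rng=np.random.default_rng(0))
# Surviving units are scaled to 2.0 so the expected value stays 1.0.
```

Because different random subsets of units are active on every forward pass, no unit can rely on a specific co-adapted partner, which is what pushes the network toward more robust features.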

Enable early stopping: Early stopping is another regularization technique that monitors the validation error during training and stops the training process when the validation error stops decreasing or starts increasing. This prevents the model from overtraining on the training data and reduces overfitting.
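The stopping rule itself can be sketched in a few lines. This stands in for a real training loop: it scans per-epoch validation losses and stops once the loss has not improved for `patience` consecutive epochs (the patience value is an assumption):

```python
def train_with_early_stopping(val_losses, patience=3):
    """Return the epoch index at which training would stop: when the
    validation loss has failed to improve for `patience` epochs."""
    best = float("inf")
    epochs_without_improvement = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                return epoch  # stop; restore weights from the best epoch
    return len(val_losses) - 1

# Validation loss bottoms out at epoch 3, then overfitting sets in.
stop = train_with_early_stopping([0.9, 0.7, 0.6, 0.55, 0.6, 0.65, 0.7])
```

In practice the best-epoch weights are checkpointed and restored at stop time, so the delivered model is the one with the lowest validation error rather than the last, overfit one.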

References:

Deep Learning - Machine Learning Lens

How to Avoid Overfitting in Deep Learning Neural Networks

How to Identify Overfitting Machine Learning Models in Scikit-Learn

asked 16/09/2024
Kevin Suckiel