Question 15 - Professional Machine Learning Engineer discussion


During batch training of a neural network, you notice that there is an oscillation in the loss. How should you adjust your model to ensure that it converges?

A. Increase the size of the training batch
B. Decrease the size of the training batch
C. Increase the learning rate hyperparameter
D. Decrease the learning rate hyperparameter
Suggested answer: D

Explanation:

Oscillation in the loss during batch training of a neural network means that the model is overshooting the optimal point of the loss function and bouncing back and forth across it. This can prevent the model from converging to the minimum loss value. One of the main causes is that the learning rate hyperparameter, which controls the size of the steps the model takes along the gradient, is too high. Decreasing the learning rate therefore lets the model take smaller, more precise steps and avoid oscillation. This is a common technique for improving the stability and performance of neural network training.
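The effect is easy to reproduce on a toy problem. The sketch below (not from the original question; the function and learning rates are illustrative assumptions) runs plain gradient descent on f(x) = x^2, where the gradient is 2x. With a high learning rate the iterates overshoot the minimum at 0 and flip sign every step, which is the oscillation pattern described above; with a small learning rate they shrink toward 0 smoothly.

```python
# Toy illustration: gradient descent on f(x) = x^2 with two learning rates.
def gradient_descent(lr, x0=1.0, steps=10):
    """Return the sequence of iterates for f(x) = x^2 (gradient 2x)."""
    xs = [x0]
    x = x0
    for _ in range(steps):
        grad = 2 * x       # f'(x) = 2x
        x = x - lr * grad  # standard gradient-descent update
        xs.append(x)
    return xs

# lr = 0.95: each step multiplies x by (1 - 2*0.95) = -0.9,
# so the iterate overshoots 0 and changes sign every step (oscillation).
high = gradient_descent(lr=0.95)

# lr = 0.10: each step multiplies x by 0.8, so the loss decreases smoothly.
low = gradient_descent(lr=0.10)

print(high[:4])
print(low[:4])
```

With lr = 0.95 the iterates still shrink slowly in magnitude, but the sign flips make the loss curve zigzag; push the rate a little higher and the updates diverge outright, which is why decreasing the learning rate (option D) is the standard fix.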

Reference: Interpreting Loss Curves

Is the learning rate the only reason for training-loss oscillation after a few epochs?

asked 18/09/2024
Russell Bartsch