Question 33 - H13-311_V3.5 discussion

Global gradient descent, stochastic gradient descent, and batch gradient descent are all variants of the gradient descent algorithm. Which of the following statements about these algorithms is true?

A. The batch gradient algorithm can solve the problem of local minimum value.

B. The global gradient algorithm can find the minimum value of the loss function.

C. The stochastic gradient algorithm can find the minimum value of the loss function.

D. The convergence process of the global gradient algorithm is time-consuming.
Suggested answer: D

Explanation:

The global gradient descent algorithm computes the gradient over the entire dataset before each parameter update, so each step is accurate but the convergence process is time-consuming, especially on large datasets; this is why D is correct. Stochastic gradient descent updates the parameters after each individual sample, which converges faster but with noisier updates, while batch (mini-batch) gradient descent updates on small subsets of the data as a compromise between the two. None of these variants can guarantee finding the global minimum of a non-convex loss function, where local minima may exist, so A, B, and C are false.
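The trade-off described above can be sketched with a toy one-parameter regression: the three variants differ only in how many samples each update uses. This is an illustrative sketch (the data, learning rate, and step counts are assumptions, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3*x + noise; the loss is mean squared error over a batch.
X = rng.uniform(-1, 1, size=200)
y = 3.0 * X + rng.normal(scale=0.1, size=200)

def gradient(w, xb, yb):
    # d/dw of mean((w*x - y)^2) over the given batch.
    return 2.0 * np.mean((w * xb - yb) * xb)

def train(w, batch_size, steps=200, lr=0.1):
    # Each step draws `batch_size` samples and takes one gradient step.
    for _ in range(steps):
        idx = rng.choice(len(X), size=batch_size, replace=False)
        w -= lr * gradient(w, X[idx], y[idx])
    return w

w_global = train(0.0, batch_size=len(X))  # full dataset per step: accurate but each step is costly
w_sgd    = train(0.0, batch_size=1)       # one sample per step: cheap but noisy updates
w_batch  = train(0.0, batch_size=32)      # mini-batch: the usual compromise

print(w_global, w_sgd, w_batch)  # all end up near the true slope 3.0
```

On this convex loss all three variants approach the same optimum; the global (full-batch) version simply pays the cost of touching all 200 samples on every step, which is the slowness the correct answer D refers to. On a non-convex loss, none of them is guaranteed to reach the global minimum.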

asked 26/09/2024
de jong arjen