ExamGecko
Question 18 - H13-311_V3.5 discussion


Which of the following are common gradient descent methods?

A. Batch gradient descent (BGD)
B. Mini-batch gradient descent (MBGD)
C. Multi-dimensional gradient descent (MDGD)
D. Stochastic gradient descent (SGD)
Suggested answer: A, B, D

Explanation:

Gradient descent is a core optimization technique in machine learning, particularly for training neural networks and deep learning models. The common gradient descent methods include:

Batch Gradient Descent (BGD): Updates the model parameters once per epoch, after computing the gradient over the entire dataset.

Mini-batch Gradient Descent (MBGD): Updates the model parameters using a small batch of data, combining the benefits of both batch and stochastic gradient descent.

Stochastic Gradient Descent (SGD): Updates the model parameters for each individual data point, leading to faster but noisier updates.

Multi-dimensional gradient descent (MDGD) is not a recognized gradient descent method in AI or machine learning, so option C is incorrect.
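The three correct variants differ only in how much data feeds each parameter update. A minimal sketch in NumPy, fitting y = 2x + 1 with mean squared error (the `fit` helper and all names here are illustrative, not from any particular library):

```python
import numpy as np

# Noiseless toy data for y = 2x + 1
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 2.0 * X[:, 0] + 1.0

def fit(X, y, batch_size, lr=0.1, epochs=200, seed=0):
    """Gradient descent on MSE for a linear model y = w*x + b.

    batch_size == len(X)      -> batch gradient descent (BGD)
    batch_size == 1           -> stochastic gradient descent (SGD)
    1 < batch_size < len(X)   -> mini-batch gradient descent (MBGD)
    """
    shuffle = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    n = len(X)
    for _ in range(epochs):
        idx = shuffle.permutation(n)           # reshuffle each epoch
        for start in range(0, n, batch_size):
            sel = idx[start:start + batch_size]
            xb, yb = X[sel, 0], y[sel]
            err = (w * xb + b) - yb
            # Gradients of mean squared error over the current batch
            w -= lr * 2 * np.mean(err * xb)
            b -= lr * 2 * np.mean(err)
    return w, b

w_bgd,  b_bgd  = fit(X, y, batch_size=len(X))  # BGD: one update per epoch
w_mbgd, b_mbgd = fit(X, y, batch_size=32)      # MBGD: several updates per epoch
w_sgd,  b_sgd  = fit(X, y, batch_size=1)       # SGD: one update per sample
```

All three recover w ≈ 2 and b ≈ 1 here, but per epoch BGD performs one smooth update, SGD performs 200 noisy ones, and MBGD sits in between, which is why mini-batching is the usual default in practice.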

asked 26/09/2024
Darin Ambrose