Question 31 - H13-311_V3.5
The derivative of the Rectified Linear Unit (ReLU) activation function in the positive interval is always:

A. 0
B. 0.5
C. 1
D. Variable
Suggested answer: C

Explanation:

The Rectified Linear Unit (ReLU) activation function is defined as f(x) = max(0, x). In the positive interval, where x > 0, the derivative of ReLU is always 1. This makes ReLU popular in deep learning networks because it helps avoid the vanishing gradient problem during backpropagation, ensuring efficient gradient flow.
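A minimal NumPy sketch (illustrative, not part of the original question; the helper names relu and relu_derivative are ours) showing that the gradient is 1 everywhere on the positive interval and 0 on the negative interval:

import numpy as np

def relu(x):
    # ReLU activation: f(x) = max(0, x)
    return np.maximum(0, x)

def relu_derivative(x):
    # f'(x) = 1 for x > 0, 0 for x < 0
    # (undefined at x = 0; taking 0 there is a common convention)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))             # [0.  0.  0.  0.5 2. ]
print(relu_derivative(x))  # [0. 0. 0. 1. 1.]  <- always 1 for x > 0

Because the derivative is a constant 1 for all positive inputs, repeated multiplication of gradients through many layers does not shrink them toward zero, which is why ReLU mitigates the vanishing gradient problem that affects sigmoid-style activations.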
