Question 34 - H13-311_V3.5 discussion
Sigmoid, tanh, and softsign activation functions cannot avoid vanishing gradient problems when the network is deep.

A. TRUE
B. FALSE
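The claim can be checked numerically: all three activations are saturating, and their derivatives are bounded by 1 (sigmoid's by 0.25), so backpropagation through many layers multiplies many small factors. The sketch below (a minimal NumPy illustration, not part of the original question) computes the derivative peaks and shows how a per-layer gradient factor decays with depth:

```python
import numpy as np

# Derivatives of the three saturating activations named in the question.
def d_sigmoid(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)                   # peak 0.25 at x = 0

def d_tanh(x):
    return 1.0 - np.tanh(x) ** 2           # peak 1.0 at x = 0

def d_softsign(x):
    return 1.0 / (1.0 + np.abs(x)) ** 2    # peak 1.0 at x = 0

x = np.linspace(-5.0, 5.0, 1001)
for name, d in [("sigmoid", d_sigmoid), ("tanh", d_tanh),
                ("softsign", d_softsign)]:
    print(f"max |{name}'| = {d(x).max():.3f}")

# Backprop multiplies one such derivative per layer; away from x = 0
# every factor is well below 1, so the product shrinks with depth.
depth = 50
print("sigmoid factor after 50 layers at x=1:", d_sigmoid(1.0) ** depth)
```

Since the chained factor collapses toward zero for deep networks, the statement is TRUE for these activations; this is why deep networks typically use non-saturating activations such as ReLU instead.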