Question 30 - H13-311_V3.5 discussion


Which of the following is the activation function used in the hidden layers of the standard recurrent neural network (RNN) structure?

A. ReLU
B. Softmax
C. Tanh
D. Sigmoid
Suggested answer: C

Explanation:

In standard (vanilla) Recurrent Neural Networks (RNNs), Tanh is the activation function commonly used in the hidden layers. Tanh squashes each hidden unit to the range (-1, 1), which keeps the recurrent hidden state bounded as it is fed back at every time step, while still providing the non-linearity the network needs to learn temporal patterns.

While other activation functions such as Sigmoid can be used, Tanh is preferred in many RNNs because it is zero-centered and covers the wider range (-1, 1). ReLU is generally used in feed-forward networks, since its unbounded output can cause the recurrent state to grow without limit, and Softmax is typically applied in the output layer for classification problems.
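As a minimal sketch of the mechanism described above (not taken from the exam material, with illustrative weight shapes and values), a single vanilla-RNN time step applies Tanh to a combination of the current input and the previous hidden state:

```python
import numpy as np

def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    """One vanilla-RNN time step: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h)."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 2))   # input-to-hidden weights (hypothetical shapes)
W_hh = rng.normal(size=(3, 3))   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(3)                # hidden bias

h = np.zeros(3)                  # initial hidden state
for x in [np.array([1.0, -0.5]), np.array([0.2, 0.7])]:
    h = rnn_step(x, h, W_xh, W_hh, b_h)

# Tanh keeps every hidden unit strictly inside (-1, 1),
# no matter how many time steps are unrolled.
assert np.all(np.abs(h) < 1.0)
```

Because the same hidden state is multiplied by `W_hh` at every step, a bounded, zero-centered activation like Tanh helps keep the recurrence numerically stable, which is the practical reason it became the standard choice here.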

HCIA AI

Deep Learning Overview: Describes the architecture of RNNs, highlighting the use of Tanh as the standard activation function.

AI Development Framework: Discusses the various activation functions used across different neural network architectures.

asked 26/09/2024
yusuf sivrikaya