Huawei H13-311_V3.5 Practice Test - Questions Answers, Page 4

List of questions
Question 31

The derivative of the Rectified Linear Unit (ReLU) activation function in the positive interval is always:
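A quick way to check this is to compute the derivative directly. The sketch below (scalar version, illustrative only) shows that ReLU's gradient is 1 everywhere on the positive interval:

```python
# Minimal sketch of ReLU and its derivative (scalar version).
def relu(x):
    return max(0.0, x)

def relu_grad(x):
    # 1 everywhere on the positive interval, 0 on the negative interval
    # (the derivative is undefined at exactly x = 0).
    return 1.0 if x > 0 else 0.0

print([relu_grad(x) for x in (0.5, 2.0, 100.0)])  # [1.0, 1.0, 1.0]
```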
Question 32

In a fully-connected structure, a hidden layer with 1000 neurons is used to process an image with a resolution of 100 x 100. Which of the following is the correct number of parameters?
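The arithmetic behind this question can be worked out directly: every neuron in a fully-connected layer has one weight per input pixel. Whether the answer also includes biases depends on the convention the exam uses, so both totals are shown:

```python
# Parameter count for a fully-connected layer: each of the 1000 neurons
# connects to all 100 * 100 = 10,000 input pixels.
inputs = 100 * 100          # pixels in the 100 x 100 image
neurons = 1000
weights = inputs * neurons  # 10,000,000 weights
biases = neurons            # 1,000 biases (counted only under some conventions)
print(weights)              # 10000000
print(weights + biases)     # 10001000
```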
Question 33

Global gradient descent, stochastic gradient descent, and batch gradient descent are all variants of the gradient descent algorithm. Which of the following statements about these algorithms is true?
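The key distinction among the three variants is how many samples contribute to each parameter update. A minimal sketch, using an assumed 1-D least-squares example rather than anything from the exam:

```python
# Sketch: the three gradient descent variants differ only in how many
# samples feed each update. Toy data drawn from y = 3x.
data = [(x, 3.0 * x) for x in range(1, 101)]

def grad(w, batch):
    # Gradient of the mean squared error 0.5 * mean((w*x - y)^2) w.r.t. w
    return sum((w * x - y) * x for x, y in batch) / len(batch)

w = 0.0
g_global = grad(w, data)      # global (full-batch): all 100 samples per update
g_sgd = grad(w, data[:1])     # stochastic: a single sample per update
g_batch = grad(w, data[:16])  # (mini-)batch: a small subset per update
```

Global descent gives an exact but expensive gradient, stochastic descent a cheap but noisy one, and batch descent a compromise between the two.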
Question 34

The sigmoid, tanh, and softsign activation functions cannot avoid the vanishing gradient problem when the network is deep.
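The reason these saturating activations vanish in deep networks is that their derivatives are bounded below 1, so backpropagation multiplies many small factors together. A short illustration using sigmoid, whose derivative peaks at 0.25:

```python
import math

# Sketch: sigmoid's derivative never exceeds 0.25, so the backpropagated
# gradient shrinks multiplicatively through many layers.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid_grad(0.0))  # 0.25, the maximum possible value
print(0.25 ** 20)         # upper bound on the gradient scale after 20 layers
```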
Question 35

Single-layer perceptrons and logistic regression are linear classifiers that can only process linearly separable data.
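The claim can be seen concretely with the classic perceptron learning rule: on linearly separable data such as AND it converges, while on non-separable data such as XOR no single-layer weight setting exists. A sketch under those assumptions:

```python
# Sketch: the perceptron learning rule on a linearly separable problem (AND).
# On non-separable data such as XOR, this single-layer rule never converges.
def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # perceptron update: w += lr*err*x
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
preds = [1 if w[0]*x1 + w[1]*x2 + b > 0 else 0 for (x1, x2), _ in AND]
print(preds)  # [0, 0, 0, 1]
```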
Question 36

Nesterov is a variant of the momentum optimizer.
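The relationship between the two optimizers is easiest to see side by side: Nesterov keeps the momentum update but evaluates the gradient at a look-ahead point. A hedged sketch (learning rate and momentum values are illustrative, not from the exam):

```python
# Sketch: classic momentum vs. Nesterov momentum update rules.
def momentum_step(w, v, grad_fn, lr=0.1, mu=0.9):
    v = mu * v - lr * grad_fn(w)
    return w + v, v

def nesterov_step(w, v, grad_fn, lr=0.1, mu=0.9):
    # Identical to momentum except the gradient is evaluated at the
    # look-ahead point w + mu * v.
    v = mu * v - lr * grad_fn(w + mu * v)
    return w + v, v

# Minimize f(w) = w^2 (gradient 2w) starting from w = 1.0.
w, v = 1.0, 0.0
for _ in range(200):
    w, v = nesterov_step(w, v, lambda w: 2.0 * w)
```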
Question 37

Convolutional neural networks (CNNs) cannot be used to process text data.
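The statement is worth probing, since TextCNN-style models do apply CNNs to text by sliding 1-D filters over token sequences. A minimal sketch of that mechanism, with toy scalar "embeddings" standing in for real token vectors:

```python
# Sketch: a 1-D convolution over a token sequence, the mechanism that
# lets CNNs process text data (toy scalar features stand in for embeddings).
def conv1d(seq, kernel):
    k = len(kernel)
    return [sum(seq[i + j] * kernel[j] for j in range(k))
            for i in range(len(seq) - k + 1)]

tokens = [0.2, 0.5, -0.1, 0.8, 0.3]   # toy embedded tokens
feature_map = conv1d(tokens, [1.0, 0.0, -1.0])  # width-3 filter
print(feature_map)
```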
Question 38

Which of the following activation functions may cause the vanishing gradient problem?
Question 39

Which of the following are use cases of generative adversarial networks?
Question 40

DRAG DROP
Match the input and output of a generative adversarial network (GAN).
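For reference when matching, the I/O contract of a GAN is: the generator maps random noise to a generated (fake) sample, and the discriminator maps a sample to the probability that it is real. A toy sketch with assumed stand-in functions, not a real model:

```python
import math

# Toy stand-ins (illustrative only) for the GAN input/output mapping:
#   generator:     random noise  -> generated (fake) sample
#   discriminator: sample        -> probability the sample is real
def generator(noise):
    return 2.0 * noise + 1.0                 # fake sample from a noise value

def discriminator(sample):
    return 1.0 / (1.0 + math.exp(-sample))   # "realness" score in (0, 1)

fake = generator(0.5)
score = discriminator(fake)
```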