ExamGecko

Question 48 - DSA-C02 discussion


What is the risk with tuning hyper-parameters using a test dataset?

A. Model will overfit the test set
B. Model will underfit the test set
C. Model will overfit the training set
D. Model will perform balanced
Suggested answer: A

Explanation:

The model will not generalize well to unseen data because it overfits the test set. Tuning hyper-parameters against the test set lets them adapt to that particular sample, so re-using the same test set to estimate performance produces an overestimate. The test set should be reserved for final evaluation only, never for parameter tuning.

Using a separate validation set for tuning, and the test set only for the final measurement, yields an unbiased, realistic estimate of performance.
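A toy simulation makes the overestimate concrete. Below (an illustrative sketch, not part of the question) the labels are pure coin flips, so no model can truly beat 50% accuracy; each "hyper-parameter setting" is just a differently seeded random classifier. Picking the setting that scores best on the test set still finds one that looks skilful there, while genuinely unseen data exposes the inflation.

```python
import random

# Labels are pure coin flips: true accuracy of any predictor is 50%.
def make_labels(n, seed):
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

def accuracy(preds, labels):
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

test_labels = make_labels(50, seed=1)
fresh_labels = make_labels(50, seed=2)   # genuinely unseen data

# "Tune" by picking whichever setting scores best on the test set.
best_acc, best_seed = -1.0, None
for trial in range(50):
    seed = 100 + trial
    acc = accuracy(make_labels(50, seed), test_labels)
    if acc > best_acc:
        best_acc, best_seed = acc, seed

# Re-evaluating the "winner" on fresh data reveals the overestimate.
fresh_acc = accuracy(make_labels(50, best_seed), fresh_labels)
print(f"accuracy on the test set used for tuning: {best_acc:.2f}")
print(f"accuracy on genuinely unseen data:        {fresh_acc:.2f}")
```

Because the winning setting was selected for its luck on the test set, its score there is biased upward; the fresh-data score hovers around the true 50%.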

What are hyper-parameters?

Hyper-parameters are parameters whose values control the learning process and determine the values of the model parameters that a learning algorithm ends up learning. Their values cannot be estimated from the data; they must be set before training.

Example: The number of clusters in clustering, the number of hidden layers in a neural network, and the depth of a tree are common examples of hyper-parameters.
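The clustering example can be sketched in a few lines. In this minimal 1-D k-means (a toy illustration, not a production implementation), `k` is the hyper-parameter fixed before training, while the centroid positions are the model parameters learned from the data:

```python
import random

def kmeans_1d(data, k, iters=20, seed=0):
    """Tiny 1-D k-means. k is a hyper-parameter chosen up front;
    the centroid positions are parameters learned from the data."""
    rng = random.Random(seed)
    centroids = rng.sample(data, k)          # random initial centroids
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda i: abs(x - centroids[i]))
            clusters[nearest].append(x)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
centroids = kmeans_1d(data, k=2)   # k is set by us, not learned
print(sorted(centroids))
```

Nothing in the data tells the algorithm that `k` should be 2; that choice sits outside the learning loop, which is exactly what makes it a hyper-parameter.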

What is hyper-parameter tuning?

Hyper-parameter tuning is the process of choosing the combination of hyper-parameters that maximizes model performance. It works by running multiple trials: each trial is a complete execution of your training application with values for your chosen hyper-parameters, set within the limits you specify. Once finished, this process gives you the set of hyper-parameter values best suited for the model to give optimal results.
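The trial loop described above can be sketched as a simple grid search. Here `validation_score` is a hypothetical stand-in for one full trial; in practice each call would train the model with that combination and score it on the validation set (not the test set). The toy scoring surface and its peak at `depth=4, learning_rate=0.1` are assumptions made for illustration:

```python
from itertools import product

def validation_score(depth, learning_rate):
    # Hypothetical stand-in for "train the model, score it on the
    # validation set". This toy surface peaks at depth=4, lr=0.1.
    return 1.0 - abs(depth - 4) * 0.05 - abs(learning_rate - 0.1)

# The limits you specify: candidate values for each hyper-parameter.
grid = {"depth": [2, 4, 8], "learning_rate": [0.01, 0.1, 0.5]}

best_score, best_params = float("-inf"), None
for depth, lr in product(grid["depth"], grid["learning_rate"]):
    score = validation_score(depth, lr)   # one trial per combination
    if score > best_score:
        best_score = score
        best_params = {"depth": depth, "learning_rate": lr}

print(best_params, round(best_score, 3))
```

Each combination is one trial; the loop returns the combination with the best validation score, and only that final model would then be evaluated once on the held-out test set.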

asked 23/09/2024
Andrew Staton