Question 252 - MLS-C01 discussion


A data scientist is trying to improve the accuracy of a neural network classification model. The data scientist wants to run a large hyperparameter tuning job in Amazon SageMaker.

However, previous smaller tuning jobs on the same model often ran for several weeks. The data scientist wants to reduce the computation time required to run the tuning job.

Which actions will MOST reduce the computation time for the hyperparameter tuning job? (Select TWO.)

A. Use the Hyperband tuning strategy.
B. Increase the number of hyperparameters.
C. Set a lower value for the MaxNumberOfTrainingJobs parameter.
D. Use the grid search tuning strategy.
E. Set a lower value for the MaxParallelTrainingJobs parameter.
Suggested answer: A, C

Explanation:

The Hyperband tuning strategy is a multi-fidelity tuning strategy that dynamically reallocates resources to the most promising hyperparameter configurations. Hyperband uses both intermediate and final results of training jobs to stop under-performing jobs early and reallocate epochs to well-utilized hyperparameter configurations. According to AWS, Hyperband can provide up to three times faster hyperparameter tuning than other strategies. Setting a lower value for the MaxNumberOfTrainingJobs parameter also reduces computation time by capping the total number of training jobs the tuning job can launch, which avoids unnecessary or redundant training jobs that do not improve the objective metric.
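
For context, here is a minimal sketch of how both answers map onto the SageMaker Python SDK (v2): `strategy="Hyperband"` selects the Hyperband strategy, and `max_jobs` corresponds to the MaxNumberOfTrainingJobs resource limit. The image URI, IAM role, S3 paths, metric name, and hyperparameter ranges are placeholders for illustration, not values from the question.

```python
# Sketch only (SageMaker Python SDK v2); angle-bracketed values are placeholders.
from sagemaker.estimator import Estimator
from sagemaker.tuner import (
    ContinuousParameter,
    HyperparameterTuner,
    IntegerParameter,
)

estimator = Estimator(
    image_uri="<training-image-uri>",       # placeholder training image
    role="<sagemaker-execution-role-arn>",  # placeholder IAM role
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:accuracy",
    metric_definitions=[  # regex that scrapes the metric from the training logs
        {"Name": "validation:accuracy", "Regex": "val_acc=([0-9\\.]+)"}
    ],
    hyperparameter_ranges={
        "learning_rate": ContinuousParameter(1e-5, 1e-1),
        "batch_size": IntegerParameter(32, 512),
    },
    strategy="Hyperband",  # option A: early-stops under-performing jobs
    max_jobs=20,           # option C: caps MaxNumberOfTrainingJobs
    max_parallel_jobs=4,   # MaxParallelTrainingJobs; lowering it slows tuning (option E)
)

tuner.fit({"train": "s3://<bucket>/train", "validation": "s3://<bucket>/validation"})
```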

The other options are not effective ways to reduce computation time. Increasing the number of hyperparameters enlarges the dimensionality of the search space, which lengthens the tuning job and can degrade results. The grid search tuning strategy also increases computation time, because grid search exhaustively trains one job for every combination of hyperparameter values, which is expensive and inefficient for large search spaces. Setting a lower value for the MaxParallelTrainingJobs parameter reduces the number of training jobs that can run concurrently, which slows the tuning process and increases the overall waiting time.
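
To make the grid-search cost concrete, a back-of-the-envelope calculation (with made-up grid sizes, chosen only for illustration) shows how the job count multiplies with each added hyperparameter, whereas a tuning job capped by max_jobs stays fixed:

```python
# Illustrative arithmetic only: exhaustive grid search trains one job per
# combination, so the total job count is the product of the grid sizes.
grid_sizes = {"learning_rate": 10, "batch_size": 5, "dropout": 4, "optimizer": 3}

total_jobs = 1
for size in grid_sizes.values():
    total_jobs *= size

print(total_jobs)  # 10 * 5 * 4 * 3 = 600 jobs; a max_jobs cap of 20 is 30x fewer
```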

References:

* How Hyperparameter Tuning Works

* Best Practices for Hyperparameter Tuning

* HyperparameterTuner

* Amazon SageMaker Automatic Model Tuning now provides up to three times faster hyperparameter tuning with Hyperband

asked 16/09/2024 by Marek Siwek