Exam Professional Machine Learning Engineer topic 1 question 290 discussion - ExamTopics


AI Summary

Question: Hyperparameter Tuning in Keras with Vertex AI

The question presents a scenario where two regression models (linear regression and a deep neural network) are developed in a Python module using Keras. The goal is to use Vertex AI's hypertuning service to find the best model architecture and hyperparameters across 100 trials, minimizing training loss and maximizing performance.

Options and Analysis

  • A. Run one hypertuning job with 100 trials. num_hidden_layers is a conditional hyperparameter based on training_method; learning_rate is non-conditional.
  • B. Run two separate hypertuning jobs (50 trials each). Compare performance on a validation set and select the best hyperparameters based on training loss.
  • C. Run one job (50 trials) to select the architecture based on training loss. Then, further hypertune the selected architecture for another 50 trials.
  • D. Run one job (100 trials). Both num_hidden_layers and learning_rate are conditional hyperparameters based on training_method.

The suggested answer is A. This approach explores the full search space within a single job, so the tuner can share information across all 100 trials and make the best use of the budget. num_hidden_layers is only meaningful when training_method selects the DNN, so it belongs as a conditional child of training_method, while learning_rate affects training in either case (both architectures are trained with gradient descent in Keras) and should stay non-conditional. Options B and C split the 100 trials into two smaller searches, so neither search sees the full budget and better configurations may be missed. Option D also makes learning_rate conditional, which would stop it from being tuned for the linear regression model even though it is relevant to both architectures.

Suggested Answer: A
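
To make option A concrete, here is a minimal sketch of how such a study could be configured with the Vertex AI Python SDK (google-cloud-aiplatform). The project, container image, value ranges, and worker pool settings are illustrative assumptions, and the conditional-parameter keywords (conditional_parameter_spec, parent_values) should be checked against the current SDK documentation before use.

```python
# Sketch only: assumes the Vertex AI Python SDK's conditional-parameter support.
# Project, bucket, image URI, value ranges, and machine settings are placeholders.
from google.cloud import aiplatform
from google.cloud.aiplatform import hyperparameter_tuning as hpt

aiplatform.init(project="my-project", location="us-central1",
                staging_bucket="gs://my-staging-bucket")  # placeholders

# num_hidden_layers is only meaningful when training_method == "dnn",
# so it is attached as a conditional (child) parameter of training_method.
num_hidden_layers_spec = hpt.IntegerParameterSpec(
    min=1, max=8, scale="linear",
    parent_values=["dnn"],  # assumed keyword for conditional parameters
)

parameter_spec = {
    # Parent hyperparameter: which architecture the module trains.
    "training_method": hpt.CategoricalParameterSpec(
        values=["linear_regression", "dnn"],
        conditional_parameter_spec={"num_hidden_layers": num_hidden_layers_spec},  # assumed keyword
    ),
    # learning_rate applies to both architectures, so it stays non-conditional.
    "learning_rate": hpt.DoubleParameterSpec(min=1e-4, max=1e-1, scale="log"),
}

custom_job = aiplatform.CustomJob(
    display_name="regression-trainer",
    worker_pool_specs=[{
        "machine_spec": {"machine_type": "n1-standard-4"},
        "replica_count": 1,
        "container_spec": {"image_uri": "gcr.io/my-project/trainer:latest"},  # placeholder
    }],
)

hpt_job = aiplatform.HyperparameterTuningJob(
    display_name="regression-hpt",
    custom_job=custom_job,
    metric_spec={"training_loss": "minimize"},  # must match the tag reported by the trainer
    parameter_spec=parameter_spec,
    max_trial_count=100,       # the full 100-trial budget in one job
    parallel_trial_count=5,
)
hpt_job.run()
```

The key point is that a single job owns all 100 trials while the conditional spec prevents the service from wasting trials on num_hidden_layers values for the linear regression architecture.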


You developed a Python module by using Keras to train a regression model. You developed two model architectures, linear regression and deep neural network (DNN), within the same module. You are using the training_method argument to select one of the two methods, and you are using the learning_rate and num_hidden_layers arguments in the DNN. You plan to use Vertex AI's hypertuning service with a budget to perform 100 trials. You want to identify the model architecture and hyperparameter values that minimize training loss and maximize model performance. What should you do?

  • A. Run one hypertuning job for 100 trials. Set num_hidden_layers as a conditional hyperparameter based on its parent hyperparameter training_method, and set learning_rate as a non-conditional hyperparameter.
  • B. Run two separate hypertuning jobs, a linear regression job for 50 trials, and a DNN job for 50 trials. Compare their final performance on a common validation set, and select the set of hyperparameters with the least training loss.
  • C. Run one hypertuning job with training_method as the hyperparameter for 50 trials. Select the architecture with the lowest training loss, and further hypertune it and its corresponding hyperparameters for 50 trials.
  • D. Run one hypertuning job for 100 trials. Set num_hidden_layers and learning_rate as conditional hyperparameters based on their parent hyperparameter training_method.
Suggested Answer: A
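
For completeness, a rough sketch of what the training module's entry point might look like, assuming the argument names from the question and the cloudml-hypertune helper library for metric reporting. The model-building details, the load_data helper, and the "training_loss" metric tag are illustrative assumptions; the tag must match the metric_spec used in the tuning job.

```python
# Sketch only: argument names follow the question; model code and data loading are illustrative.
import argparse

import hypertune  # cloudml-hypertune, reports trial metrics back to Vertex AI
from tensorflow import keras


def build_model(training_method, learning_rate, num_hidden_layers, n_features):
    if training_method == "linear_regression":
        layers = [keras.layers.Dense(1, input_shape=(n_features,))]
    else:  # "dnn"
        layers = [keras.layers.Input(shape=(n_features,))]
        layers += [keras.layers.Dense(64, activation="relu") for _ in range(num_hidden_layers)]
        layers += [keras.layers.Dense(1)]
    model = keras.Sequential(layers)
    model.compile(optimizer=keras.optimizers.Adam(learning_rate), loss="mse")
    return model


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--training_method", choices=["linear_regression", "dnn"], required=True)
    parser.add_argument("--learning_rate", type=float, default=1e-3)
    parser.add_argument("--num_hidden_layers", type=int, default=2)  # only set for DNN trials
    args = parser.parse_args()

    x_train, y_train = load_data()  # hypothetical data-loading helper
    model = build_model(args.training_method, args.learning_rate,
                        args.num_hidden_layers, x_train.shape[1])
    history = model.fit(x_train, y_train, epochs=10, verbose=0)

    # Report the final training loss so Vertex AI can rank this trial.
    tuner = hypertune.HyperTune()
    tuner.report_hyperparameter_tuning_metric(
        hyperparameter_metric_tag="training_loss",
        metric_value=history.history["loss"][-1],
    )


if __name__ == "__main__":
    main()
```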
