Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization - Week 1 Quiz Answers

Quiz - Practical aspects of Deep Learning

1. If you have 10,000 examples, how would you split the train/dev/test set? Choose the best option.

  • 33% train. 33% dev. 33% test
  • 98% train. 1% dev. 1% test
  • 60% train. 20% dev. 20% test

2. The dev and test sets should:

  • Be identical to each other (same (x,y) pairs)
  • Come from the same distribution
  • Have the same number of examples
  • Come from different distributions

3. If your Neural Network model seems to have high variance, which of the following would be promising things to try?

  • Get more training data
  • Make the Neural Network deeper
  • Get more test data
  • Increase the number of units in each hidden layer
  • Add regularization

4. Working on a model to classify bananas and oranges, your classifier gets a training set error of 0.1% and a dev set error of 11%. Which two of the following are true?

  • The model is overfitting the dev set.
  • The model is overfitting the train set.
  • The model has a high variance.
  • The model has a very high bias.

5. What is weight decay?

  • Gradual corruption of the weights in the neural network if it is trained on noisy data.
  • A regularization technique (such as L2 regularization) that results in gradient descent shrinking the weights on every iteration.
  • The process of gradually decreasing the learning rate during training.
  • A technique to avoid vanishing gradients by imposing a ceiling on the values of the weights.
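To make the weight-decay option concrete, here is a minimal sketch (plain numpy; the values and names `m`, `lambd`, `W`, and `dW` are illustrative stand-ins, not from the quiz) of how L2 regularization makes gradient descent shrink the weights on every iteration:

    import numpy as np

    # Illustrative values; in practice dW comes from backprop.
    m = 64                          # number of training examples
    lambd = 0.7                     # L2 regularization hyperparameter (lambda)
    learning_rate = 0.01
    W = np.random.randn(5, 4)       # weights of one layer
    dW = np.random.randn(5, 4)      # gradient of the unregularized cost w.r.t. W

    # L2 regularization adds (lambd / m) * W to the gradient, so the update is:
    W = W - learning_rate * (dW + (lambd / m) * W)
    # ...equivalently: W = (1 - learning_rate * lambd / m) * W - learning_rate * dW,
    # i.e. the weights "decay" by a constant factor on every iteration.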

6. The regularization hyperparameter must be set to zero during testing to avoid getting random results. True/False?

  • True
  • False

7. Which of the following are true about dropout?

  • In practice, it eliminates units of each layer with a probability of 1 - keep_prob.
  • In practice, it eliminates units of each layer with a probability of keep_prob.
  • It helps to reduce the bias of a model.
  • It helps to reduce overfitting.
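For context on keep_prob, here is a minimal sketch of inverted dropout applied to one layer's activations (numpy; the array names and sizes are illustrative):

    import numpy as np

    keep_prob = 0.8                     # probability of keeping a unit
    A = np.random.randn(4, 10)          # one layer's activations (units x examples)

    # Each unit is dropped with probability 1 - keep_prob.
    D = np.random.rand(*A.shape) < keep_prob
    A = A * D                           # zero out the dropped units
    A = A / keep_prob                   # "inverted" scaling keeps E[A] unchanged

    # At test time no units are dropped; thanks to the inverted scaling,
    # no extra compensation is needed then.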

8. Decreasing the parameter keep_prob from (say) 0.6 to 0.4 will likely cause the following:

  • Reducing the regularization effect.
  • Increasing the regularization effect.
  • Causing the neural network to have a higher variance.

9. Which of the following actions increase the regularization of a model? (Check all that apply)

  • Increase the value of keep_prob in dropout.
  • Decrease the value of the hyperparameter lambda.
  • Normalize the data.
  • Increase the value of the hyperparameter lambda.
  • Use data augmentation.
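Data augmentation regularizes by enlarging the effective training set. A minimal sketch, assuming images are stored as a numpy array of shape (examples, height, width, channels); the batch and labels here are made up for illustration:

    import numpy as np

    # Illustrative image batch: (examples, height, width, channels).
    X = np.random.rand(32, 64, 64, 3)
    y = np.random.randint(0, 2, size=32)   # e.g. 0 = banana, 1 = orange

    # Horizontal flips create new, label-preserving training examples.
    X_flip = X[:, :, ::-1, :]              # reverse the width axis
    X_aug = np.concatenate([X, X_flip], axis=0)
    y_aug = np.concatenate([y, y], axis=0)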

10. Which of the following is the correct expression to normalize the input X?
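In the lectures, the input is normalized by subtracting the per-feature mean mu = (1/m) * sum(x(i)) and then dividing by the variance sigma^2 = (1/m) * sum(x(i)^2), with sigma^2 computed after mean subtraction; note that standard practice divides by the standard deviation sigma instead. A minimal numpy sketch, assuming the course's (features, examples) layout and illustrative data:

    import numpy as np

    # Illustrative data in the course's (features, examples) layout.
    X = np.random.randn(3, 100) * 5 + 2
    m = X.shape[1]

    mu = np.sum(X, axis=1, keepdims=True) / m           # per-feature mean
    X = X - mu                                          # zero-center
    sigma2 = np.sum(X ** 2, axis=1, keepdims=True) / m  # per-feature variance
    X = X / sigma2                                      # the lecture divides by sigma^2
    # Standard practice divides by np.sqrt(sigma2) (the standard deviation).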
