The Nuts and Bolts of Machine Learning: Coursera Week 4 Quiz Answers
Test your knowledge: Additional supervised learning techniques
1. Tree-based learning is a type of unsupervised machine learning that performs classification and regression tasks.
- True
- False
2. Fill in the blank: Similar to a flow chart, a _____ is a classification model that represents various solutions available to solve a given problem based on the possible outcomes of each solution.
- decision tree
- Poisson distribution
- linear regression
- binary logistic regression
3. In a decision tree, which node is the location where the first decision is made?
- Leaf
- Branch
- Root
- Decision
4. In tree-based learning, how is a split determined?
- By the amount of leaves present
- By which variables and cut-off values offer the most predictive power
- By the level of balance present among the predictions made by the model
- By the number of decisions required before arriving at a final prediction
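As context for the split question above: one common way to measure a cut-off's predictive power is the weighted reduction in Gini impurity. A minimal sketch in plain Python, assuming binary 0/1 labels and a single numeric feature (the function names here are illustrative, not from the course materials):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = labels.count(1) / n
    return 1.0 - p1 ** 2 - (1.0 - p1) ** 2

def best_split(xs, ys):
    """Return the cut-off value that yields the lowest weighted Gini impurity."""
    best_cut, best_impurity = None, float("inf")
    for cut in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= cut]
        right = [y for x, y in zip(xs, ys) if x > cut]
        weighted = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if weighted < best_impurity:
            best_cut, best_impurity = cut, weighted
    return best_cut

# Example: the labels separate cleanly at x <= 2
print(best_split([1, 2, 3, 4], [0, 0, 1, 1]))  # 2
```

Real tree learners repeat this search over every candidate variable, not just one, and pick the variable/cut-off pair with the most predictive power.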
Test your knowledge: Tune tree-based models
5. Fill in the blank: The hyperparameter max depth is used to limit the depth of a decision tree, which is the number of levels between the _____ and the farthest node away from it.
- decision node
- root node
- leaf node
- first split
6. What tuning technique can a data professional use to confirm that a model achieves its intended purpose?
- Classifier
- Min samples leaf
- Grid search
- Decision tree
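To make the grid-search option concrete: a grid search exhaustively evaluates every hyperparameter combination and keeps the best-scoring one. A minimal sketch, assuming a hypothetical `evaluate` function that returns a validation score (in practice you would score a fitted model on held-out data):

```python
from itertools import product

def grid_search(param_grid, evaluate):
    """Try every combination in param_grid; return the best params and score."""
    names = list(param_grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical scorer whose optimum is max_depth=5, min_samples_leaf=2
def evaluate(params):
    return -abs(params["max_depth"] - 5) - abs(params["min_samples_leaf"] - 2)

grid = {"max_depth": [3, 5, 7], "min_samples_leaf": [1, 2, 4]}
best, score = grid_search(grid, evaluate)
print(best)  # {'max_depth': 5, 'min_samples_leaf': 2}
```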
7. During model validation, the validation dataset must be combined with test data in order to function properly.
- True
- False
8. Fill in the blank: Cross validation involves splitting training data into different combinations of _____, on which the model is trained.
- banks
- parcels
- tiers
- folds
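The fold idea above can be sketched in a few lines of plain Python: split the training data into k folds, then let each fold serve once as the validation set while the model trains on the rest (the interleaved slicing here is just one simple way to form the folds):

```python
def k_folds(data, k):
    """Split data into k roughly equal folds; yield (training, validation) pairs."""
    folds = [data[i::k] for i in range(k)]
    for i in range(k):
        validation = folds[i]
        training = [x for j, fold in enumerate(folds) if j != i for x in fold]
        yield training, validation

data = list(range(10))
for train, val in k_folds(data, 5):
    print(len(train), len(val))  # 8 2, five times
```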
Test your knowledge: Bagging
9. Ensemble learning is most effective when the outputs are aggregated from models that follow the exact same methodology, all using the same dataset.
- True
- False
10. What are some of the benefits of ensemble learning? Select all that apply.
- The predictions have lower variance than other standalone models.
- It requires few base learners trained on the same dataset.
- The predictions have less bias than other standalone models.
- It combines the results of many models to help make more reliable predictions.
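The aggregation step in bagging can be sketched in plain Python: each base learner trains on a bootstrap sample (drawn with replacement) of the training data, and their class predictions are combined by majority vote (these helper names are illustrative, not from the course):

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Draw a sample with replacement, the same size as the original data."""
    return [rng.choice(data) for _ in data]

def majority_vote(predictions):
    """Aggregate the class predicted by each base learner for one sample."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical base learners vote on a single sample
print(majority_vote(["cat", "dog", "cat"]))  # cat
print(len(bootstrap_sample([1, 2, 3, 4], random.Random(0))))  # 4
```

Because each base learner sees a different bootstrap sample, their individual errors tend to cancel out when aggregated, which is why the combined predictions have lower variance than any standalone model.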