Note: We try to give our best, so please share this with your friends too.
NPTEL Introduction To Machine Learning – IITKGP Week 7 Assignment 7 Answers 2022:
Q1. Which of the following option is/are correct regarding the benefits of ensemble model?
- Better performance
- More generalized model
- Better interpretability
Q.2. In AdaBoost, we give more weight to points that were misclassified in previous iterations. Suppose we introduce a limit or cap on the weight that any point can take (for example, a restriction that prevents any point's weight from exceeding a value of 10). Which of the following would be an effect of such a modification?
- We may observe the performance of the classifier degrade as the number of stages increases.
- It makes the final classifier robust to outliers.
- It may result in lower overall performance.
- None of these.
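To see what such a cap does, here is a minimal, hypothetical sketch of one discrete-AdaBoost re-weighting step with an optional cap applied to the raw point weights. The function name and the choice to cap before normalization are illustrative assumptions, not part of the question.

```python
import math

def adaboost_reweight(weights, misclassified, error, cap=None):
    """One discrete-AdaBoost re-weighting step (illustrative sketch).

    weights       -- current (unnormalized) point weights
    misclassified -- booleans: was each point misclassified this round?
    error         -- weighted error of this round's classifier (0 < error < 0.5)
    cap           -- hypothetical upper limit on any single point's weight
    """
    alpha = 0.5 * math.log((1 - error) / error)  # this round's classifier weight
    new_weights = []
    for w, wrong in zip(weights, misclassified):
        w = w * math.exp(alpha if wrong else -alpha)  # up-weight mistakes
        if cap is not None:
            w = min(w, cap)  # the modification described in the question
        new_weights.append(w)
    return new_weights
```

With a cap in place, persistently hard points (often outliers) can no longer dominate later rounds, which is why such a cap tends to make the final classifier more robust to outliers, while possibly lowering overall performance on clean data.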
Q.3. Which among the following are some of the differences between bagging and boosting?
- In bagging we use the same classification algorithm for training on each sample of the data, whereas in boosting, we use different classification algorithms on the different training data samples.
- Bagging is easy to parallelize whereas boosting is inherently a sequential process.
- In bagging we typically use sampling with replacement whereas in boosting, we typically use weighted sampling techniques.
- In comparison with the performance of a base classifier on a particular dataset, bagging will generally not increase the error, whereas boosting may lead to an increase in the error.
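The sampling difference between the two methods can be sketched in a few lines of Python; the function names here are illustrative.

```python
import random

def bagging_sample(data, rng):
    """Bagging: draw len(data) points uniformly WITH replacement (a bootstrap)."""
    return [rng.choice(data) for _ in data]

def boosting_sample(data, weights, rng):
    """Boosting-style weighted sampling: points with larger weights
    (e.g. previously misclassified ones) are drawn more often."""
    return rng.choices(data, weights=weights, k=len(data))
```

Note that each bootstrap sample in bagging can be drawn (and its model trained) in parallel, whereas the weights used in boosting depend on the previous round's mistakes, which is what makes boosting inherently sequential.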
Q.4. What is the VC dimension of the class of spheres in 3-dimensional space?
- 3
- 4
- 5
- 6
Q.5. Considering the AdaBoost algorithm, which among the following statements is true?
- In each stage, we try to train a classifier which makes accurate predictions on any subset of the data points where the subset size is at least half the size of the data set.
- In each stage, we try to train a classifier which makes accurate predictions on a subset of the data points where the subset contains more of the data points which were misclassified in earlier stages.
- The weight assigned to an individual classifier depends upon the number of data points correctly classified by the classifier.
- The weight assigned to an individual classifier depends upon the weighted sum error of misclassified points for that classifier.
Q.6. Suppose the VC dimension of a hypothesis space is 6. Which of the following are true?
- At least one set of 6 points can be shattered by the hypothesis space.
- Two sets of 6 points can be shattered by the hypothesis space.
- All sets of 6 points can be shattered by the hypothesis space.
- No set of 7 points can be shattered by the hypothesis space.
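Shattering can be checked by brute force for small hypothesis classes. As a hypothetical one-dimensional example, closed intervals [a, b] have VC dimension 2: at least one set of 2 points is shattered, but no set of 3 points is (the labelling +, −, + is never realizable), which mirrors the "at least one set" semantics this question is probing.

```python
def shattered_by_intervals(points):
    """Return True if closed intervals [a, b] can realize every one of
    the 2^n possible +/- labellings of `points` (i.e. shatter them)."""
    # Enough candidate endpoints: the points themselves plus values
    # just outside their range.
    candidates = sorted(points) + [min(points) - 1, max(points) + 1]
    achievable = {
        tuple(a <= x <= b for x in points)
        for a in candidates
        for b in candidates
    }
    return len(achievable) == 2 ** len(points)
```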
Q.7. Ensembles will yield bad results when there is significant diversity among the models. True or False?
- True
- False
Q.8. Which of the following algorithms is not an ensemble learning algorithm?
- Random Forest
- Adaboost
- Gradient Boosting
- Decision Trees
Q.9. Which of the following can be true for selecting base learners for an ensemble?
- Different learners can come from the same algorithm with different hyperparameters
- Different learners can come from different algorithms
- Different learners can come from different training spaces
- All of the above
Q.10. Generally, an ensemble method works better if the individual base models have ___
Note: the individual models have accuracy greater than 50%.
- Less correlation among predictions
- High correlation among predictions
- Correlation does not have an impact on the ensemble output
- None of the above.
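Why low correlation helps: if the base models err independently, a majority vote is right far more often than any single model, whereas perfectly correlated models gain nothing from voting. A small sketch under the idealized assumption of fully independent base learners, each correct with probability p:

```python
from math import comb

def majority_vote_accuracy(p, n):
    """Probability that a majority of n independent base learners
    (each correct with probability p) votes for the right answer.
    Assumes n is odd so there are no ties."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))
```

With p = 0.6, a single model is right 60% of the time, but 11 independent such models voting together are right over 75% of the time; with perfectly correlated models, the ensemble stays at 60%.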
These answers are provided only for discussion purposes; if any answer turns out to be wrong, please don't blame us. If you have any doubts or suggestions regarding any question, kindly comment. The solutions are provided by Chase2learn. This tutorial is for discussion and learning purposes only.
About the NPTEL Introduction To Machine Learning – IITKGP Course:
This course provides a concise introduction to the fundamental concepts in machine learning and popular machine learning algorithms. We will cover the standard and most popular supervised learning algorithms, including linear regression, logistic regression, decision trees, k-nearest neighbour, an introduction to Bayesian learning and the naïve Bayes algorithm, support vector machines and kernels, and neural networks with an introduction to Deep Learning.
If you have not registered for the exam, kindly register through https://examform.nptel.ac.in/
COURSE LAYOUT
The course structure and content cover, over a period of 8 weeks:
- Week 1: Introduction: Basic definitions, types of learning, hypothesis space and inductive bias, evaluation, cross-validation
- Week 2: Linear regression, Decision trees, overfitting
- Week 3: Instance based learning, Feature reduction, Collaborative filtering based recommendation
- Week 4: Probability and Bayes learning
- Week 5: Logistic Regression, Support Vector Machine, Kernel function and Kernel SVM
- Week 6: Neural network: Perceptron, multilayer network, backpropagation, introduction to deep neural network
- Week 7: Computational learning theory, PAC learning model, Sample complexity, VC Dimension, Ensemble learning
- Week 8: Clustering: k-means, adaptive hierarchical clustering, Gaussian mixture model