NPTEL Introduction to Machine Learning Week 4 Assignment 4 Answers: In this article, you will find the answers to NPTEL Introduction to Machine Learning Week 4 Assignment 4. Use "Ctrl+F" to find any question's answer. On mobile, tap the three dots in your browser to get a "Find" option, and use it to jump to any question.
Note: We are doing our best; please share this answer link with other students as well.
NPTEL Introduction to Machine Learning Week 4 Assignment 4 Answers 2022:
- True
- False
Q.2. Consider a linear SVM trained with n labeled points in R² without slack penalties, resulting in k = 2 support vectors, where n > 100. By removing one labeled training point and retraining the SVM classifier, what is the maximum possible number of support vectors in the resulting solution? (A small empirical sketch follows the options.)
- 1
- 2
- 3
- n-1
- n
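The question can be explored empirically. Below is a minimal sketch on an assumed toy dataset (the assignment's data is not given): two separable clusters in R², with a very large C approximating the no-slack, hard-margin setting, counting support vectors before and after removing one training point.

```python
# Hypothetical sketch: count support vectors before and after removing one
# training point. The toy dataset and seed are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 120
# Two well-separated Gaussian clusters in R^2; a very large C approximates
# the hard-margin (no slack penalty) SVM from the question.
X = np.vstack([rng.normal(-3.0, 0.5, size=(n // 2, 2)),
               rng.normal(+3.0, 0.5, size=(n // 2, 2))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

clf = SVC(kernel="linear", C=1e6).fit(X, y)
print("support vectors before removal:", len(clf.support_))

# Drop one of the current support vectors and retrain.
idx = clf.support_[0]
X2, y2 = np.delete(X, idx, axis=0), np.delete(y, idx)
clf2 = SVC(kernel="linear", C=1e6).fit(X2, y2)
print("support vectors after removal:", len(clf2.support_))
```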
Q.3. Which of the following are valid kernel functions?
- (1 + <x, x′>)^d
- tanh(K1 <x, x′> + K2)
- exp(−γ ||x − x′||²)
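For context, "valid" here means a Mercer kernel: it must produce a positive semi-definite Gram matrix on every finite set of points. The sketch below runs that check numerically; the sample points and the tanh parameters K1, K2 are assumptions, and a non-negative spectrum on sampled points is evidence rather than a proof (the tanh kernel in particular is known to be valid only for certain parameter choices).

```python
# Numerical sanity check: a valid (Mercer) kernel yields a positive
# semi-definite Gram matrix. Sample points and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))  # arbitrary sample points

def gram(kernel, X):
    # Build the Gram matrix K[i, j] = kernel(x_i, x_j).
    return np.array([[kernel(a, b) for b in X] for a in X])

kernels = {
    "(1 + <x, x'>)^3":        lambda a, b: (1 + a @ b) ** 3,
    "tanh(0.5 <x, x'> + 1)":  lambda a, b: np.tanh(0.5 * (a @ b) + 1.0),
    "exp(-0.1 ||x - x'||^2)": lambda a, b: np.exp(-0.1 * np.sum((a - b) ** 2)),
}

for name, k in kernels.items():
    # A clearly negative minimum eigenvalue (beyond numerical noise) rules
    # the kernel out as PSD for these points and parameters.
    eigs = np.linalg.eigvalsh(gram(k, X))
    print(f"{name}: min eigenvalue = {eigs.min():.2e}")
```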
Q.4. Consider the following dataset:
Which of these is not a support vector when using a Support Vector Classifier with a polynomial kernel with degree = 3, C = 1, and gamma = 0.1?
(We recommend using sklearn to solve this question; a minimal sketch follows the options.)
- 3
- 1
- 9
- 10
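A minimal sketch of the recommended sklearn workflow, assuming the question's dataset (not reproduced above) has been loaded into arrays X and y, with points numbered 1 to n as in the options:

```python
# Sketch only: X and y must come from the assignment's dataset, which is
# not reproduced in this article.
from sklearn.svm import SVC

# X, y = ...  # load the dataset from the question here

clf = SVC(kernel="poly", degree=3, C=1, gamma=0.1).fit(X, y)
# support_ holds 0-based indices of the support vectors; add 1 to compare
# with the 1-based point numbers in the options.
print("support vectors (1-based):", clf.support_ + 1)
```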
Q.5. Consider an SVM with a second-order polynomial kernel. Kernel 1 maps each input data point x to K1(x) = [x, x²]. Kernel 2 maps each input data point x to K2(x) = [3x, 3x²]. Assume the hyperparameters are fixed. Which of the following options is true? (A small numerical comparison follows the options.)
- The margin obtained using K2(x) will be larger than the margin obtained using K1(x).
- The margin obtained using K2(x) will be smaller than the margin obtained using K1(x).
- The margin obtained using K2(x) will be the same as the margin obtained using K1(x).
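One way to build intuition is sketched below on an assumed toy dataset: since K2's feature map is exactly 3 times K1's, we can train a (near) hard-margin linear SVM on each explicit feature set and compare the geometric margin 2/||w||.

```python
# Illustrative sketch on assumed toy data: compare hard-margin widths when
# every feature is scaled by a constant factor.
import numpy as np
from sklearn.svm import SVC

x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([1, 0, 0, 1])

phi1 = np.column_stack([x, x ** 2])  # K1: x -> [x, x^2]
phi2 = 3 * phi1                      # K2: x -> [3x, 3x^2]

for name, phi in [("K1", phi1), ("K2", phi2)]:
    clf = SVC(kernel="linear", C=1e6).fit(phi, y)  # large C ~ hard margin
    w = clf.coef_.ravel()
    print(f"{name}: margin = {2 / np.linalg.norm(w):.3f}")
```

On this toy data the K2 margin comes out exactly three times the K1 margin, matching the general fact that uniformly scaling every feature by a constant c scales the optimal hard margin by c.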
Q.6. Train a linear Perceptron classifier on the modified iris dataset. Report the best classification accuracy for the l1 and elasticnet penalty terms. (A starter sketch follows the options.)
- 0.82, 0.64
- 0.90, 0.71
- 0.84, 0.82
- 0.78, 0.64
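A starter sketch, with sklearn's built-in iris standing in for the assignment's modified iris dataset (an assumption, as are the split and seed, so the printed accuracies need not match the options):

```python
# Starter sketch: the built-in iris dataset is a stand-in for the
# assignment's modified dataset; split and seed are assumptions.
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for penalty in ("l1", "elasticnet"):
    clf = Perceptron(penalty=penalty, random_state=0).fit(X_tr, y_tr)
    print(penalty, "accuracy:", clf.score(X_te, y_te))
```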
Q.7. Train an SVM classifier on the modified iris dataset. We encourage you to explore the impact of varying different hyperparameters of the model; specifically, try different kernels and the associated hyperparameters. As part of the assignment, train a model with the following hyperparameters: a poly kernel, gamma = 0.4, a one-vs-rest classifier, and no feature normalization. (A starter sketch follows the options.)
- 0.98
- 0.96
- 0.92
- 0.94
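A starter sketch for the stated setting, again with the built-in iris as a stand-in for the modified dataset (an assumption):

```python
# Starter sketch: poly kernel, gamma=0.4, one-vs-rest, no feature
# normalization. Dataset, split, and seed are assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = OneVsRestClassifier(SVC(kernel="poly", gamma=0.4)).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```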
These answers are provided for discussion purposes only; if any answer turns out to be wrong, please do not blame us. If you have any doubts or suggestions regarding any question, kindly comment. The solution is provided by Chase2learn. This tutorial is intended for discussion and learning purposes only.
About NPTEL Introduction to Machine Learning Course:
With the increased availability of data from varied sources, there has been increasing attention paid to data-driven disciplines such as analytics and machine learning. In this course we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.
COURSE LAYOUT
The course structure and content, over a period of 12 weeks:
Week 0: Probability Theory, Linear Algebra, Convex Optimization (Recap)
Week 1: Introduction: Statistical Decision Theory – Regression, Classification, Bias Variance
Week 2: Linear Regression, Multivariate Regression, Subset Selection, Shrinkage Methods, Principal Component Regression, Partial Least Squares
Week 3: Linear Classification, Logistic Regression, Linear Discriminant Analysis
Week 4: Perceptron, Support Vector Machines
Week 5: Neural Networks – Introduction, Early Models, Perceptron Learning, Backpropagation, Initialization, Training & Validation, Parameter Estimation – MLE, MAP, Bayesian Estimation
Week 6: Decision Trees, Regression Trees, Stopping Criterion & Pruning, Loss Functions, Categorical Attributes, Multiway Splits, Missing Values, Decision Trees – Instability, Evaluation Measures
Week 7: Bootstrapping & Cross Validation, Class Evaluation Measures, ROC Curve, MDL, Ensemble Methods – Bagging, Committee Machines and Stacking, Boosting
Week 8: Gradient Boosting, Random Forests, Multi-class Classification, Naive Bayes, Bayesian Networks
Week 9: Undirected Graphical Models, HMM, Variable Elimination, Belief Propagation
Week 10: Partitional Clustering, Hierarchical Clustering, Birch Algorithm, CURE Algorithm, Density-based Clustering
Week 11: Gaussian Mixture Models, Expectation Maximization
Week 12: Learning Theory, Introduction to Reinforcement Learning, Optional videos (RL framework, TD learning, Solution Methods, Applications)