NPTEL Introduction To Machine Learning Assignment 6 Answers

NPTEL Introduction To Machine Learning Week 6 Assignment 6 Answers: In this article, you will find the NPTEL Introduction To Machine Learning Week 6 Assignment 6 answers. Use "Ctrl+F" to find any question's answer. On mobile, tap the three dots in your browser and use the "Find" option there.


NPTEL Introduction To Machine Learning Week 6 Assignment 6 Answers 2022:

Q1. Which of the following properties are characteristic of decision trees?

  • Low bias 
  • High variance
  • Lack of smoothness of prediction surfaces
  • Unbounded parameter set

Q.2. Consider the following dataset:

What is the initial entropy of Malignant?

  • 0.543
  • 0.9798
  • 0.8732
  • 1
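The dataset table isn't reproduced above, but the calculation itself is standard: for a binary label, the entropy is H = −p·log₂(p) − (1−p)·log₂(1−p), where p is the fraction of Malignant samples. A minimal sketch (the 5-of-12 class split below is a made-up example for illustration, not the actual assignment data):

```python
import math

def binary_entropy(pos: int, neg: int) -> float:
    """Entropy (in bits) of a two-class label with `pos` positive
    and `neg` negative examples."""
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        p = count / total
        if p > 0:  # 0 * log2(0) is taken as 0
            h -= p * math.log2(p)
    return h

# Hypothetical class split: 5 malignant out of 12 samples.
print(round(binary_entropy(5, 7), 4))  # ≈ 0.9799
```

Plug in the actual Malignant counts from the dataset to reproduce the value asked for in the question.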


Q.3. For the same dataset, what is the info gain of Vaccination?

  • 0.4763
  • 0.2102
  • 0.1134 
  • 0.9355
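Information gain is the entropy of the label before the split minus the weighted average entropy of the children after splitting on the attribute. Since the dataset isn't shown here, the counts in this sketch are hypothetical, but the procedure is exactly what the question asks for:

```python
import math

def entropy(counts):
    """Entropy (bits) of a class distribution given as a list of counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def info_gain(parent, children):
    """parent: class counts before the split, e.g. [5, 7].
    children: one class-count list per attribute value."""
    total = sum(parent)
    weighted = sum(sum(ch) / total * entropy(ch) for ch in children)
    return entropy(parent) - weighted

# Hypothetical example: a binary attribute that separates the classes
# perfectly recovers the full parent entropy as gain.
print(info_gain([2, 2], [[2, 0], [0, 2]]))  # 1.0
```

Replace the counts with the Malignant/Benign tallies within each Vaccination value to get the gain for this dataset.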

Q.4. Consider the following statements:

Statement 1: Decision Trees are linear non-parametric models.

Statement 2: A decision tree may be used to explain the complex function learned by a neural network.

  • Both the statements are True. 
  • Statement 1 is True, but Statement 2 is False.
  • Statement 1 is False, but Statement 2 is True. 
  • Both the statements are False.

Q.5. Which of the following machine learning models can solve the XOR problem without any transformations on the input space?  

  • Linear Perceptron 
  • Neural Networks
  • Decision Trees
  • Logistic Regression
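No single linear boundary separates XOR, which is why linear models fail on it, but a depth-2 decision tree (two axis-aligned splits on the raw inputs) classifies it exactly. A hand-built sketch of such a tree:

```python
def xor_tree(x1: int, x2: int) -> int:
    """A depth-2 decision tree on the raw inputs: split on x1 first,
    then on x2 in each branch. No feature transformation is needed."""
    if x1 <= 0:
        return 1 if x2 > 0 else 0
    else:
        return 0 if x2 > 0 else 1

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), "->", xor_tree(x1, x2))
```

The four outputs are 0, 1, 1, 0, i.e. exactly XOR, without transforming the input space.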

Q.6. Which of the following is/are major advantages of decision trees over other supervised learning techniques? (Note that more than one choice may be correct.)

  • Theoretical guarantees of performance
  • Higher performance
  • Interpretability of classifier
  • More powerful in its ability to represent complex functions

Q.7. Consider a dataset with only one (categorical) attribute. Suppose there are q unordered values in this attribute. How many possible combinations are needed to find the best split point for building the decision tree classifier?

  • q^2
  • 2^(q−1) 
  • 2^(q−1) − 1
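For a categorical attribute with q unordered values, a binary split assigns each value to one of two sides; since the two sides are interchangeable and both must be non-empty, there are 2^(q−1) − 1 distinct splits. A brute-force check of this count, enumerating unordered pairs of complementary non-empty subsets:

```python
from itertools import combinations

def count_binary_splits(q: int) -> int:
    """Count distinct binary splits of q unordered categorical values."""
    values = range(q)
    splits = set()
    for r in range(1, q):  # left-side sizes 1 .. q-1
        for left in combinations(values, r):
            right = tuple(v for v in values if v not in left)
            # A split and its mirror image are the same split,
            # so store the pair of sides as an unordered set.
            splits.add(frozenset((frozenset(left), frozenset(right))))
    return len(splits)

for q in range(2, 7):
    assert count_binary_splits(q) == 2 ** (q - 1) - 1
print([count_binary_splits(q) for q in (2, 3, 4)])  # [1, 3, 7]
```

For q = 3, for example, the three splits are {A}|{B,C}, {B}|{A,C}, and {C}|{A,B}, matching 2^(3−1) − 1 = 3.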

These answers are provided for discussion purposes only; if any answer turns out to be wrong, please don't blame us. If you have any doubts or suggestions regarding any question, kindly comment. The solutions are provided by Chase2learn. This tutorial is only for discussion and learning purposes.

About NPTEL Introduction To Machine Learning Course: 

With the increased availability of data from varied sources, there has been increasing attention paid to data-driven disciplines such as analytics and machine learning. In this course we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms. If you have not registered for the exam, kindly register through https://examform.nptel.ac.in/

COURSE LAYOUT

The course structure and content covers, over a period of 12 weeks:
  • Week 0: Probability Theory, Linear Algebra, Convex Optimization – (Recap) 
  • Week 1: Introduction: Statistical Decision Theory – Regression, Classification, Bias Variance
  • Week 2: Linear Regression, Multivariate Regression, Subset Selection, Shrinkage Methods, Principal Component Regression, Partial Least squares
  • Week 3: Linear Classification, Logistic Regression, Linear Discriminant Analysis
  • Week 4: Perceptron, Support Vector Machines
  • Week 5: Neural Networks – Introduction, Early Models, Perceptron Learning, Backpropagation, Initialization, Training & Validation, Parameter Estimation – MLE, MAP, Bayesian Estimation
  • Week 6: Decision Trees, Regression Trees, Stopping Criterion & Pruning loss functions, Categorical Attributes, Multiway Splits, Missing Values, Decision Trees – Instability Evaluation Measures
  • Week 7: Bootstrapping & Cross Validation, Class Evaluation Measures, ROC curve, MDL, Ensemble Methods – Bagging, Committee Machines and Stacking, Boosting
  • Week 8: Gradient Boosting, Random Forests, Multi-class Classification, Naive Bayes, Bayesian Networks
  • Week 9: Undirected Graphical Models, HMM, Variable Elimination, Belief Propagation
  • Week 10: Partitional Clustering, Hierarchical Clustering, Birch Algorithm, CURE Algorithm, Density-based Clustering
  • Week 11: Gaussian Mixture Models, Expectation Maximization
  • Week 12: Learning Theory, Introduction to Reinforcement Learning, Optional videos (RL framework, TD learning, Solution Methods, Applications)
