NPTEL Introduction To Machine Learning Assignment 8 Answers

NPTEL Introduction To Machine Learning Week 8 Assignment 8 Answer: In this article, you will find the NPTEL Introduction To Machine Learning Week 8 Assignment 8 answers. Use "Ctrl+F" to find the answer to any question. On mobile, tap the three dots in your browser to open the "Find" option and search for any question.


Note: We are trying to give our best, so please share this with your friends as well.

NPTEL Introduction To Machine Learning Week 8 Assignment 8 Answers 2022:

Q1. The figure below shows a Bayesian Network with 9 variables, all of which are binary.

Which of the following is/are always true for the above Bayesian Network?

Answer:

Q.2. Consider the following data for 20 budget phones, 30 mid-range phones, and 20 high-end phones:

Consider a phone with 2 SIM card slots and NFC but no 5G compatibility. Using the Naive Bayes method, calculate the probabilities of this phone being a budget phone, a mid-range phone, and a high-end phone. What is the correct ordering of the phone types, from highest to lowest probability?

  • Budget, Mid-Range, High End 
  • Budget, High End, Mid-Range
  • Mid-Range, High End, Budget
  • High End, Mid-Range, Budget
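As a rough guide to how such a comparison is done, here is a minimal Naive Bayes sketch in Python. The per-class likelihoods below are placeholders, not the values from the question's table (which is not reproduced here); read the actual fractions off the table before comparing.

```python
# Minimal sketch of the Naive Bayes comparison for the phone example.
# Priors come from the class sizes given in the question (20/30/20 out of 70).
priors = {"budget": 20 / 70, "mid": 30 / 70, "high": 20 / 70}

# likelihoods[cls] = P(2 SIM | cls) * P(NFC | cls) * P(no 5G | cls)
# Placeholder values -- replace with the fractions from the question's table.
likelihoods = {
    "budget": 0.9 * 0.1 * 0.9,
    "mid":    0.7 * 0.5 * 0.5,
    "high":   0.3 * 0.9 * 0.1,
}

# Unnormalised posteriors; sorting them gives the ordering asked for.
scores = {cls: priors[cls] * likelihoods[cls] for cls in priors}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)
```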


Q.3. Consider the following dataset where outlook, temperature, humidity, and wind are independent features, and play is the dependent feature.

Find the probability that the student will not play given that x = (Outlook=sunny, Temperature=66, Humidity=90, Windy=True) using the Naive Bayes method. (Assume the continuous features are represented as Gaussian distributions).
  • 0.0001367
  • 0.0000358
  • 0.0000236
  • 1
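A minimal sketch of the Gaussian Naive Bayes computation for the "no play" class, assuming the class-conditional statistics of the classic weather dataset (prior 5/14, P(sunny | no) = 3/5, P(windy | no) = 3/5, Temperature ~ N(74.6, 7.9²), Humidity ~ N(86.2, 9.7²)), which this question appears to be based on. If the table in the question differs, substitute its values.

```python
import math

def gaussian_pdf(x, mean, std):
    """Gaussian density used for continuous features in Naive Bayes."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (math.sqrt(2 * math.pi) * std)

# Assumed statistics for the "play = no" class (standard weather dataset);
# estimate the real ones from the table given in the question.
p_no       = 5 / 14                                   # prior P(play = no)
p_sunny_no = 3 / 5                                    # P(Outlook = sunny | no)
p_windy_no = 3 / 5                                    # P(Windy = True | no)
temp_pdf   = gaussian_pdf(66, mean=74.6, std=7.9)     # density of Temp = 66 given "no"
hum_pdf    = gaussian_pdf(90, mean=86.2, std=9.7)     # density of Humidity = 90 given "no"

score_no = p_no * p_sunny_no * p_windy_no * temp_pdf * hum_pdf
print(score_no)   # with these assumed statistics, roughly 1.4e-4
```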

Q.4. Which among Gradient Boosting and AdaBoost is less susceptible to outliers considering their respective loss functions?

  • AdaBoost
  • Gradient Boost
  • On average, both are equally susceptible
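The question turns on how steeply each method's loss grows on badly misclassified points. Below is a small illustrative comparison (not the official solution): AdaBoost minimises the exponential loss, which blows up for large negative margins, whereas gradient boosting can use losses such as squared error (or robust ones like Huber or absolute error) that grow far more slowly.

```python
import numpy as np

# Compare loss growth on increasingly "outlying" points.
margins = np.array([-1.0, -2.0, -4.0, -8.0])   # badly misclassified examples

exp_loss     = np.exp(-margins)                # AdaBoost's exponential loss
squared_loss = (1 - margins) ** 2              # a squared-error-style loss

for m, e, s in zip(margins, exp_loss, squared_loss):
    print(f"margin={m:5.1f}  exp_loss={e:10.1f}  squared_loss={s:6.1f}")
```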

Q.5. How do you prevent overfitting in random forest models?  

  • Increasing Tree Depth.
  • Increasing the number of variables sampled at each split. 
  • Increasing the number of trees.
  • All of the above.
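As a quick, hedged sanity check of the standard reasoning: deeper trees and more variables sampled per split make each tree more complex, while simply adding trees averages predictions and does not worsen overfitting. A minimal scikit-learn sketch on synthetic data (illustrative only, not part of the course material):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

# Watch train vs. validation accuracy as the number of trees grows.
for n_trees in (5, 50, 500):
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    rf.fit(X_tr, y_tr)
    print(n_trees, rf.score(X_tr, y_tr), rf.score(X_va, y_va))
```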

Q.6. A dataset with two classes is plotted below.

Does the data satisfy the Naive Bayes assumption?
  • Yes
  • No
  • The given data is insufficient
  • None of these

Q.7. Ensembling in a random forest classifier helps in achieving:

  • reduction of bias error 
  • reduction of variance error
  • reduction of data dimension 
  • none of the above
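A minimal simulation of the standard bagging argument: averaging B weakly correlated tree predictions drives the ensemble variance down toward ρσ² (it behaves roughly like ρσ² + (1 − ρ)σ²/B), while the bias of each tree is unchanged. The values of ρ and σ² below are illustrative, not taken from the course.

```python
import numpy as np

# Simulated, correlated "tree predictions": a shared component plus
# independent per-tree noise, so that Var(one tree) = sigma2 and
# Corr(two trees) = rho.
rng = np.random.default_rng(0)
sigma2, rho = 1.0, 0.2

for B in (1, 10, 100):
    shared = rng.normal(0, np.sqrt(rho * sigma2), size=100_000)
    trees  = shared[:, None] + rng.normal(0, np.sqrt((1 - rho) * sigma2), size=(100_000, B))
    ensemble = trees.mean(axis=1)
    print(B, ensemble.var())   # approaches rho * sigma2 = 0.2 as B grows
```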


These answers are provided for discussion purposes only; if any answer turns out to be wrong, please don't blame us. If you have any doubts or suggestions regarding any question, kindly comment. The solutions are provided by Chase2learn. This tutorial is only for discussion and learning purposes.

About NPTEL Introduction To Machine Learning Course: 

With the increased availability of data from varied sources, there has been increasing attention paid to data-driven disciplines such as analytics and machine learning. In this course we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms. If you have not registered for the exam, kindly register through https://examform.nptel.ac.in/

COURSE LAYOUT

The course structure and content covers, over a period of 12 weeks:
  • Week 0: Probability Theory, Linear Algebra, Convex Optimization – (Recap) 
  • Week 1: Introduction: Statistical Decision Theory – Regression, Classification, Bias Variance
  • Week 2: Linear Regression, Multivariate Regression, Subset Selection, Shrinkage Methods, Principal Component Regression, Partial Least squares
  • Week 3: Linear Classification, Logistic Regression, Linear Discriminant Analysis
  • Week 4: Perceptron, Support Vector Machines
  • Week 5: Neural Networks – Introduction, Early Models, Perceptron Learning, Backpropagation, Initialization, Training & Validation, Parameter Estimation – MLE, MAP, Bayesian Estimation
  • Week 6: Decision Trees, Regression Trees, Stopping Criterion & Pruning loss functions, Categorical Attributes, Multiway Splits, Missing Values, Decision Trees – Instability Evaluation Measures
  • Week 7: Bootstrapping & Cross Validation, Class Evaluation Measures, ROC curve, MDL, Ensemble Methods – Bagging, Committee Machines and Stacking, Boosting
  • Week 8: Gradient Boosting, Random Forests, Multi-class Classification, Naive Bayes, Bayesian Networks
  • Week 9: Undirected Graphical Models, HMM, Variable Elimination, Belief Propagation
  • Week 10: Partitional Clustering, Hierarchical Clustering, Birch Algorithm, CURE Algorithm, Density-based Clustering
  • Week 11: Gaussian Mixture Models, Expectation Maximization
  • Week 12: Learning Theory, Introduction to Reinforcement Learning, Optional videos (RL framework, TD learning, Solution Methods, Applications)
Sharing Is Caring
