# NPTEL Introduction to Machine Learning Assignment 8 Answers 2023

Hello NPTEL Learners! In this article, you will find the NPTEL Introduction to Machine Learning Assignment 8 (Week 8) Answers 2023. All the answers are provided below as a reference to help students. Don't look straight at the solutions; first try to solve the questions yourself, and if you face any difficulty, then look at the solutions.


Note: We are trying to give our best, so please share this with your friends as well.

## NPTEL Introduction to Machine Learning Assignment 8 Answers 2023:


#### Q.1. The Naive Bayes classifier makes the assumption that the ________ are independent given the _________.

• features, class labels
• class labels, features
• features, data points
• there is no such assumption
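
As a quick refresher on what the conditional-independence assumption buys you, here is a minimal sketch in plain Python (the probabilities are toy numbers chosen for illustration, not from any real dataset):

```python
# Naive Bayes assumes the features are independent given the class label,
# so the class-conditional joint likelihood factorizes per feature.

# Toy conditional probabilities for two binary features given class y = "spam"
p_f1_given_spam = 0.8   # P(feature1 = 1 | y = spam)
p_f2_given_spam = 0.6   # P(feature2 = 1 | y = spam)

# Under the naive assumption:
# P(f1 = 1, f2 = 1 | spam) = P(f1 = 1 | spam) * P(f2 = 1 | spam)
joint_likelihood = p_f1_given_spam * p_f2_given_spam
print(round(joint_likelihood, 2))  # 0.48
```

Without this assumption, estimating the full joint distribution of the features for each class would require exponentially many parameters.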

• no
• yes

#### Q.3. A major problem of using the one vs. rest multi-class classification approach is:

• class imbalance
• increased time complexity
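
To see why class imbalance is the usual concern, consider this back-of-the-envelope sketch (the class count and sizes are toy assumptions): one-vs-rest turns a k-class problem into k binary problems, and even perfectly balanced original classes become lopsided binary problems.

```python
# One-vs-rest: each binary classifier pits one class against the other k - 1.
# Even with balanced original classes, each binary problem is imbalanced.

k = 10               # number of classes (toy assumption)
n_per_class = 100    # examples per class (toy assumption)

positives = n_per_class            # the "one" class
negatives = (k - 1) * n_per_class  # the "rest" classes combined

print(positives, negatives)  # 100 vs 900, i.e. a 1:9 imbalance
```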

• 8
• 10
• 9
• 5

#### Q.5. In boosting, the weights of data points that were misclassified are __________ as training progresses.

• decreased
• increased
• first decreased and then increased
• kept unchanged
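
A minimal AdaBoost-style weight update illustrates the mechanism (the initial weights and error are toy values; this is a sketch of the standard update, not any particular library's implementation):

```python
import math

# AdaBoost-style update: misclassified points have their weights multiplied
# by exp(alpha) > 1, i.e. increased, so later learners focus on them.

weights = [0.25, 0.25, 0.25, 0.25]           # uniform initial weights
misclassified = [False, True, False, False]  # toy result of a weak learner

error = sum(w for w, m in zip(weights, misclassified) if m)  # 0.25
alpha = 0.5 * math.log((1 - error) / error)                  # learner weight

new_weights = [
    w * math.exp(alpha if m else -alpha)
    for w, m in zip(weights, misclassified)
]
total = sum(new_weights)
new_weights = [w / total for w in new_weights]  # renormalize to sum to 1

print(new_weights)  # the misclassified point's weight rises above 0.25
```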

#### Q.6. In a random forest model, let m << p be the number of randomly selected features that are used to identify the best split at any node of a tree. Which of the following are true? (p is the original number of features)

(Multiple options may be correct)

• increasing m reduces the correlation between any two trees in the forest
• decreasing m reduces the correlation between any two trees in the forest
• increasing m increases the performance of individual trees in the forest
• decreasing m increases the performance of individual trees in the forest
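
A quick sketch of why m controls the trade-off (p and the values of m are toy assumptions): if each split draws m of p features uniformly at random, a given feature lands in a subset with probability m/p, so the expected number of features two independently drawn subsets share is m²/p — it grows with m.

```python
# Each split samples m of p features. P(a given feature is in a subset) = m/p,
# so two independent subsets share about m**2 / p features in expectation.

p = 100  # total number of features (toy assumption)

for m in (5, 20, 50):
    expected_shared = m**2 / p
    print(m, expected_shared)  # 5 -> 0.25, 20 -> 4.0, 50 -> 25.0

# Larger m -> more shared candidate features -> more correlated trees,
# but each tree also sees better candidate splits, so individual-tree
# performance tends to improve. Random forests pick m to balance the two.
```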

#### Q.7. Consider the following graphical model, which of the following are false about the model? (multiple options may be correct)

• A is independent of B when C is known
• D is independent of A when C is known
• D is not independent of A when B is known
• D is not independent of A when C is known

#### Q.8. Consider the Bayesian network given in the previous question. Let 'A', 'B', 'C', 'D' and 'E' denote the random variables shown in the network. Which of the following can be inferred from the network structure?

• 'A' causes 'D'
• 'E' causes 'D'
• 'C' causes 'A'
• options (a) and (b) are correct
• none of the above can be inferred

Disclaimer: These answers are provided only for discussion purposes; if any answer turns out to be wrong, please don't blame us. If you have any doubts or suggestions regarding any question, kindly comment. The solutions are provided by Chase2learn. This tutorial is only for discussion and learning purposes.

#### About NPTEL Introduction to Machine Learning Course:

With the increased availability of data from varied sources, there has been increasing attention paid to data-driven disciplines such as analytics and machine learning. In this course, we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.

##### Course Outcome:
• Week 0: Probability Theory, Linear Algebra, Convex Optimization (Recap)
• Week 1: Introduction: Statistical Decision Theory – Regression, Classification, Bias Variance
• Week 2: Linear Regression, Multivariate Regression, Subset Selection, Shrinkage Methods, Principal Component Regression, Partial Least Squares
• Week 3: Linear Classification, Logistic Regression, Linear Discriminant Analysis
• Week 4: Perceptron, Support Vector Machines
• Week 5: Neural Networks – Introduction, Early Models, Perceptron Learning, Backpropagation, Initialization, Training & Validation, Parameter Estimation – MLE, MAP, Bayesian Estimation
• Week 6: Decision Trees, Regression Trees, Stopping Criterion & Pruning, Loss Functions, Categorical Attributes, Multiway Splits, Missing Values, Decision Trees – Instability, Evaluation Measures
• Week 7: Bootstrapping & Cross Validation, Class Evaluation Measures, ROC Curve, MDL, Ensemble Methods – Bagging, Committee Machines and Stacking, Boosting
• Week 8: Gradient Boosting, Random Forests, Multi-class Classification, Naive Bayes, Bayesian Networks
• Week 9: Undirected Graphical Models, HMM, Variable Elimination, Belief Propagation
• Week 10: Partitional Clustering, Hierarchical Clustering, BIRCH Algorithm, CURE Algorithm, Density-based Clustering
• Week 11: Gaussian Mixture Models, Expectation Maximization
• Week 12: Learning Theory, Introduction to Reinforcement Learning, Optional videos (RL framework, TD learning, Solution Methods, Applications)
###### CRITERIA TO GET A CERTIFICATE:

Average assignment score = 25% of average of best 8 assignments out of the total 12 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100

Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF AVERAGE ASSIGNMENT SCORE >=10/25 AND EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.

If you have not registered for the exam, kindly register through https://examform.nptel.ac.in/