In this article, you will find the **Coursera machine learning week 3 assignment answers – Andrew NG**. Use Ctrl+F to find any question or answer. On mobile, tap the three dots in your browser to open the "Find" option and use it to jump to any question's answer.

Try to solve all the assignments by yourself first, but if you get stuck somewhere, feel free to browse the code. Don't just copy-paste the code for the sake of completion; even if you do copy it, make sure you understand it first.

In this exercise, you will implement logistic regression and apply it to two different datasets. Before starting the programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

### Coursera machine learning week 3 assignment answers – Andrew Ng

**plotData.m**

```matlab
function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(x,y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be an Mx2 matrix.

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.

% Separating positive and negative results
pos = find(y == 1);   % indices of positive examples
neg = find(y == 0);   % indices of negative examples

% Create new figure
figure;

% Plot positive examples
%   X-axis: Exam 1 score = X(pos, 1)
%   Y-axis: Exam 2 score = X(pos, 2)
plot(X(pos, 1), X(pos, 2), 'g+');

hold on;   % keep the plotted points on the figure

% Plot negative examples
%   X-axis: Exam 1 score = X(neg, 1)
%   Y-axis: Exam 2 score = X(neg, 2)
plot(X(neg, 1), X(neg, 2), 'ro');
% =========================================================================
hold off;

end
```

**sigmoid.m**

```matlab
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));
% =============================================================

end
```
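If you want to sanity-check your sigmoid outside Octave/MATLAB, the same formula can be sketched in plain Python (this is just a cross-check, not part of the assignment files):

```python
import math

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z)) for a scalar z."""
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))     # exactly 0.5: the decision-boundary value
print(sigmoid(10))    # close to 1 for large positive z
print(sigmoid(-10))   # close to 0 for large negative z
```

A useful property to test against: sigmoid(z) + sigmoid(-z) = 1 for any z.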

**costFunction.m**

```matlab
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y);   % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.
%
% Note: grad should have the same dimensions as theta
%
% DIMENSIONS:
%   theta = (n+1) x 1
%   X     = m x (n+1)
%   y     = m x 1
%   grad  = (n+1) x 1
%   J     = scalar

z   = X * theta;    % m x 1
h_x = sigmoid(z);   % m x 1

J = (1/m) * sum((-y .* log(h_x)) - ((1 - y) .* log(1 - h_x)));   % scalar

grad = (1/m) * (X' * (h_x - y));   % (n+1) x 1
% =============================================================

end
```
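A quick way to convince yourself the cost is right: with theta initialized to all zeros, every hypothesis value is 0.5, so J = log(2) ≈ 0.693 regardless of the data. Here is a hedged pure-Python sketch of the same computation (lists instead of Octave matrices, for checking only, not part of the assignment):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_function(theta, X, y):
    """Unregularized logistic-regression cost J and gradient.

    theta: list of n+1 parameters; X: list of rows (each n+1 long,
    first entry is the intercept 1); y: list of 0/1 labels.
    """
    m = len(y)
    # h_x = sigmoid(X * theta), one value per training example
    h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
    # J = (1/m) * sum(-y*log(h) - (1-y)*log(1-h))
    J = sum(-yi * math.log(hi) - (1 - yi) * math.log(1 - hi)
            for yi, hi in zip(y, h)) / m
    # grad = (1/m) * X' * (h - y)
    grad = [sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X)) / m
            for j in range(len(theta))]
    return J, grad

# theta = 0 makes every h 0.5, so J = log(2) ~ 0.6931
J, grad = cost_function([0.0, 0.0], [[1.0, 2.0], [1.0, -1.0]], [1, 0])
```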

**predict.m**

```matlab
function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1);   % number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's.
%
% DIMENSIONS:
%   X     = m x (n+1)
%   theta = (n+1) x 1

h_x = sigmoid(X * theta);
p = (h_x >= 0.5);

% Equivalent one-liner:
% p = double(sigmoid(X * theta) >= 0.5);
% =========================================================================

end
```
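One shortcut worth knowing when testing predict: sigmoid(z) >= 0.5 exactly when z >= 0, so the 0.5 threshold on the hypothesis is the same as a 0 threshold on X*theta. A minimal pure-Python sketch of that equivalence (for checking only, not an assignment file):

```python
def predict(theta, X):
    """Predict 1 when theta' * x >= 0, i.e. when sigmoid(theta' * x) >= 0.5."""
    return [1 if sum(t * x for t, x in zip(theta, row)) >= 0 else 0
            for row in X]

# With theta = [0, 1], z is just the second feature:
labels = predict([0.0, 1.0], [[1.0, 2.0], [1.0, -3.0]])   # [1, 0]
```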

**costFunctionReg.m**

```matlab
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with
%regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y);   % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.
%
% DIMENSIONS:
%   theta = (n+1) x 1
%   X     = m x (n+1)
%   y     = m x 1
%   grad  = (n+1) x 1
%   J     = scalar

z   = X * theta;    % m x 1
h_x = sigmoid(z);   % m x 1

% Regularization term (theta(1), the bias parameter, is not regularized)
reg_term = (lambda / (2*m)) * sum(theta(2:end).^2);

J = (1/m) * sum((-y .* log(h_x)) - ((1 - y) .* log(1 - h_x))) + reg_term;   % scalar

grad(1)     = (1/m) * (X(:,1)' * (h_x - y));                                   % 1 x 1
grad(2:end) = (1/m) * (X(:,2:end)' * (h_x - y)) + (lambda/m) * theta(2:end);   % n x 1
% =============================================================

end
```
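The one detail people most often get wrong here is that the bias parameter theta(1) must be excluded from the penalty, in both the cost and the gradient. A pure-Python sketch of the regularized version that makes the exclusion explicit (again a cross-check, not part of the Octave assignment files):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient.

    The bias term theta[0] is NOT regularized: it is skipped in both
    the (lam / 2m) * sum(theta^2) penalty and the gradient penalty.
    """
    m = len(y)
    h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
    J = sum(-yi * math.log(hi) - (1 - yi) * math.log(1 - hi)
            for yi, hi in zip(y, h)) / m
    J += (lam / (2 * m)) * sum(t ** 2 for t in theta[1:])   # skip theta[0]
    grad = [sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X)) / m
            for j in range(len(theta))]
    for j in range(1, len(theta)):   # j = 0 gets no penalty term
        grad[j] += (lam / m) * theta[j]
    return J, grad

# Features are zero, so the data cost is log(2); the penalty adds
# (lam / 2m) * theta[1]^2 = (2 / 4) * 1 = 0.5 on top.
J, grad = cost_function_reg([0.0, 1.0], [[1.0, 0.0], [1.0, 0.0]], [1, 0], 2.0)
```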

**Disclaimer:** Hopefully, this article helped you find all the **Coursera machine learning week 3 assignment answers – Andrew NG** and pick up some premium knowledge with less effort.

Finally, to wrap up: feel free to ask doubts in the comment section, and I will try my best to answer them. If you found this helpful, please like, comment, and share the post on social media, and suggest to your friends to join our groups. Don't forget to subscribe. This is the simplest way to encourage me to keep doing such work.

**Is Andrew Ng’s Machine Learning course good?**

It is the best course for supervised machine learning! Andrew Ng, as always, explains such important and difficult concepts of supervised ML with ease and great examples. Just amazing!

**How do I get answers to Coursera assignments?**

Use Ctrl+F to find any question's answer. On mobile, tap the three dots in your browser to open the "Find" option and use it to locate any question's answer.

**How long does it take to finish Coursera Machine Learning?**

This specialization requires approximately **3 months**, with 75 hours of material to complete. I finished it in 3 weeks and spent an additional week reviewing the whole course.

**How do you submit assignments on Coursera Machine Learning?**

To submit a programming assignment:

1. **Open the assignment page** for the assignment you want to submit.
2. Read the assignment instructions and download any starter files.
3. Finish the coding tasks in your local coding environment, checking the starter files and instructions whenever you need to.