function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y);                    % number of training examples
J_history = zeros(num_iters, 1);  % cost recorded at each iteration

for iter = 1:num_iters
    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    h = X * theta;                            % hypothesis: predictions for all m examples
    err = h - y;                              % residuals against the targets
    theta_change = (alpha / m) * (X' * err);  % vectorized gradient step
    theta = theta - theta_change;             % update the parameters

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
end

end
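For context, here is a minimal usage sketch in Octave/MATLAB. It assumes a `computeCost(X, y, theta)` implementation from the same exercise is on the path; the synthetic data, learning rate, and iteration count below are illustrative choices, not part of the original gist.

```matlab
% Usage sketch: fit a line y ~ 2 + 3x to noisy synthetic data.
m = 50;
x = linspace(0, 10, m)';
y = 2 + 3 * x + 0.5 * randn(m, 1);   % noisy line: intercept 2, slope 3

X = [ones(m, 1), x];                 % prepend the intercept column
theta = zeros(2, 1);                 % start from the origin
alpha = 0.02;                        % learning rate (small enough to be stable here)
num_iters = 1500;

[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
fprintf('Learned theta: [%.3f; %.3f]\n', theta(1), theta(2));

plot(1:num_iters, J_history);        % cost should decrease monotonically
xlabel('Iteration'); ylabel('Cost J');
```

If the cost in `J_history` increases or oscillates, the learning rate `alpha` is too large for the data's scale; halving it until the curve decreases monotonically is a simple practical check.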