Sunday, March 11, 2018

Machine Learning (Week 3): Logistic Regression

function g = sigmoid(z)
g = 1 ./ (1 + exp(-z));   % element-wise logistic function
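The same function in pure Python, as a quick sanity check (a sketch, not part of the assignment; the branching guards against `exp` overflow for large negative inputs, which Octave's vectorized form handles silently):

```python
import math

def sigmoid(z):
    """Logistic function for a scalar z, written to avoid overflow."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    # For very negative z, exp(-z) overflows; use the algebraically equal form.
    ez = math.exp(z)
    return ez / (1.0 + ez)

print(sigmoid(0))  # 0.5
```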

function [J, grad] = costFunction(theta, X, y)
m = length(y);                  % number of training examples
A = sigmoid(X*theta);           % hypothesis h_theta(x) for every example
J = (1/m) * sum(-y.*log(A) - (1-y).*log(1-A));   % cross-entropy cost
grad = (1/m) * X' * (A - y);    % gradient of J with respect to theta
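A loop-based Python sketch of the same cost and gradient (toy data is hypothetical; at theta = 0 the hypothesis is 0.5 for every example, so the cost must equal log 2):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_and_grad(theta, X, y):
    """Cross-entropy cost J and its gradient, as in costFunction.

    X is a list of feature rows (intercept column included), y a list of 0/1 labels.
    """
    m = len(y)
    J = 0.0
    grad = [0.0] * len(theta)
    for xi, yi in zip(X, y):
        a = sigmoid(sum(t * x for t, x in zip(theta, xi)))  # hypothesis h(x)
        J += -yi * math.log(a) - (1 - yi) * math.log(1 - a)
        for j, xij in enumerate(xi):
            grad[j] += (a - yi) * xij
    return J / m, [g / m for g in grad]

# Hypothetical two-example data set; first column is the intercept.
X = [[1.0, 2.0], [1.0, -1.0]]
y = [1, 0]
J, grad = cost_and_grad([0.0, 0.0], X, y)
print(round(J, 4))  # 0.6931, i.e. log(2)
```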


function p = predict(theta, X)
p = round(sigmoid(X*theta));   % threshold the probability at 0.5
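Rounding the sigmoid output is the same as checking the sign of the raw score, since sigmoid(z) > 0.5 exactly when z > 0. A quick pure-Python check (note that Python 3's `round` uses banker's rounding at exactly 0.5, unlike Octave, so the boundary z = 0 is left out):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Away from the z = 0 boundary, rounding the probability reproduces
# a sign test on the raw score X*theta.
for z in [-3.0, -0.1, 0.1, 3.0]:
    assert round(sigmoid(z)) == (1 if z > 0 else 0)
print("ok")
```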


%% Load Data
data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);
[m, n] = size(X);

% Add intercept term to X
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);


%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost 

[theta, cost] = ...
 fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
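fminunc is not available outside Octave/MATLAB, but the same cost/gradient pair drives any first-order optimizer. A plain gradient-descent loop in Python (hypothetical toy data, illustrative learning rate 0.1, and the same 400-iteration cap as `MaxIter` above) shows the cost actually falls:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_and_grad(theta, X, y):
    """Same quantities costFunction returns: cost J and its gradient."""
    m = len(y)
    J = 0.0
    grad = [0.0] * len(theta)
    for xi, yi in zip(X, y):
        a = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        J += -yi * math.log(a) - (1 - yi) * math.log(1 - a)
        for j, xij in enumerate(xi):
            grad[j] += (a - yi) * xij
    return J / m, [g / m for g in grad]

# Hypothetical separable data: intercept column plus one feature.
X = [[1.0, 2.0], [1.0, 1.0], [1.0, -1.0], [1.0, -2.0]]
y = [1, 1, 0, 0]

theta = [0.0, 0.0]
J_start, _ = cost_and_grad(theta, X, y)
for _ in range(400):                    # MaxIter = 400, as in the post
    _, grad = cost_and_grad(theta, X, y)
    theta = [t - 0.1 * g for t, g in zip(theta, grad)]  # learning rate 0.1
J_end, _ = cost_and_grad(theta, X, y)
print(J_end < J_start)  # True
```

fminunc converges in far fewer iterations because it uses curvature information; the loop above only illustrates that the `[J, grad]` interface is all an optimizer needs.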


function [J, grad] = costFunctionReg(theta, X, y, lambda)
m = length(y);                 % number of training examples
A = sigmoid(X*theta);

theta1 = theta;
theta1(1) = 0;                 % do not regularize the intercept term
reg = lambda/(2*m) * sum(theta1 .* theta1);
J = (1/m)*sum(-y.*log(A) - (1-y).*log(1-A)) + reg;

reg2 = (lambda/m) * theta;
reg2(1) = 0;                   % intercept gradient gets no penalty either
grad = (1/m) * X' * (A - y) + reg2;
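The regularized version only adds a penalty on every parameter except the intercept. A pure-Python sketch of that logic (toy data is hypothetical; with theta = 0 the penalty vanishes, so the cost falls back to log 2):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_and_grad(theta, X, y):
    """Unregularized cost and gradient, as in costFunction."""
    m = len(y)
    J = 0.0
    grad = [0.0] * len(theta)
    for xi, yi in zip(X, y):
        a = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        J += -yi * math.log(a) - (1 - yi) * math.log(1 - a)
        for j, xij in enumerate(xi):
            grad[j] += (a - yi) * xij
    return J / m, [g / m for g in grad]

def cost_and_grad_reg(theta, X, y, lam):
    """Regularized version: penalize every parameter except the intercept."""
    m = len(y)
    J, grad = cost_and_grad(theta, X, y)
    J += lam / (2 * m) * sum(t * t for t in theta[1:])   # skip theta[0]
    grad = [grad[0]] + [g + lam / m * t for g, t in zip(grad[1:], theta[1:])]
    return J, grad

# With theta = 0 the penalty term is zero, so both costs agree.
X = [[1.0, 2.0], [1.0, -1.0]]
y = [1, 0]
J0, _ = cost_and_grad_reg([0.0, 0.0], X, y, lam=1.0)
print(abs(J0 - math.log(2)) < 1e-12)  # True
```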
