Andrew Ng Machine Learning Course ex2: Logistic Regression
# plotData.m
function plotData(X, y)
% Plot positive examples as blue '+' marks and negative examples as red filled circles.
figure; hold on;
pos = find(y == 1);  % indices of positive (y = 1) examples
neg = find(y == 0);  % indices of negative (y = 0) examples
plot(X(pos, 1), X(pos, 2), 'b+', 'LineWidth', 2, 'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ro', 'LineWidth', 1, 'MarkerSize', 7, 'MarkerFaceColor', 'r');
hold off;
end
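The same two-marker scatter plot can be sketched in Python with matplotlib (a translation of the Octave function above for illustration, not part of the assignment):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt
import numpy as np

def plot_data(X, y):
    """Scatter positives as blue '+' and negatives as red filled circles."""
    fig, ax = plt.subplots()
    pos = (y == 1)
    neg = (y == 0)
    ax.plot(X[pos, 0], X[pos, 1], 'b+', linewidth=2, markersize=7)
    ax.plot(X[neg, 0], X[neg, 1], 'ro', markersize=7, markerfacecolor='r')
    return fig, ax

X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 0.0]])
y = np.array([1, 0, 1])
fig, ax = plot_data(X, y)
```

Boolean masks replace Octave's `find`: indexing with `y == 1` selects the positive rows directly.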
# sigmoid.m
function g = sigmoid(z)
% Compute the sigmoid element-wise; z may be a scalar, vector, or matrix.
g = 1 ./ (1 + exp(-z));
end
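As a quick sanity check, the element-wise sigmoid translates directly to NumPy (an illustrative sketch, not assignment code):

```python
import numpy as np

def sigmoid(z):
    """Element-wise sigmoid; works on scalars and NumPy arrays alike."""
    return 1.0 / (1.0 + np.exp(-z))

# sigmoid(0) is exactly 0.5; large |z| saturates toward 0 or 1.
print(sigmoid(0.0))  # 0.5
print(sigmoid(np.array([-10.0, 0.0, 10.0])))
```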
# costFunction.m
function [J, grad] = costFunction(theta, X, y)
% Compute the logistic-regression cost J and its gradient.
m = length(y);            % number of training examples
h = sigmoid(X * theta);   % hypothesis h_theta(x), an m x 1 vector
% Vectorized cross-entropy cost
J = (-1/m) * (y' * log(h) + (1 - y)' * log(1 - h));
% Gradient of J with respect to theta
grad = (1/m) * X' * (h - y);
end
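A NumPy sketch of the same vectorized cost and gradient (the data here is made up for the check, not from the assignment). With `theta = 0` the hypothesis is 0.5 for every example, so the cost must equal ln 2 ≈ 0.693 regardless of the data:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Unregularized logistic-regression cost and gradient."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

# With theta = 0, h = 0.5 everywhere, so J = ln 2 for any data.
X = np.array([[1.0, 2.0], [1.0, -1.0], [1.0, 0.5]])
y = np.array([1.0, 0.0, 1.0])
J, grad = cost_function(np.zeros(2), X, y)
print(round(J, 4))  # 0.6931
```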
# costFunctionReg.m
function [J, grad] = costFunctionReg(theta, X, y, lambda)
% Regularized logistic-regression cost and gradient.
% The intercept term theta(1) is not regularized.
m = length(y);
h = sigmoid(X * theta);
J = (-1/m) * (y' * log(h) + (1 - y)' * log(1 - h)) ...
    + (lambda / (2*m)) * (theta(2:end)' * theta(2:end));
grad = (1/m) * X' * (h - y);
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
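The regularized version only adds a penalty on `theta(2:end)`. A NumPy sketch (illustrative data, not the assignment's) makes the two key properties easy to verify: setting lambda = 0 recovers the unregularized cost, and the intercept gradient is never penalized:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized cost and gradient; theta[0] (intercept) is not penalized."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    J += lam * (theta[1:] @ theta[1:]) / (2 * m)   # penalty skips theta[0]
    grad = X.T @ (h - y) / m
    grad[1:] += (lam / m) * theta[1:]
    return J, grad

X = np.array([[1.0, 2.0], [1.0, -1.0], [1.0, 0.5]])
y = np.array([1.0, 0.0, 1.0])
theta = np.array([0.0, 2.0])
J1, g1 = cost_function_reg(theta, X, y, 1.0)
J0, g0 = cost_function_reg(theta, X, y, 0.0)
```

With m = 3 and theta[1] = 2, the penalty term is 1 * 2^2 / (2 * 3) = 2/3, so `J1 - J0` should be exactly that.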
# predict.m
function p = predict(theta, X)
% Predict 1 when sigmoid(X * theta) >= 0.5, otherwise 0.
% The comparison is applied element-wise, so no loop is needed.
p = double(sigmoid(X * theta) >= 0.5);
end
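Since sigmoid(z) >= 0.5 exactly when z >= 0, prediction reduces to checking the sign of X * theta. A NumPy sketch with made-up data:

```python
import numpy as np

def predict(theta, X):
    """Return 1 where sigmoid(X @ theta) >= 0.5, else 0.
    sigmoid(z) >= 0.5 iff z >= 0, so the sigmoid itself can be skipped."""
    return (X @ theta >= 0).astype(int)

X = np.array([[1.0, 2.0], [1.0, -3.0]])
theta = np.array([0.5, 1.0])
print(predict(theta, X))  # [1 0]
```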