W2 Programming Assignment: Linear Regression
程序员文章站
2022-06-11 22:31:44
Required
warmUpExercise.m
A = eye(5);
plotData.m
plot(x,y,'rx');
axis([4,24,-5,25]);
ylabel('Profit in $10,000s');
xlabel('Population of City in 10,000s');
computeCost.m
J=(1/(2*m))*sum((X*theta-y).^2);
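The same vectorized cost can be written as a NumPy sketch (variable names here are illustrative, not from the assignment's starter code):

```python
import numpy as np

# NumPy equivalent of computeCost.m: J = (1/(2m)) * sum((X*theta - y).^2)
def compute_cost(X, y, theta):
    m = y.shape[0]
    residual = X @ theta - y
    return np.sum(residual ** 2) / (2 * m)

# Tiny demo: three points on the line y = x, intercept column prepended
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
```

With `theta = [0, 1]` the fit is exact and the cost is 0.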
gradientDescent.m
theta=theta-(alpha/m)*(X'*(X*theta-y));
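A NumPy sketch of the same batch update, run on a made-up toy set (points on y = 2x) to show it converges:

```python
import numpy as np

# NumPy equivalent of gradientDescent.m:
# theta := theta - (alpha/m) * X'(X*theta - y), repeated num_iters times
def gradient_descent(X, y, theta, alpha, num_iters):
    m = y.shape[0]
    J_history = np.zeros(num_iters)
    for i in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        J_history[i] = np.sum((X @ theta - y) ** 2) / (2 * m)
    return theta, J_history

# Toy data (made up): y = 2x exactly
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=1500)
```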
Optional
featureNormalize.m
mu=mean(X);
sigma=std(X);
X_norm=(X-mu)./sigma;
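A NumPy sketch of the same normalization; `ddof=1` is needed to match Octave's `std`, which divides by N-1 (the data below is made up):

```python
import numpy as np

# NumPy equivalent of featureNormalize.m
def feature_normalize(X):
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)  # ddof=1 matches Octave's std (divides by N-1)
    X_norm = (X - mu) / sigma
    return X_norm, mu, sigma

# Made-up examples: [size in sq ft, bedrooms]
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0]])
X_norm, mu, sigma = feature_normalize(X)
```

After normalization each column has mean 0 and (sample) standard deviation 1.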
computeCostMulti.m
J=1/(2*m)*(X*theta-y)'*(X*theta-y);
gradientDescentMulti.m
theta=theta-(alpha/m)*(X'*(X*theta-y));
normalEqn.m
theta=(pinv(X'*X))*X'*y;
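The closed-form solution translates directly; on exactly linear toy data (made up: y = 1 + 2x) it recovers the true parameters:

```python
import numpy as np

# NumPy equivalent of normalEqn.m: theta = pinv(X'X) * X' * y
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # intercept column prepended
y = np.array([3.0, 5.0, 7.0])                       # exactly y = 1 + 2x
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
```

No feature normalization or learning rate is needed here, which is why the Part 3 prediction below uses the raw `[1, 1650, 3]` input.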
Changes to ex1_multi.m
Part 2: Gradient Descent
Added a comparison of different learning rates (alpha):
alpha2 = 0.0001;
alpha3 = 1;
theta2= zeros(3, 1);
theta3= zeros(3, 1);
[theta2, J_history2] = gradientDescentMulti(X, y, theta2, alpha2, num_iters);
[theta3, J_history3] = gradientDescentMulti(X, y, theta3, alpha3, num_iters);
figure;
plot(1:numel(J_history), J_history,'b','LineWidth', 2);
title('Comparison of alpha = 0.01, 0.0001, 1');
hold on;
plot(1:numel(J_history2), J_history2,'r','LineWidth', 2);
plot(1:numel(J_history3), J_history3,'k','LineWidth', 2);
legend('alpha=0.01','alpha=0.0001','alpha=1')
xlabel('Number of iterations');
ylabel('Cost J');
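The behavior the plot shows can be reproduced numerically: a reasonable alpha drives J down quickly, a tiny alpha barely moves it, and a too-large alpha makes J blow up. A sketch on made-up toy data (points on y = 2x; the specific alphas match the comparison above):

```python
import numpy as np

# Batch gradient descent, tracking the cost per iteration
def gradient_descent(X, y, theta, alpha, num_iters):
    m = y.shape[0]
    J = np.zeros(num_iters)
    for i in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        J[i] = np.sum((X @ theta - y) ** 2) / (2 * m)
    return theta, J

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

_, J_good = gradient_descent(X, y, np.zeros(2), alpha=0.1,    num_iters=50)
_, J_slow = gradient_descent(X, y, np.zeros(2), alpha=0.0001, num_iters=50)
_, J_big  = gradient_descent(X, y, np.zeros(2), alpha=1.0,    num_iters=50)
```

On this data alpha = 1 exceeds the stable step size, so `J_big` grows every iteration instead of shrinking.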
Prediction:
predict_x=[1650,3];
predict_x=(predict_x-mu)./sigma;
price = [1,predict_x]*theta; % You should change this
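The key point here is that the new example must be normalized with the training mu and sigma before the intercept term is added. A self-contained NumPy sketch (the training data and the price rule price = 100*size + 10000*bedrooms are made up; only the 1650 sq ft / 3 bedroom query comes from the assignment):

```python
import numpy as np

# Made-up training set: [size in sq ft, bedrooms], price = 100*size + 10000*bedrooms
X_raw = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0], [1416.0, 2.0]])
y = 100.0 * X_raw[:, 0] + 10000.0 * X_raw[:, 1]

# Normalize features, then prepend the intercept column
mu = X_raw.mean(axis=0)
sigma = X_raw.std(axis=0, ddof=1)
X = np.hstack([np.ones((X_raw.shape[0], 1)), (X_raw - mu) / sigma])

# Fit via the normal equation (closed form)
theta = np.linalg.pinv(X.T @ X) @ X.T @ y

# New example goes through the SAME mu/sigma before the intercept is added
predict_x = (np.array([1650.0, 3.0]) - mu) / sigma
price = np.concatenate(([1.0], predict_x)) @ theta
```

Because the toy prices are exactly linear in the features, the prediction recovers 100*1650 + 10000*3 = 195000.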
Part 3: Normal Equations
Prediction:
price = [1,1650,3]*theta; % You should change this
Submitting the assignment
After running submit.m:
Enter the email address you registered with on Coursera.
Enter the submission token generated on the assignment's submission page.
[If you are working from an older download of the assignment, re-download submit.m and lib/submitWithConfiguration.m.]