
[机器学习实验2]Multivariate Linear Regression

程序员文章站 2022-06-11 22:34:02

The second exercise feels basically the same as the first. I'm not sure why the course runs two experiments on this problem; presumably it's to deepen the understanding of linear regression. This exercise adds one more regression variable, so there are two input features: the task is to predict house prices from living area and number of bedrooms.
Since this is basically the same as the previous exercise, I won't repeat the formulas here and will go straight to the code.

function MultivariateLinearRegression()
x = load('ex3x.dat'); % living area and number of bedrooms
y = load('ex3y.dat'); % house price
m = size(x, 1);       % number of training examples
x = [ones(m, 1), x];
% normalize the features so gradient descent converges quickly
sigma = std(x);
mu = mean(x);
x(:,2) = (x(:,2) - mu(2))./ sigma(2);
x(:,3) = (x(:,3) - mu(3))./ sigma(3);

theta = zeros(size(x(1,:)))'; % initialize fitting parameters
alpha = 0.5;%% Your initial learning rate %%
J = zeros(50, 1); 
% run 50 iterations of gradient descent
for num_iterations = 1:50
    J(num_iterations) = 0.5/m*(x*theta-y)'*(x*theta-y);

    theta = theta-alpha/m*x'*(x*theta-y);
end

% now plot J
% technically, the first J starts at the zero-eth iteration
% but Matlab/Octave doesn't have a zero index
figure;
plot(0:49, J(1:50), '-')
xlabel('Number of iterations')
ylabel('Cost J')

theta = zeros(size(x(1,:)))'; % initialize fitting parameters
alpha = 0.01;%% Your initial learning rate %%
J1 = zeros(50, 1); 
% run 50 iterations of gradient descent
for num_iterations = 1:50
    J1(num_iterations) = 0.5/m*(x*theta-y)'*(x*theta-y);

    theta = theta-alpha/m*x'*(x*theta-y);
end

theta = zeros(size(x(1,:)))'; % initialize fitting parameters
alpha = 0.03;%% Your initial learning rate %%
J2= zeros(50, 1); 
% run 50 iterations of gradient descent
for num_iterations = 1:50
    J2(num_iterations) = 0.5/m*(x*theta-y)'*(x*theta-y);

    theta = theta-alpha/m*x'*(x*theta-y);
end

theta = zeros(size(x(1,:)))'; % initialize fitting parameters
alpha = 0.1;%% Your initial learning rate %%
J3= zeros(50, 1); 
% run 50 iterations of gradient descent
for num_iterations = 1:50
    J3(num_iterations) = 0.5/m*(x*theta-y)'*(x*theta-y);

    theta = theta-alpha/m*x'*(x*theta-y);
end
figure;
plot(0:49, J1(1:50), 'b-');
hold on;
plot(0:49, J2(1:50), 'r-');
plot(0:49, J3(1:50), 'k-');
legend('alpha = 0.01', 'alpha = 0.03', 'alpha = 0.1');
xlabel('Number of iterations');
ylabel('Cost J');

theta = zeros(size(x(1,:)))'; % initialize fitting parameters
alpha = 1;%% Your initial learning rate %%

% run 100 iterations with the chosen learning rate
for num_iterations = 1:100

    theta = theta-alpha/m*x'*(x*theta-y);
end
format long
theta

predic_X = [1,(1650- mu(2))/ sigma(2),(3 - mu(3))/ sigma(3)];
predic_Y = predic_X*theta;
predic_Y
end
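For anyone following along without MATLAB/Octave, the same batch gradient descent with feature normalization can be sketched in NumPy. The data below is a synthetic stand-in, since ex3x.dat and ex3y.dat ship with the course materials:

```python
import numpy as np

# Synthetic stand-in for ex3x.dat / ex3y.dat: 47 houses with a roughly
# linear price in living area and number of bedrooms, plus noise.
rng = np.random.default_rng(0)
area = rng.uniform(1000, 3000, size=47)
bedrooms = rng.integers(1, 5, size=47).astype(float)
y = 100 * area + 5000 * bedrooms + rng.normal(0, 5000, size=47)

# Design matrix with an intercept column, then normalize the two features
X = np.column_stack([np.ones(47), area, bedrooms])
mu, sigma = X.mean(axis=0), X.std(axis=0, ddof=1)
X[:, 1:] = (X[:, 1:] - mu[1:]) / sigma[1:]

m = len(y)
theta = np.zeros(3)
alpha = 0.1
J = []
for _ in range(50):
    r = X @ theta - y                # residuals h(x) - y
    J.append(0.5 / m * (r @ r))      # cost J(theta) = (1/2m) * sum of squares
    theta -= alpha / m * (X.T @ r)   # batch gradient descent update
```

With the features normalized and a suitable alpha, the recorded cost J decreases at every iteration, which is the behaviour the plots above are meant to show.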

The final answers, θ and the predicted price, are printed when the script runs.
Finally, the exercise also solves the linear regression with the Normal Equations method, which is essentially least squares:
θ = (XᵀX)⁻¹Xᵀy
The derivation of this formula is also easy: if J(θ) = 0 then h(x) = y on every example, i.e. Xθ = y. This is an overdetermined system (X is not square, so it has no inverse), so in general no exact solution exists and we can only seek the best-fit solution, which is obtained through the generalized (pseudo-)inverse. Understanding this properly requires the definitions of the four fundamental subspaces of a matrix; I think MIT's linear algebra open course explains them quite clearly, and the formula can also be derived directly from the subspace definitions.
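The pseudoinverse route can be sketched in NumPy as well. Here the synthetic target is an exact linear function of the features, so the normal-equation solution recovers the true parameters, and no feature normalization is needed:

```python
import numpy as np

# Synthetic data: price is exactly linear in [1, area, bedrooms],
# so theta = (X'X)^(-1) X' y should recover true_theta.
rng = np.random.default_rng(1)
area = rng.uniform(1000, 3000, size=47)
bedrooms = rng.integers(1, 5, size=47).astype(float)
X = np.column_stack([np.ones(47), area, bedrooms])
true_theta = np.array([50000.0, 100.0, 5000.0])
y = X @ true_theta

# pinv computes the Moore-Penrose pseudoinverse of the non-square X,
# giving the least-squares solution of the overdetermined system X theta = y
theta = np.linalg.pinv(X) @ y

# predict the price of a 1650 sq ft, 3-bedroom house
price = np.array([1.0, 1650.0, 3.0]) @ theta
```

In practice `np.linalg.lstsq(X, y, rcond=None)` does the same job and is numerically preferable to forming (XᵀX)⁻¹ explicitly.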
