Linear Regression: Week 2 Programming Assignment
1 Init
In the file warmUpExercise.m, you will find the outline of an Octave/MATLAB function. Modify it to return a 5x5 identity matrix by filling in the following code:
A = eye(5);
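For completeness, a minimal sketch of the whole file; the function skeleton is assumed to match the starter code:

function A = warmUpExercise()
% WARMUPEXERCISE returns a 5x5 identity matrix
A = eye(5);   % eye(n) builds an n-by-n identity matrix
end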
2.1 Plotting the Data
Before starting on any task, it is often useful to understand the data by visualizing it. For this dataset, you can use a scatter plot to visualize the data, since it has only two properties to plot (profit and population). (Many other problems that you will encounter in real life are multi-dimensional and can't be plotted on a 2-d plot.)
In ex1.m, the dataset is loaded from the data file into the variables X and y:
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y);
Next, the script calls the plotData function to create a scatter plot of the data. Your job is to complete plotData.m to draw the plot; modify the file and fill in the following code:
plot(x, y, 'rx', 'MarkerSize', 10);
ylabel('Profit in $10,000s');
xlabel('Population of City in 10,000s');
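Putting it together, a minimal sketch of the completed plotData.m; the function signature and the figure call are assumed to match the starter file:

function plotData(x, y)
% PLOTDATA plots the data points x and y into a new figure
figure;                                   % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10);       % data points as red crosses
ylabel('Profit in $10,000s');             % y-axis label
xlabel('Population of City in 10,000s');  % x-axis label
end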
2.2.3 Computing the cost J(θ)
Your next task is to complete the code in the file computeCost.m, which is a function that computes J(θ). As you are doing this, remember that the variables X and y are not scalar values, but matrices whose rows represent the examples from the training set.
(The implementation computes the sum of squared errors.)
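For reference, the quantity being computed is the squared-error cost over the m training examples:

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2, \qquad h_\theta(x) = \theta^T x$$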
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
% You should set J to the cost.
predictions = X * theta;
sqrErrors = (predictions - y) .^ 2;
J = 1 / (2 * m) * sum(sqrErrors);
2.2.4 Gradient descent
Next, you will implement gradient descent in the file gradientDescent.m. The loop structure has been written for you, and you only need to supply the updates to θ within each iteration.
θ0 and θ1 are updated simultaneously using matrix operations, so the summation in the gradient-descent update must be evaluated in one step; you can either use a transposed matrix multiplication, or use .* to form the element-wise products and then add them up with sum().
For all j (here j = 0, 1):

$$\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$$

In vector form:

$$\theta := \theta - \alpha \delta, \qquad \delta = \begin{bmatrix} \delta_0 \\ \delta_1 \end{bmatrix}, \qquad \delta_0 = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_0^{(i)}$$
for iter = 1:num_iters
% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCost) and gradient here.
%
S = (1 / m) * (X' * (X * theta - y));
theta = theta - alpha .* S;
%============================================================
% Save the cost J in every iteration
J_history(iter) = computeCost(X, y, theta);
end
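For context, a minimal sketch of how the driver script calls this function; the setup and the hyperparameter values (alpha = 0.01, 1500 iterations) follow ex1.m and should be treated as assumptions here:

X = [ones(m, 1), data(:, 1)];  % add a column of ones for the intercept term
theta = zeros(2, 1);           % initial parameters
iterations = 1500;             % number of gradient steps (assumed from ex1.m)
alpha = 0.01;                  % learning rate (assumed from ex1.m)
theta = gradientDescent(X, y, theta, alpha, iterations);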
Optional Exercises
3.1 Feature Normalization
Your task here is to complete the code in featureNormalize.m to:
- Subtract the mean value of each feature from the dataset.
- After subtracting the mean, additionally scale (divide) the feature values by their respective "standard deviations."
% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));
% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
% of the feature and subtract it from the dataset,
% storing the mean value in mu. Next, compute the
% standard deviation of each feature and divide
% each feature by its standard deviation, storing
% the standard deviation in sigma.
%
% Note that X is a matrix where each column is a
% feature and each row is an example. You need
% to perform the normalization separately for
% each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%
for iter = 1:size(X, 2)
mu(1, iter) = mean(X(:, iter));
sigma(1, iter) = std(X(:, iter));
X_norm(:, iter) = (X_norm(:, iter) - mu(1, iter)) ./ sigma(1, iter);
end
% ============================================================
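An equivalent vectorized sketch, assuming an Octave/MATLAB version with implicit broadcasting (on older versions bsxfun would be needed):

mu = mean(X);               % 1 x n row vector of per-feature means
sigma = std(X);             % 1 x n row vector of per-feature standard deviations
X_norm = (X - mu) ./ sigma; % broadcasts the subtraction and division over all m rows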
3.2 Gradient Descent
You should complete the code in computeCostMulti.m and gradientDescentMulti.m to implement the cost function and gradient descent for linear regression with multiple variables. If your code in the previous part (single
variable) already supports multiple variables, you can use it here too.
(The single-variable implementation can be reused here: the code above consists of vectorized matrix operations, which work unchanged with multiple variables.)
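For example, computeCostMulti.m can be written fully vectorized, without sum(); a sketch assuming the same signature as computeCost.m:

function J = computeCostMulti(X, y, theta)
m = length(y);                      % number of training examples
errors = X * theta - y;             % m x 1 vector of prediction errors
J = (errors' * errors) / (2 * m);   % squared-error cost
end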
3.3 Normal Equations
Complete the code in normalEqn.m to use the normal-equation formula below to calculate θ. Remember that while you don't need to scale your features, we still need to add a column of 1's to the X matrix to have an intercept term (θ0). The code in ex1.m will add the column of 1's to X for you.
Computing θ with the normal equation:

$$\theta = (X^T X)^{-1} X^T y$$
theta = zeros(size(X, 2), 1);
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
% to linear regression and put the result in theta.
%
% ---------------------- Sample Solution ----------------------
theta = pinv(X' * X) * X' * y;
% ------------------------------------------
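A usage sketch: once theta has been computed, predicting for a new example is a single inner product. The feature values below are only illustrative, and no normalization is applied because the normal equation is run on the raw features:

theta = normalEqn(X, y);
x_new = [1, 1650, 3];   % hypothetical example: intercept, 1650 sq-ft, 3 bedrooms
price = x_new * theta;  % predicted price

pinv is used rather than inv so the solution is still defined even when X' * X is singular (e.g. with redundant features).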