Regression Models QUIZ 2
This is Quiz 2 from the Regression Models course, with the answers worked out in R.
Q1
Consider the following data with x as the predictor and y as the outcome.
x <- c(0.61, 0.93, 0.83, 0.35, 0.54, 0.16, 0.91, 0.62, 0.62)
y <- c(0.67, 0.84, 0.6, 0.18, 0.85, 0.47, 1.1, 0.65, 0.36)
Give a P-value for the two-sided hypothesis test of whether β1 from a linear regression model is 0 or not.
>summary(lm(y ~ x))$coef
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 0.1885 0.2061 0.9143 0.39098
## x 0.7224 0.3107 2.3255 0.05296
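The P-value for the slope is about 0.053. As a cross-check, here is a minimal sketch of that P-value computed by hand from the least-squares formulas and the t distribution with n − 2 degrees of freedom, reusing the x and y vectors above (the variable names are my own):
n <- length(x)
beta1 <- cor(x, y) * sd(y) / sd(x)                  # slope estimate
beta0 <- mean(y) - beta1 * mean(x)                  # intercept estimate
e <- y - (beta0 + beta1 * x)                        # residuals
sigma <- sqrt(sum(e^2) / (n - 2))                   # residual standard deviation
se1 <- sigma / sqrt(sum((x - mean(x))^2))           # standard error of the slope
tstat <- beta1 / se1
2 * pt(abs(tstat), df = n - 2, lower.tail = FALSE)  # should match 0.05296 above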
Q2
Consider the previous problem, give the estimate of the residual standard deviation.
>summary(lm(y ~ x))$sigma
## [1] 0.223
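Equivalently, as a sketch, the residual standard deviation can be computed directly from the residuals, dividing the residual sum of squares by n − 2 degrees of freedom:
fit <- lm(y ~ x)
n <- length(y)
sqrt(sum(resid(fit)^2) / (n - 2))   # matches summary(fit)$sigma, about 0.223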
Q3
In the mtcars data set, fit a linear regression model of weight (predictor) on mpg (outcome). Get a 95% confidence interval for the expected mpg at the average weight. What is the lower endpoint?
data(mtcars)
fit <- lm(mpg ~ I(wt - mean(wt)), data = mtcars)
confint(fit)
## 2.5 % 97.5 %
## (Intercept) 18.991 21.190
## I(wt - mean(wt)) -6.486 -4.203
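Centering wt makes the intercept the expected mpg at the average weight, so the intercept's confidence interval answers the question (lower endpoint about 18.99). An equivalent sketch, without centering, asks predict() for a confidence interval at wt = mean(wt); the object name fit_wt is mine:
fit_wt <- lm(mpg ~ wt, data = mtcars)
predict(fit_wt, newdata = data.frame(wt = mean(mtcars$wt)),
        interval = "confidence")    # same interval as the centered intercept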
Q4
Refer to the previous question. Read the help file for mtcars. What is the weight coefficient interpreted as?
- The estimated expected change in mpg per 1 lb increase in weight.
- The estimated 1,000 lb change in weight per 1 mpg increase.
- The estimated expected change in mpg per 1,000 lb increase in weight. (Correct)
- It can't be interpreted without further information.
This is the standard interpretation of a regression coefficient: the expected change in the response per unit change in the predictor. The mtcars help file records wt in units of 1,000 lbs, so the slope is the expected change in mpg per 1,000 lb increase in weight.
Q5
Consider again the mtcars data set and a linear regression model with mpg as predicted by weight (1,000 lbs). A new car is coming weighing 3000 pounds. Construct a 95% prediction interval for its mpg. What is the upper endpoint?
>fit <- lm(mpg ~ wt, data = mtcars)
>predict(fit, newdata = data.frame(wt = 3), interval = "prediction")
## fit lwr upr
## 1 21.25 14.93 27.57
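The upper endpoint is about 27.57. For intuition, here is a sketch of the same prediction interval built by hand; the extra 1 inside the square root is what distinguishes a prediction interval for a single new car from a confidence interval for the mean:
fit <- lm(mpg ~ wt, data = mtcars)
x0 <- 3                                   # new car's weight, in 1,000 lbs
n <- nrow(mtcars)
s <- summary(fit)$sigma
xbar <- mean(mtcars$wt)
sxx <- sum((mtcars$wt - xbar)^2)
yhat <- predict(fit, newdata = data.frame(wt = x0))
se_pred <- s * sqrt(1 + 1/n + (x0 - xbar)^2 / sxx)
yhat + c(-1, 1) * qt(0.975, df = n - 2) * se_pred   # about 14.93 to 27.57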
Q6
Consider again the mtcars data set and a linear regression model with mpg as predicted by weight (in 1,000 lbs). A "short" ton is defined as 2,000 lbs. Construct a 95% confidence interval for the expected change in mpg per 1 short ton increase in weight. Give the lower endpoint.
>fit <- lm(mpg ~ wt, data = mtcars)
>confint(fit)[2, ] * 2
## 2.5 % 97.5 %
## -12.973 -8.405
Or equivalently, change the units so that one unit of the regressor is one short ton:
>fit <- lm(mpg ~ I(wt * 0.5), data = mtcars)
>confint(fit)[2, ]
## 2.5 % 97.5 %
## -12.973 -8.405
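A third route, sketched here, builds the interval by hand from the coefficient table and then rescales it, since one short ton is 2 units of wt:
fit <- lm(mpg ~ wt, data = mtcars)
est <- summary(fit)$coef[2, "Estimate"]
se  <- summary(fit)$coef[2, "Std. Error"]
2 * (est + c(-1, 1) * qt(0.975, df = fit$df.residual) * se)  # about -12.97 to -8.41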
Q7
If my X from a linear regression is measured in centimeters and I convert it to meters, what would happen to the slope coefficient?
- It would get divided by 10
- It would get multiplied by 10
- It would get divided by 100
- It would get multiplied by 100. (Correct) One meter is 100 centimeters, so converting X from centimeters to meters divides the regressor by 100 and multiplies the slope, the change in Y per unit change in X, by 100; see the sketch below.
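A quick numerical check of this rescaling rule on simulated data (the data and coefficients here are hypothetical, purely for illustration):
set.seed(1)
x_cm <- rnorm(100, mean = 170, sd = 10)   # hypothetical predictor in centimeters
y_sim <- 2 + 0.05 * x_cm + rnorm(100)
coef(lm(y_sim ~ x_cm))[2]                 # slope: change in y per cm
coef(lm(y_sim ~ I(x_cm / 100)))[2]        # slope per m, exactly 100 times larger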
Q8
I have an outcome, Y, and a predictor, X, and fit the linear regression model Y = β0 + β1 X + ε to obtain estimates β̂0 and β̂1. What would be the consequence for the subsequent slope and intercept if I were to refit the model with a new regressor, X + c, for some constant c?
- The new intercept would be β̂0 + c β̂1
- The new intercept would be β̂0 − c β̂1 (Correct)
This is exactly covered in the notes. Note that if Y = β0 + β1 X + ε, then Y = (β0 − c β1) + β1 (X + c) + ε, so the intercept is reduced by c β̂1 while the slope is unchanged; see the sketch below.
- The new slope would be c β̂1
- The new slope would be β̂1 + c
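A quick numerical check of this shift rule on simulated data (the data and the constant c = 10 are hypothetical):
set.seed(2)
xs <- rnorm(50)
ys <- 1 + 2 * xs + rnorm(50)
cc <- 10
fit_x <- lm(ys ~ xs)
fit_shift <- lm(ys ~ I(xs + cc))
coef(fit_x)                              # original intercept and slope
coef(fit_shift)                          # same slope, shifted intercept
coef(fit_x)[1] - cc * coef(fit_x)[2]     # reproduces the new intercept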
Q9
Refer back to the mtcars data set with mpg as the outcome and weight (wt) as the predictor. About what is the ratio of the sum of the squared errors, ∑(Yᵢ − Ŷᵢ)², when comparing a model with just an intercept (denominator) to the model with the intercept and slope (numerator)?
>fit1 <- lm(mpg ~ wt, data = mtcars)
>fit2 <- lm(mpg ~ 1, data = mtcars)
>1 - summary(fit1)$r.squared
## [1] 0.2472
>sse1 <- sum((predict(fit1) - mtcars$mpg)^2)
>sse2 <- sum((predict(fit2) - mtcars$mpg)^2)
>sse1/sse2
## [1] 0.2472
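The ratio is about 0.25. As a sketch, the same residual sums of squares can also be read off with deviance(), or compared side by side with anova():
deviance(fit1) / deviance(fit2)   # residual sums of squares, again about 0.2472
anova(fit2, fit1)                 # RSS column lists both sums of squared errors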