A Simple Linear Regression Algorithm Implemented in Python: A Worked Example
This article walks through a simple linear regression algorithm implemented in Python, shared for reference; the details are as follows.
The goal is to reproduce in Python the one-variable regression produced by R's linear model function lm(), using R's built-in women example dataset. The R output is:
```r
> summary(fit)

Call:
lm(formula = weight ~ height, data = women)

Residuals:
    Min      1Q  Median      3Q     Max
-1.7333 -1.1333 -0.3833  0.7417  3.1167

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -87.51667    5.93694  -14.74 1.71e-09 ***
height        3.45000    0.09114   37.85 1.09e-14 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.525 on 13 degrees of freedom
Multiple R-squared:  0.991, Adjusted R-squared:  0.9903
F-statistic:  1433 on 1 and 13 DF,  p-value: 1.091e-14
```
The Python implementation covers:
- computing the Pearson correlation coefficient
- computing the regression coefficients by least squares
- computing the coefficient of determination R² (goodness of fit)
- computing the standard error of the estimate
- computing the F statistic and p-value for the significance test
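For reference, the quantities in the list above follow the standard textbook formulas for a one-predictor regression with n observations:

```latex
% Least-squares estimates of slope and intercept:
\hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
\qquad
\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}

% Goodness of fit, standard error of the estimate, and F statistic
% (p-value taken from the F distribution with 1 and n-2 degrees of freedom):
R^2 = \frac{\sum_i (\hat{y}_i - \bar{y})^2}{\sum_i (y_i - \bar{y})^2},
\qquad
s_e = \sqrt{\frac{\sum_i (y_i - \hat{y}_i)^2}{\,n - 2\,}},
\qquad
F = \frac{\mathrm{MSR}}{\mathrm{MSE}}
  = \frac{\sum_i (\hat{y}_i - \bar{y})^2 \,/\, 1}{\sum_i (y_i - \hat{y}_i)^2 \,/\, (n - 2)}
```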
The complete implementation:

```python
import numpy as np
import scipy.stats as ss


class lm:
    """Simple one-variable linear model: computes the regression coefficients,
    the coefficient of determination (goodness of fit), the standard error of
    the estimate, and the significance test."""

    def __init__(self, data_source, separator):
        self.beta = np.matrix(np.zeros(2))
        self.yhat = np.matrix(np.zeros(2))
        self.r2 = 0.0
        self.se = 0.0
        self.f = 0.0
        self.msr = 0.0
        self.mse = 0.0
        self.p = 0.0
        data_mat = np.genfromtxt(data_source, delimiter=separator)
        self.xarr = data_mat[:, :-1]
        self.yarr = data_mat[:, -1]
        self.ybar = np.mean(self.yarr)
        self.dfd = len(self.yarr) - 2  # degrees of freedom: n - 2
        return

    # Sample covariance
    @staticmethod
    def cov_custom(x, y):
        result = sum((x - np.mean(x)) * (y - np.mean(y))) / (len(x) - 1)
        return result

    # Pearson correlation coefficient
    @staticmethod
    def corr_custom(x, y):
        return lm.cov_custom(x, y) / (np.std(x, ddof=1) * np.std(y, ddof=1))

    # Regression coefficients by least squares
    def simple_regression(self):
        xmat = np.mat(self.xarr)
        ymat = np.mat(self.yarr).T
        xtx = xmat.T * xmat
        if np.linalg.det(xtx) == 0.0:
            print('cannot solve: the matrix X^T X is singular')
            return
        self.beta = np.linalg.solve(xtx, xmat.T * ymat)  # equivalent to xtx.I * (xmat.T * ymat)
        self.yhat = (xmat * self.beta).flatten().A[0]
        return

    # Coefficient of determination R^2, i.e. the square of the correlation coefficient
    def r_square(self):
        y = np.mat(self.yarr)
        ybar = np.mean(y)
        self.r2 = np.sum((self.yhat - ybar) ** 2) / np.sum((y.A - ybar) ** 2)
        return

    # Standard error of the estimate
    def estimate_deviation(self):
        y = np.array(self.yarr)
        self.se = np.sqrt(np.sum((y - self.yhat) ** 2) / self.dfd)
        return

    # F test of significance
    def sig_test(self):
        ybar = np.mean(self.yarr)
        self.msr = np.sum((self.yhat - ybar) ** 2)
        self.mse = np.sum((self.yarr - self.yhat) ** 2) / self.dfd
        self.f = self.msr / self.mse
        self.p = ss.f.sf(self.f, 1, self.dfd)
        return

    def summary(self):
        self.simple_regression()
        corr_coe = lm.corr_custom(self.xarr[:, -1], self.yarr)
        self.r_square()
        self.estimate_deviation()
        self.sig_test()
        print("the pearson's correlation coefficient: %.3f" % corr_coe)
        print('the regression coefficient: %s' % self.beta.flatten().A[0])
        print('r square: %.3f' % self.r2)
        print('the standard error of estimate: %.3f' % self.se)
        print('f-statistic: %d on %s and %s df, p-value: %.3e'
              % (self.f, 1, self.dfd, self.p))
```
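A minimal usage sketch of the class above. The constructor reads a delimited text file whose last column is the response (weight) and whose remaining columns form the design matrix, so the file is assumed here to contain a constant column of ones, then height, then weight; the file name women.csv is a hypothetical placeholder.

```python
# Hypothetical usage: 'women.csv' and its column layout (1, height, weight)
# are assumptions for illustration, not part of the original article.
model = lm('women.csv', ',')
model.summary()  # prints correlation, coefficients, R^2, SE, F and p-value
```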
Output of the Python implementation:

```
the regression coefficient: [-87.51666667 3.45 ]
r square: 0.991
the standard error of estimate: 1.525
f-statistic: 1433 on 1 and 13 df, p-value: 1.091e-14
```
When computing the regression coefficients, forming the normal equations with the matrix transpose (ata below, i.e. a.T * a) and then solving them with NumPy's built-in linear-equation solver is the fastest option, compared with inverting a matrix directly:

```python
a = np.mat(women.xarr); b = np.mat(women.yarr).T
ata = a.T * a

timeit (a.I * b)
99.9 µs ± 941 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)

timeit ata.I * (a.T * b)
64.9 µs ± 717 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)

timeit np.linalg.solve(ata, a.T * b)
15.1 µs ± 126 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
```
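To check that the three approaches timed above really give the same coefficients, here is a small self-contained sketch on synthetic data (the variable names and the generated data are illustrative assumptions, not part of the original article):

```python
import numpy as np

# Synthetic one-predictor data with an intercept column (illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=(15, 1))
a = np.asmatrix(np.hstack([np.ones((15, 1)), x]))   # design matrix [1, x]
b = np.asmatrix(2.0 + 3.0 * x + rng.normal(scale=0.1, size=(15, 1)))

ata = a.T * a
beta_pinv  = a.I * b                            # pseudo-inverse of the design matrix
beta_inv   = ata.I * (a.T * b)                  # explicit inverse of A'A
beta_solve = np.linalg.solve(ata, a.T * b)      # linear solver (fastest in the timings above)

# All three agree to floating-point precision.
print(np.allclose(beta_pinv, beta_solve), np.allclose(beta_inv, beta_solve))
```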
I hope this article is helpful to readers programming in Python.