Linear regression is the most basic kind of prediction. First we randomly generate a pile of roughly linearly correlated data, then use the least-squares method to compute the slope and intercept. That gives us the linear model, and we can use it to predict new data.
import numpy as np
import matplotlib.pyplot as plt
# dataset size; more data makes a and b more accurate
dataset_size = 100000
# noise amplitude; less noise makes a and b more accurate
amplitude = 200
# target linear function to recover: y = 2x + 3
x = np.random.randint(0, 1000, dataset_size)
y = 2 * x + 3 + np.random.normal(size=dataset_size) * amplitude
print(x[:10])
print(y[:10])
plt.scatter(x[:50],y[:50])
plt.axis([0,3000,0,3000])
plt.show()
Output:
[865 603 390 894 593 818 18 304 402 708]
[ 1528.27202848 955.29457768 854.84928664 1621.34212968 1309.90961998
1617.55872572 215.46963559 502.5849728 606.06870139 1862.69632193]
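The two comments above make testable claims: more data and less noise should both bring the fitted slope closer to the true value 2. A quick experiment of my own (not from the post) to check this, using NumPy's built-in `np.polyfit` as the line fitter and a fixed seed for reproducibility:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit(dataset_size, amplitude):
    # Same data-generating process as above: y = 2x + 3 plus Gaussian noise.
    x = rng.integers(0, 1000, dataset_size)
    y = 2 * x + 3 + rng.normal(size=dataset_size) * amplitude
    # np.polyfit with degree 1 does a least-squares line fit: returns (slope, intercept).
    a, b = np.polyfit(x, y, 1)
    return a, b

a_small, _ = fit(100, 200)      # few points, heavy noise
a_big, _   = fit(100000, 200)   # many points, same noise
a_clean, _ = fit(100000, 10)    # many points, light noise

# Errors of the estimated slope against the true slope 2:
print(abs(a_small - 2), abs(a_big - 2), abs(a_clean - 2))
```

With 100000 points the slope error shrinks to the order of 0.001, and with light noise it shrinks further still.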
Now the least-squares implementation; for the full derivation, a web search will turn it up.
# Solve for the linear-regression parameters with least squares
x_mean = np.mean(x)
y_mean = np.mean(y)
m1 = 0  # numerator
m2 = 0  # denominator
for x_i, y_i in zip(x, y):
    m1 += (x_i - x_mean) * (y_i - y_mean)
    m2 += (x_i - x_mean) ** 2
a = m1 / m2
b = y_mean - a * x_mean
print(a,b)
2.00067632091 3.07374278432
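The explicit Python loop above can be replaced by vectorized NumPy operations computing exactly the same two sums, which is much faster on 100000 points. A minimal sketch (toy data added here just to make the snippet self-contained):

```python
import numpy as np

# Toy data lying exactly on y = 2x + 3, so the fit should recover a=2, b=3.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 * x + 3

x_mean, y_mean = x.mean(), y.mean()
# Same least-squares formula as the loop, but as whole-array operations:
a = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
b = y_mean - a * x_mean
print(a, b)  # → 2.0 3.0
```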
y_line = a*x + b
print(a,b)
plt.scatter(x[:50],y[:50])
plt.plot(x, y_line, color='r')
plt.axis([0,4000,0,4000])
plt.show()
2.00961672756 3.47528458815  (from a second run; the random data changes each run, so a and b differ slightly from above)
And that's linear regression done.
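Once a and b are known, predicting is just evaluating the line at new x values. A minimal sketch, with a and b hardcoded at the true parameters for illustration (in practice you would use the fitted values printed above):

```python
import numpy as np

a, b = 2.0, 3.0  # illustrative values; substitute the fitted a, b

def predict(x_new):
    # Evaluate the fitted line y = a*x + b at one or more new points.
    return a * np.asarray(x_new) + b

print(predict([10, 100, 500]))  # predictions: 23, 203, 1003
```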