Regression Algorithms - Linear Regression Analysis - Overfitting, Underfitting, and Ridge Regression

1. Underfitting and Overfitting

Generalization in machine learning describes how well the concepts a model has learned carry over to samples it never saw during training. When we discuss how well a model learns and generalizes, we usually use two terms: overfitting and underfitting. Training and evaluation use two separate data sets, the training set and the test set. When fitting the training data, the model tries to accommodate every point, including noisy ones. If a model learns the details and noise of the training data too thoroughly, it becomes overly complex, fits the training set too closely, and performs poorly on new data; this is overfitting. Conversely, if the model captures only part of the structure in the data, it is too simple and performs poorly on both the training data and new predictions; this is underfitting.

An underfitting example:

After training, the machine knows that swans have wings and that a swan's beak is long, and it naively concludes that anything with these features is a swan. Because it has learned too few swan features, its criterion is too coarse to identify swans accurately.

An overfitting example:

The machine learns swan features from a set of pictures. After training, it knows that swans have wings, that a swan's beak is long and curved, that a swan's neck is long and slightly bent, and that a swan's body is shaped like a "2" and is a bit larger than a duck's. At this point the machine can basically tell swans apart from other animals. Unfortunately, every swan picture it saw happened to show a white swan, so after training it also believes swan feathers must be white, and when it later sees a black swan it decides that is not a swan.

Overfitting: a hypothesis fits the training data better than other hypotheses but fails to fit data outside the training set well. (The model is too complex.)

Underfitting: a hypothesis cannot fit the training data well and also fails to fit data outside the training set. (The model is too simple.)

                                      

Training a linear model to accommodate every point can turn it into a complex model, as the sketch below illustrates:
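A minimal sketch (my own illustration on synthetic data, not from the original post): fitting the same noisy curve with polynomial models of increasing degree. Degree 1 underfits (high error on both sets), while degree 15 typically overfits (low train error paired with a higher test error):

# Minimal under/overfitting sketch on synthetic data (illustrative only).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
x = rng.uniform(0, 1, 60).reshape(-1, 1)
y = np.cos(1.5 * np.pi * x).ravel() + rng.normal(0, 0.1, 60)  # noisy target

x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.25, random_state=0)

for degree in (1, 4, 15):
    # Expanding the features to degree-d polynomials makes the linear model more complex.
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(x_train, y_train)
    print("degree=%2d  train MSE=%.4f  test MSE=%.4f" % (
        degree,
        mean_squared_error(y_train, model.predict(x_train)),
        mean_squared_error(y_test, model.predict(x_test))))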

2. Methods for Addressing Overfitting

In linear regression, a feature set that is too small tends to cause underfitting, while a feature set that is too large tends to cause overfitting. There are established remedies for both cases:

    Underfitting

        Cause: the model learned too few features from the data

        Remedy: increase the number of data features

    Overfitting

        Cause: too many raw features, some of them noisy; the model becomes overly complex because it tries to accommodate every data point

        Remedies:

  • Feature selection: remove strongly correlated features (hard to do well)
  • Cross-validation: every sample gets a turn in both training and validation (see the sketch after this list)
  • Regularization
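As a minimal sketch of the cross-validation remedy (my own illustration using scikit-learn's cross_val_score; it is not part of the original post): 5-fold CV holds out each sample exactly once, so a model that merely memorizes its training folds is exposed by a poor average score:

# Minimal cross-validation sketch (illustrative; load_boston matches the
# sklearn version used later in this post).
from sklearn.datasets import load_boston
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

lb = load_boston()
# 5-fold CV: each fold is held out once while the other four train the model.
scores = cross_val_score(LinearRegression(), lb.data, lb.target,
                         cv=5, scoring="neg_mean_squared_error")
print("per-fold MSE:", -scores)
print("mean MSE:", -scores.mean())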

3. L2 Regularization

Effect: drives every element of the weight vector W to be small, close to 0.

Advantage: smaller parameters mean a simpler model, and a simpler model is less prone to overfitting.
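To see why the penalty shrinks W, here is a minimal sketch (my own illustration on synthetic data, not from the original post) of the ridge closed-form solution w = (XᵀX + αI)⁻¹Xᵀy: a larger alpha adds more weight to the diagonal and pulls every coefficient toward zero:

# Minimal sketch of L2 shrinkage via the ridge closed-form solution
# (illustrative; synthetic data, no intercept term).
import numpy as np

rng = np.random.RandomState(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([3.0, -2.0, 0.5]) + rng.normal(0, 0.5, size=50)

for alpha in (0.0, 1.0, 100.0):
    # w = (X^T X + alpha * I)^(-1) X^T y; alpha=0 recovers ordinary least squares.
    w = np.linalg.solve(X.T @ X + alpha * np.eye(3), X.T @ y)
    print("alpha=%6.1f  w=%s  ||w||=%.3f" % (alpha, np.round(w, 3), np.linalg.norm(w)))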

4. Ridge (Ridge Regression)

  1. The Ridge class implements the ridge regression model. Its prototype is:

class sklearn.linear_model.Ridge(alpha=1.0)
    alpha: controls the strength of regularization applied to the model.
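In practice alpha is usually tuned rather than fixed; a minimal sketch (using scikit-learn's RidgeCV class, which the original code does not use) of picking it by cross-validation:

# Minimal alpha-selection sketch (illustrative, not part of the original code):
# RidgeCV fits a ridge model for each candidate alpha and keeps the one with
# the best cross-validated score.
from sklearn.datasets import load_boston
from sklearn.linear_model import RidgeCV

lb = load_boston()
rcv = RidgeCV(alphas=(0.1, 1.0, 10.0, 100.0))
rcv.fit(lb.data, lb.target)
print("best alpha:", rcv.alpha_)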

               

Applying ridge regression to the earlier Boston housing price prediction:

Python implementation:

# -*- coding: UTF-8 -*-
'''
@Author :Jason
'''
from sklearn.datasets import load_boston
from sklearn.linear_model import LinearRegression, SGDRegressor, Ridge
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error

def mylinear():
    '''
    Boston housing price prediction
    :return:
    '''
    # Load the data
    lb = load_boston()

    # Split the data into a training set and a test set
    x_train, x_test, y_train, y_test = train_test_split(lb.data, lb.target, test_size=0.25)

    # Feature engineering: standardization
    std_x = StandardScaler()
    x_train = std_x.fit_transform(x_train)
    x_test = std_x.transform(x_test)
    # Target values
    std_y = StandardScaler()
    y_train = std_y.fit_transform(y_train.reshape(-1, 1))
    y_test = std_y.transform(y_test.reshape(-1, 1))

    # Predict with the normal-equation solution
    lr = LinearRegression()
    lr.fit(x_train, y_train)
    print(lr.coef_)
    # Predicted prices
    y_lr_predict = std_y.inverse_transform(lr.predict(x_test))
    print("Predicted house prices using the normal equation:", y_lr_predict)
    print("Mean squared error of the normal equation:", mean_squared_error(std_y.inverse_transform(y_test), y_lr_predict))

    # Predict with gradient descent
    sgd = SGDRegressor()
    sgd.fit(x_train, y_train.ravel())  # fit expects a 1-D target array
    print(sgd.coef_)  # coefficients learned by SGD
    y_sgd_predict = std_y.inverse_transform(sgd.predict(x_test).reshape(-1, 1))

    print("Predicted house prices using gradient descent:", y_sgd_predict)
    print("Mean squared error of gradient descent:", mean_squared_error(std_y.inverse_transform(y_test), y_sgd_predict))

    # Predict with ridge regression
    rd = Ridge(alpha=1.0)
    rd.fit(x_train, y_train)
    print(rd.coef_)
    y_rd_predict = std_y.inverse_transform(rd.predict(x_test))

    print("Predicted house prices using ridge regression:", y_rd_predict)
    print("Mean squared error of ridge regression:", mean_squared_error(std_y.inverse_transform(y_test), y_rd_predict))

if __name__ == "__main__":
    mylinear()

Result:

[[-0.11463352  0.10198367  0.02231006  0.08456446 -0.1723333   0.31915769
   0.00298214 -0.31259924  0.27053024 -0.20229054 -0.22157962  0.09269368
  -0.41564546]]
Predicted house prices using the normal equation: [[13.37072047]
 [26.93738739]
 [28.19545157]
 [ 8.70985175]
 [16.72440905]
 [19.05843656]
 [14.26135991]
 [17.76652476]
 [35.18913369]
 [32.64936447]
 [27.5591566 ]
 [21.87827703]
 [25.00285921]
 [11.33717032]
 [22.19304173]
 [13.35795774]
 [24.06968308]
 [14.77941008]
 [19.57419242]
 [20.79322448]
 [24.46285962]
 [16.41224517]
 [31.59418299]
 [ 6.74611377]
 [19.01910372]
 [20.68475639]
 [20.12488826]
 [26.95872177]
 [36.23779434]
 [42.84455696]
 [14.94080084]
 [26.5445357 ]
 [28.52082584]
 [16.60404243]
 [23.4395676 ]
 [16.01863148]
 [18.03827065]
 [28.44751371]
 [20.71141696]
 [21.07119183]
 [24.64976302]
 [22.95224326]
 [14.51149149]
 [36.3300707 ]
 [19.33399057]
 [23.01857785]
 [21.27129575]
 [33.1274282 ]
 [32.60004603]
 [22.41354855]
 [20.96954387]
 [20.38899346]
 [18.39376511]
 [37.54577261]
 [34.96525154]
 [19.490239  ]
 [20.1435841 ]
 [22.59160076]
 [21.63470285]
 [13.89106994]
 [32.56365165]
 [23.24058142]
 [17.66492703]
 [24.50002664]
 [20.90663738]
 [23.32535887]
 [15.76483067]
 [25.44275365]
 [20.5441673 ]
 [31.6507649 ]
 [21.76157438]
 [ 6.09481724]
 [30.35529485]
 [ 7.63580748]
 [19.28338744]
 [23.84773445]
 [39.59621583]
 [21.42616329]
 [38.12163981]
 [21.51715207]
 [38.74571085]
 [16.5677445 ]
 [20.08657628]
 [32.3811511 ]
 [28.9536634 ]
 [13.86990902]
 [21.62537714]
 [29.15098862]
 [30.99847303]
 [19.37855993]
 [23.72260761]
 [19.90875463]
 [25.05175076]
 [30.87557208]
 [20.86130209]
 [ 8.24164039]
 [18.45063426]
 [22.39911798]
 [22.66322542]
 [18.29588163]
 [30.82505315]
 [10.07486629]
 [26.9426297 ]
 [17.09467262]
 [21.46555245]
 [22.65319887]
 [19.79651159]
 [13.07544428]
 [16.33661015]
 [ 7.2732533 ]
 [30.07644346]
 [19.28137638]
 [17.17377999]
 [33.85832317]
 [ 6.48514523]
 [18.93675523]
 [22.11815775]
 [12.07434388]
 [21.45163242]
 [36.20798721]
 [16.97858736]
 [14.73369484]
 [18.4359758 ]
 [14.56133073]
 [39.78965182]
 [23.92298055]
 [30.35945986]]
Mean squared error of the normal equation: 23.706588866665594
[[-0.11463352  0.10198367  0.02231006  0.08456446 -0.1723333   0.31915769
   0.00298214 -0.31259924  0.27053024 -0.20229054 -0.22157962  0.09269368
  -0.41564546]]
Predicted house prices using gradient descent: [13.9675434 26.25395517 28.82785138 11.34263995 19.26770236 19.34921704
 14.86438515 18.25076644 34.82306897 31.70422491 26.58583227 20.82171453
 24.90505059 10.68786332 21.95951366 14.13626638 26.36916421 16.99842214
 19.1582397  21.61437285 23.83356184 15.42570578 31.54189136  7.2133892
 18.97314698 21.18110511 20.96145856 28.66355969 36.07228837 40.40821745
 16.43764925 25.46671631 28.0378286  18.77263261 23.76765121 16.35723136
 18.61891127 28.20591292 18.7148766  21.20050035 24.51922052 22.332078
 17.15936663 34.92201288 19.7466164  22.76903901 21.85343597 31.5839049
 31.75005387 25.59089977 21.13431009 21.01061578 18.67103962 37.16702628
 33.09951354 19.7058571  19.82079479 22.5169791  24.06339415 14.66701301
 31.12677295 22.09256881 18.09550166 24.57011575 20.83991004 22.73646668
 17.68606482 25.23566533 20.50835635 30.92357728 20.87142974  5.96217968
 30.82924562  6.76415866 18.97372961 24.40999227 38.78483334 21.15129923
 37.51614735 21.71962147 38.58869574 17.23211894 20.88687759 32.4824928
 28.72741693 14.70249025 22.34185217 28.32322741 31.15034863 19.8137511
 23.28816382 20.44721455 24.7367006  30.02339054 20.38853641  8.66186931
 18.43006622 22.89150979 20.56618397 17.63121468 30.89378931 10.53705615
 25.7857825  16.79039751 21.02986393 20.47642314 20.36143414 13.73161921
 17.61441538  8.43848624 30.6001853  19.84180816 17.08200907 32.36027075
  7.59039982 19.79261747 21.4614797  12.89316816 22.99702567 35.71645671
 17.68714071 14.72835352 18.76093134 14.32522529 40.99299445 23.87878656
 29.76536528]
Mean squared error of gradient descent: 23.706588866665594
[[-0.11362487  0.09998654  0.01888465  0.08518396 -0.16902613  0.319941
   0.00234247 -0.30921649  0.26045584 -0.1925429  -0.22055136  0.09263432
  -0.41393404]]
Predicted house prices using ridge regression: [[13.37415761]
 [26.90504784]
 [28.21563998]
 [ 8.86419105]
 [16.86943034]
 [19.04862675]
 [14.29283925]
 [17.76151021]
 [35.1575154 ]
 [32.61764177]
 [27.50665061]
 [21.8434022 ]
 [24.99991704]
 [11.306863  ]
 [22.1802905 ]
 [13.39641171]
 [24.14199075]
 [14.89035585]
 [19.54318659]
 [20.82369319]
 [24.44051208]
 [16.36772372]
 [31.61270652]
 [ 6.75438103]
 [18.99650994]
 [20.68026055]
 [20.15671857]
 [26.9938016 ]
 [36.22804286]
 [42.74244097]
 [15.05427772]
 [26.50909895]
 [28.49514409]
 [16.71213297]
 [23.47915839]
 [16.03802518]
 [18.03643845]
 [28.45703619]
 [20.59538792]
 [21.05345456]
 [24.65311693]
 [22.91495741]
 [14.66000564]
 [36.28741567]
 [19.32824165]
 [23.02337164]
 [21.29647535]
 [33.0561231 ]
 [32.55469415]
 [22.57458257]
 [20.95378829]
 [20.38618478]
 [18.40871609]
 [37.51218936]
 [34.88916543]
 [19.51717166]
 [20.11309191]
 [22.59152474]
 [21.71350625]
 [13.89935179]
 [32.50744279]
 [23.18534185]
 [17.68366164]
 [24.50252582]
 [20.88398045]
 [23.30830851]
 [15.82013862]
 [25.45925461]
 [20.54611204]
 [31.64406665]
 [21.71465182]
 [ 6.08680787]
 [30.41068754]
 [ 7.6031304 ]
 [19.26933773]
 [23.83941718]
 [39.55156596]
 [21.40980019]
 [38.07670467]
 [21.54367529]
 [38.71553347]
 [16.56933165]
 [20.11740868]
 [32.39646812]
 [28.93972416]
 [13.87801213]
 [21.63966626]
 [29.10083299]
 [31.01310786]
 [19.39937608]
 [23.71030779]
 [19.93632761]
 [25.04560084]
 [30.83681385]
 [20.8404729 ]
 [ 8.24766238]
 [18.43188882]
 [22.40756311]
 [22.54313681]
 [18.25810087]
 [30.8338003 ]
 [10.07691145]
 [26.9028733 ]
 [17.06676763]
 [21.46393621]
 [22.53018004]
 [19.81817305]
 [13.1091368 ]
 [16.41082874]
 [ 7.30474261]
 [30.11268239]
 [19.31693979]
 [17.18487629]
 [33.81121555]
 [ 6.51419411]
 [18.96562504]
 [22.10120331]
 [12.0857933 ]
 [21.50510699]
 [36.17705284]
 [17.02770956]
 [14.71783414]
 [18.42723503]
 [14.53813717]
 [39.80049069]
 [23.94009425]
 [30.32535077]]
Mean squared error of ridge regression: 23.706588866665594

 
