sklearn Modules Explained (1)

4. .svm

from sklearn import svm

4.1 Classification

  1. svm.SVC(C=1.0, kernel='rbf', degree=3, gamma='auto_deprecated', coef0=0.0, shrinking=True, probability=False, tol=0.001, cache_size=200, class_weight=None, verbose=False, max_iter=-1, decision_function_shape='ovr', random_state=None)

C: penalty parameter of the error term; a smaller C means stronger regularization.
kernel: the kernel function: 'linear', 'poly', 'rbf', or 'sigmoid'; a custom (callable) kernel can also be passed.
degree: degree of the polynomial, used when kernel='poly'.
gamma: the kernel coefficient γ, used when the kernel is non-linear ('rbf', 'poly', 'sigmoid').
coef0: the independent term r in the kernel, used when kernel is 'poly' or 'sigmoid'.
shrinking: whether to use the shrinking heuristic.
probability: whether to enable probability estimates (must be set before fit; slows down training).
tol: tolerance for the stopping criterion.
cache_size: size of the kernel cache, in MB.
class_weight: per-class weights, useful for imbalanced datasets.
verbose: whether to enable verbose output.
max_iter: hard limit on the number of iterations; -1 means no limit.
decision_function_shape: 'ovo' means one-vs-one, 'ovr' means one-vs-rest.
random_state: random seed.
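The parameters above can be seen in action with a minimal sketch; the synthetic dataset and the specific parameter values here are illustrative only, and gamma='scale' is used in place of the deprecated 'auto' default shown in the signature:

```python
from sklearn import svm
from sklearn.datasets import make_classification

# Synthetic binary-classification data (illustrative only)
X, y = make_classification(n_samples=100, n_features=4, random_state=0)

# RBF-kernel SVC; probability=True enables predict_proba at extra training cost
clf = svm.SVC(C=1.0, kernel='rbf', gamma='scale', probability=True, random_state=0)
clf.fit(X, y)

print(clf.predict(X[:5]))        # predicted class labels
print(clf.predict_proba(X[:5]))  # per-class probability estimates
```

Note that decision_function_shape only affects the shape of the decision values in the multiclass case; predictions are unchanged.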


  2. svm.NuSVC(nu=0.5, kernel='rbf', degree=3, gamma='auto_deprecated', coef0=0.0, shrinking=True, probability=False, tol=0.001, cache_size=200, class_weight=None, verbose=False, max_iter=-1, decision_function_shape='ovr', random_state=None)

nu: replaces C; an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors.

  3. svm.LinearSVC(penalty='l2', loss='squared_hinge', dual=True, tol=0.0001, C=1.0, multi_class='ovr', fit_intercept=True, intercept_scaling=1, class_weight=None, verbose=0, random_state=None, max_iter=1000)
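LinearSVC is implemented with liblinear, supports only a linear kernel, and scales better to large datasets than SVC(kernel='linear'). A minimal sketch with illustrative data and parameter values:

```python
from sklearn.svm import LinearSVC
from sklearn.datasets import make_classification

# Synthetic binary-classification data (illustrative only)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Squared hinge loss and ovr multiclass are the defaults shown in the signature;
# max_iter is raised here to avoid convergence warnings on small data
clf = LinearSVC(C=1.0, loss='squared_hinge', max_iter=10000, random_state=0)
clf.fit(X, y)

print(clf.coef_.shape)       # one weight vector per (binary) problem
print(clf.intercept_.shape)
```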

4.2 Regression

  1. svm.SVR(kernel='rbf', degree=3, gamma='auto_deprecated', coef0=0.0, tol=0.001, C=1.0, epsilon=0.1, shrinking=True, cache_size=200, verbose=False, max_iter=-1)

  2. svm.NuSVR(nu=0.5, C=1.0, kernel='rbf', degree=3, gamma='auto_deprecated', coef0=0.0, shrinking=True, tol=0.001, cache_size=200, verbose=False, max_iter=-1)

  3. svm.LinearSVR(epsilon=0.0, tol=0.0001, C=1.0, loss='epsilon_insensitive', fit_intercept=True, intercept_scaling=1.0, dual=True, verbose=0, random_state=None, max_iter=1000)
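A minimal regression sketch; the sine-curve data is illustrative only. The epsilon parameter defines a tube around the target inside which errors are not penalized:

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative 1-D regression data: y = sin(x)
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 5, 80)).reshape(-1, 1)
y = np.sin(X).ravel()

# Errors smaller than epsilon are ignored by the epsilon-insensitive loss
reg = SVR(kernel='rbf', C=1.0, epsilon=0.1, gamma='scale')
reg.fit(X, y)
pred = reg.predict(X)
print(pred[:5])
```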

Attributes:
support_vectors_: the support vectors themselves.
support_: indices of the support vectors in the training set.
n_support_: number of support vectors for each class.
.decision_function: method returning the signed distance of each sample to the separating hyperplane.
dual_coef_: dual coefficients of the support vectors in the decision function: y_i \alpha_i for classification, \alpha_i - \alpha_i^* for regression.
intercept_: constants in the decision function.
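The fitted attributes above can be inspected directly; a sketch on illustrative synthetic data:

```python
from sklearn import svm
from sklearn.datasets import make_classification

# Synthetic binary-classification data (illustrative only)
X, y = make_classification(n_samples=60, n_features=4, random_state=0)
clf = svm.SVC(kernel='rbf', gamma='scale').fit(X, y)

print(clf.support_vectors_.shape)    # (n_SV, n_features)
print(clf.support_)                  # indices of the support vectors in X
print(clf.n_support_)                # support vectors per class
print(clf.dual_coef_.shape)          # (n_classes - 1, n_SV): y_i * alpha_i
print(clf.intercept_)                # constant(s) in the decision function
print(clf.decision_function(X[:3]))  # signed distances to the hyperplane
```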



5. .neighbors

  1. .NearestNeighbors(n_neighbors=5, radius=1.0, algorithm='auto', leaf_size=30, metric='minkowski', p=2, metric_params=None, n_jobs=None, **kwargs)

n_neighbors: the number of neighbors, i.e. how many of the nearest samples are used.
radius: the radius used by radius-based neighbor queries.
algorithm: the algorithm: 'auto', 'ball_tree', 'kd_tree', or 'brute'.
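A minimal sketch of both query modes on illustrative data; kneighbors returns the k closest points, radius_neighbors returns all points within the given radius:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Four illustrative 2-D points; the last one is a far-away outlier
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [5.0, 5.0]])

nn = NearestNeighbors(n_neighbors=2, algorithm='auto', metric='minkowski', p=2)
nn.fit(X)

# The 2 nearest training points to the query point
dist, idx = nn.kneighbors([[0.1, 0.1]])
print(idx)

# All training points within radius 1.5 of the query point
dist_r, idx_r = nn.radius_neighbors([[0.1, 0.1]], radius=1.5)
print(idx_r)
```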


  2. .KDTree

  3. .BallTree
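KDTree and BallTree share the same query API; KDTree is typically faster in low dimensions, while BallTree handles higher dimensions and more general metrics better. A minimal sketch with illustrative random data:

```python
import numpy as np
from sklearn.neighbors import KDTree

# 50 illustrative points in 3-D
rng = np.random.RandomState(0)
X = rng.uniform(size=(50, 3))

tree = KDTree(X, leaf_size=30)

# 3 nearest neighbors of the first point; since it is in the tree,
# its own index comes back first with distance 0
dist, idx = tree.query(X[:1], k=3)
print(idx)
```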

Methods:


