Scipy.Optimize

Contents

1. Scipy.Optimize.Minimize

Example 1: a math problem, finding the minimum of a nonlinear function under given constraints

Example 2: finding parameters in machine learning


 

1. Scipy.Optimize.Minimize

Source:   https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html#scipy.optimize.minimize

 

scipy.optimize.minimize(fun, x0, args=(), method=None, jac=None, hess=None, hessp=None, bounds=None, constraints=(), tol=None, callback=None, options=None)

fun : callable

The objective function to be minimized.

fun(x, *args) -> float

where x is a 1-D array with shape (n,) and args is a tuple of the fixed parameters needed to completely specify the function.

# x is a 1-D array; args is the tuple of fixed parameters the objective function needs, supplied via the args parameter below.

x0 : ndarray, shape (n,)

Initial guess. Array of real elements of size (n,), where ‘n’ is the number of independent variables.

# Initial guess for the independent variables.

args : tuple, optional

Extra arguments passed to the objective function and its derivatives (fun, jac and hess functions).

# Extra arguments passed to the objective function and its derivatives.

method : str or callable, optional

Type of solver, e.g. 'SLSQP' or 'Newton-CG'.

# The optimization algorithm to use.
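As a quick illustration of how these parameters fit together, here is a minimal sketch of my own (a toy objective, not from the scipy docs): fun receives x plus whatever is in args, x0 is the starting point, and method selects the solver.

import numpy as np
from scipy.optimize import minimize

# Toy objective: squared distance from x to a fixed center c,
# with c passed in through args to match the fun(x, *args) signature.
def squared_distance(x, c):
    return np.sum((x - c) ** 2)

center = np.array([1.0, 2.0, 3.0])
x0 = np.zeros(3)                     # initial guess, shape (n,)

res = minimize(squared_distance, x0, args=(center,), method='BFGS')
print(res.x)                         # converges to [1. 2. 3.]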

Example 1: a math problem, finding the minimum of a nonlinear function under given constraints

Source:http://apmonitor.com/che263/index.php/Main/PythonOptimization

min $x_1 x_4 (x_1 + x_2 + x_3) + x_3$

s.t.

$x_1 x_2 x_3 x_4 \geq 25$

$x_1^2 + x_2^2 + x_3^2 + x_4^2 = 40$

$1 \leq x_1, x_2, x_3, x_4 \leq 5$

$x_0 = (1, 5, 5, 1)$

Scipy.Optimize.Minimize is demonstrated for solving a nonlinear objective function subject to general inequality and equality constraints.

from scipy.optimize import minimize

# Objective: x1*x4*(x1 + x2 + x3) + x3
def objective(x):
    x1, x2, x3, x4 = x
    return x1 * x4 * (x1 + x2 + x3) + x3

# Inequality constraint: x1*x2*x3*x4 >= 25 (SLSQP expects a value >= 0)
def constraint1(x):
    return x[0] * x[1] * x[2] * x[3] - 25.0

# Equality constraint: x1^2 + x2^2 + x3^2 + x4^2 == 40 (must return 0)
def constraint2(x):
    sum_sq = 40
    for i in range(4):
        sum_sq = sum_sq - x[i] ** 2
    return sum_sq

x0 = [1, 5, 5, 1]           # initial guess
print(objective(x0))        # 16

b = (1.0, 5.0)              # each variable is bounded within [1, 5]
bnds = (b, b, b, b)
cons1 = {'type': 'ineq', 'fun': constraint1}
cons2 = {'type': 'eq', 'fun': constraint2}
cons = [cons1, cons2]
sol = minimize(objective, x0, method='SLSQP', bounds=bnds, constraints=cons)
print(sol)
print(sol)
     fun: 17.01401724563517
     jac: array([14.57227015,  1.37940764,  2.37940764,  9.56415057])
 message: 'Optimization terminated successfully.'
    nfev: 30
     nit: 5
    njev: 5
  status: 0
 success: True
       x: array([1.        , 4.7429961 , 3.82115462, 1.37940765])
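As a sanity check (my addition, not part of the original tutorial), the solution can be substituted back into the constraint functions: the inequality residual should be non-negative and the equality residual close to zero.

print(constraint1(sol.x))   # ~0.0, so x1*x2*x3*x4 >= 25 is active at the optimum
print(constraint2(sol.x))   # ~0.0, so the sum of squares equals 40
print(sol.fun)              # 17.014..., the minimized objective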

Example 2: finding parameters in machine learning

In Andrew Ng's CS229 machine learning course, assignment 1 (linear_regression) finds the parameters with gradient descent, where you set the learning rate and the number of iterations yourself. In assignment 2 (logistic_regression), we use scipy.optimize.minimize to find the parameters instead.

For full details, please refer to the assignment; only the key code is shown here.

Source : https://study.163.com/course/courseMain.htm?courseId=1004570029&_trace_c_p_k2_=00b4467624554d9fae0c292a58af0a45

import numpy as np
import scipy.optimize as opt

theta = np.zeros(3)   # initial guess for the three parameters

# Standard logistic (sigmoid) function used by cost and gradient
def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Cross-entropy cost for logistic regression
def cost(theta, X, y):
    return np.mean(-y * np.log(sigmoid(X @ theta)) - (1 - y) * np.log(1 - sigmoid(X @ theta)))

# Analytic gradient of the cost, supplied to minimize via jac
def gradient(theta, X, y):
    return (1 / len(X)) * X.T @ (sigmoid(X @ theta) - y)

# X and y are the design matrix and labels prepared in the assignment
res = opt.minimize(fun=cost, x0=theta, args=(X, y), method='Newton-CG', jac=gradient)
print(res)
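res.x holds the fitted theta. As a sketch of what one might do next (a hypothetical predict helper of my own, not code from the assignment, assuming X already includes the intercept column):

def predict(theta, X):
    # Predict class 1 when the modeled probability is at least 0.5
    return (sigmoid(X @ theta) >= 0.5).astype(int)

y_pred = predict(res.x, X)
print('train accuracy:', np.mean(y_pred == y))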


 
