Minimizing a function with gradient descent in PyTorch using torch.optim.SGD and optimizer.step()

Use the torch.optim.SGD class to minimize the function

f(x) = -(cos^2(x[0]) + cos^2(x[1]))^2

Its minimum value is -4, attained at x = (0, 0), where cos^2(x[0]) + cos^2(x[1]) = 2.

from math import pi
import torch
import torch.optim

x = torch.tensor([pi/3, pi/6], requires_grad=True)  # initial point
optimizer = torch.optim.SGD([x], lr=0.1, momentum=0)
for step in range(11):
	if step:  # skip the update on step 0 so the initial point is printed first
		optimizer.zero_grad()  # clear any gradient accumulated in x.grad
		f.backward()           # backpropagate to compute df/dx
		optimizer.step()       # one SGD update: x <- x - lr * x.grad
	f = -((x.cos()**2).sum())**2  # evaluate f at the current x
	print('step {}: x={},f(x)={}'.format(step, x.tolist(), f.item()))

Output:

step 0: x=[1.0471975803375244, 0.5235987901687622],f(x)=-1.0
step 1: x=[0.8739925026893616, 0.35039371252059937],f(x)=-1.674528956413269
step 2: x=[0.6192374229431152, 0.1835097223520279],f(x)=-2.6563119888305664
step 3: x=[0.3111077845096588, 0.06654246151447296],f(x)=-3.617122173309326
step 4: x=[0.08941137790679932, 0.016069628298282623],f(x)=-3.9671425819396973
step 5: x=[0.01855570822954178, 0.0032690390944480896],f(x)=-3.99858021736145
step 6: x=[0.0037171822041273117, 0.0006542906630784273],f(x)=-3.999943256378174
step 7: x=[0.0007434850558638573, 0.00013086199760437012],f(x)=-3.999997615814209
step 8: x=[0.00014869740698486567, 2.617243444547057e-05],f(x)=-4.0
step 9: x=[2.973947994178161e-05, 5.234485797700472e-06],f(x)=-4.0
step 10: x=[5.947895260760561e-06, 1.0468970685906243e-06],f(x)=-4.0
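For reference, here is a minimal sketch of the same loop with the update written out by hand instead of through an optimizer object. With momentum=0, optimizer.step() reduces to the plain update x <- x - lr * x.grad, so the sketch below should trace the same trajectory; it is only an illustration of that update rule, not of how torch.optim.SGD is implemented internally.

from math import pi
import torch

x = torch.tensor([pi/3, pi/6], requires_grad=True)  # same initial point
lr = 0.1
for step in range(11):
	if step:
		f.backward()               # compute df/dx into x.grad
		with torch.no_grad():      # update x without recording it in the graph
			x -= lr * x.grad       # vanilla SGD update, same as optimizer.step() with momentum=0
		x.grad.zero_()             # clear the gradient, like optimizer.zero_grad()
	f = -((x.cos()**2).sum())**2   # evaluate f at the current x
	print('step {}: x={},f(x)={}'.format(step, x.tolist(), f.item()))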