Comparing TensorFlow 2.0 Automatic Differentiation with Hand-Derived Gradients
1. The Model
- This article compares the gradients produced by TensorFlow's automatic differentiation against the same gradients derived by hand.
2. Deriving the Gradients
3. Experimental Analysis
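The same kind of sanity check can be sketched without TensorFlow: derive a gradient by hand and compare it against a central finite difference. The quadratic form `f`, `grad_manual`, and `grad_numeric` below are illustrative names standing in for the SVM loss used later in this article.

```python
import numpy as np

# Hand-derived gradient of f(x) = x.T @ A @ x is (A + A.T) @ x.
# We verify it numerically, mirroring the check this article
# performs with tf.GradientTape on a more complicated loss.
def f(x, A):
    return float(x @ A @ x)

def grad_manual(x, A):
    return (A + A.T) @ x

def grad_numeric(x, A, eps=1e-6):
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        # central difference along coordinate i
        g[i] = (f(x + d, A) - f(x - d, A)) / (2 * eps)
    return g

A = np.array([[2.0, 1.0], [0.0, 3.0]])
x = np.array([1.0, -2.0])
print(grad_manual(x, A))   # [ 2. -11.]
print(np.max(np.abs(grad_manual(x, A) - grad_numeric(x, A))))
```

If the analytic derivation is correct, the two gradients agree up to finite-difference error.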
3.1. The Iris Dataset (Iris.txt)
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5.0,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
4.6,3.4,1.4,0.3,Iris-setosa
5.0,3.4,1.5,0.2,Iris-setosa
4.4,2.9,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.4,3.7,1.5,0.2,Iris-setosa
4.8,3.4,1.6,0.2,Iris-setosa
4.8,3.0,1.4,0.1,Iris-setosa
4.3,3.0,1.1,0.1,Iris-setosa
5.8,4.0,1.2,0.2,Iris-setosa
5.7,4.4,1.5,0.4,Iris-setosa
5.4,3.9,1.3,0.4,Iris-setosa
5.1,3.5,1.4,0.3,Iris-setosa
5.7,3.8,1.7,0.3,Iris-setosa
5.1,3.8,1.5,0.3,Iris-setosa
5.4,3.4,1.7,0.2,Iris-setosa
5.1,3.7,1.5,0.4,Iris-setosa
4.6,3.6,1.0,0.2,Iris-setosa
5.1,3.3,1.7,0.5,Iris-setosa
4.8,3.4,1.9,0.2,Iris-setosa
5.0,3.0,1.6,0.2,Iris-setosa
5.0,3.4,1.6,0.4,Iris-setosa
5.2,3.5,1.5,0.2,Iris-setosa
5.2,3.4,1.4,0.2,Iris-setosa
4.7,3.2,1.6,0.2,Iris-setosa
4.8,3.1,1.6,0.2,Iris-setosa
5.4,3.4,1.5,0.4,Iris-setosa
5.2,4.1,1.5,0.1,Iris-setosa
5.5,4.2,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.0,3.2,1.2,0.2,Iris-setosa
5.5,3.5,1.3,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
4.4,3.0,1.3,0.2,Iris-setosa
5.1,3.4,1.5,0.2,Iris-setosa
5.0,3.5,1.3,0.3,Iris-setosa
4.5,2.3,1.3,0.3,Iris-setosa
4.4,3.2,1.3,0.2,Iris-setosa
5.0,3.5,1.6,0.6,Iris-setosa
5.1,3.8,1.9,0.4,Iris-setosa
4.8,3.0,1.4,0.3,Iris-setosa
5.1,3.8,1.6,0.2,Iris-setosa
4.6,3.2,1.4,0.2,Iris-setosa
5.3,3.7,1.5,0.2,Iris-setosa
5.0,3.3,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
6.9,3.1,4.9,1.5,Iris-versicolor
5.5,2.3,4.0,1.3,Iris-versicolor
6.5,2.8,4.6,1.5,Iris-versicolor
5.7,2.8,4.5,1.3,Iris-versicolor
6.3,3.3,4.7,1.6,Iris-versicolor
4.9,2.4,3.3,1.0,Iris-versicolor
6.6,2.9,4.6,1.3,Iris-versicolor
5.2,2.7,3.9,1.4,Iris-versicolor
5.0,2.0,3.5,1.0,Iris-versicolor
5.9,3.0,4.2,1.5,Iris-versicolor
6.0,2.2,4.0,1.0,Iris-versicolor
6.1,2.9,4.7,1.4,Iris-versicolor
5.6,2.9,3.6,1.3,Iris-versicolor
6.7,3.1,4.4,1.4,Iris-versicolor
5.6,3.0,4.5,1.5,Iris-versicolor
5.8,2.7,4.1,1.0,Iris-versicolor
6.2,2.2,4.5,1.5,Iris-versicolor
5.6,2.5,3.9,1.1,Iris-versicolor
5.9,3.2,4.8,1.8,Iris-versicolor
6.1,2.8,4.0,1.3,Iris-versicolor
6.3,2.5,4.9,1.5,Iris-versicolor
6.1,2.8,4.7,1.2,Iris-versicolor
6.4,2.9,4.3,1.3,Iris-versicolor
6.6,3.0,4.4,1.4,Iris-versicolor
6.8,2.8,4.8,1.4,Iris-versicolor
6.7,3.0,5.0,1.7,Iris-versicolor
6.0,2.9,4.5,1.5,Iris-versicolor
5.7,2.6,3.5,1.0,Iris-versicolor
5.5,2.4,3.8,1.1,Iris-versicolor
5.5,2.4,3.7,1.0,Iris-versicolor
5.8,2.7,3.9,1.2,Iris-versicolor
6.0,2.7,5.1,1.6,Iris-versicolor
5.4,3.0,4.5,1.5,Iris-versicolor
6.0,3.4,4.5,1.6,Iris-versicolor
6.7,3.1,4.7,1.5,Iris-versicolor
6.3,2.3,4.4,1.3,Iris-versicolor
5.6,3.0,4.1,1.3,Iris-versicolor
5.5,2.5,4.0,1.3,Iris-versicolor
5.5,2.6,4.4,1.2,Iris-versicolor
6.1,3.0,4.6,1.4,Iris-versicolor
5.8,2.6,4.0,1.2,Iris-versicolor
5.0,2.3,3.3,1.0,Iris-versicolor
5.6,2.7,4.2,1.3,Iris-versicolor
5.7,3.0,4.2,1.2,Iris-versicolor
5.7,2.9,4.2,1.3,Iris-versicolor
6.2,2.9,4.3,1.3,Iris-versicolor
5.1,2.5,3.0,1.1,Iris-versicolor
5.7,2.8,4.1,1.3,Iris-versicolor
6.3,3.3,6.0,2.5,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
7.1,3.0,5.9,2.1,Iris-virginica
6.3,2.9,5.6,1.8,Iris-virginica
6.5,3.0,5.8,2.2,Iris-virginica
7.6,3.0,6.6,2.1,Iris-virginica
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
6.7,2.5,5.8,1.8,Iris-virginica
7.2,3.6,6.1,2.5,Iris-virginica
6.5,3.2,5.1,2.0,Iris-virginica
6.4,2.7,5.3,1.9,Iris-virginica
6.8,3.0,5.5,2.1,Iris-virginica
5.7,2.5,5.0,2.0,Iris-virginica
5.8,2.8,5.1,2.4,Iris-virginica
6.4,3.2,5.3,2.3,Iris-virginica
6.5,3.0,5.5,1.8,Iris-virginica
7.7,3.8,6.7,2.2,Iris-virginica
7.7,2.6,6.9,2.3,Iris-virginica
6.0,2.2,5.0,1.5,Iris-virginica
6.9,3.2,5.7,2.3,Iris-virginica
5.6,2.8,4.9,2.0,Iris-virginica
7.7,2.8,6.7,2.0,Iris-virginica
6.3,2.7,4.9,1.8,Iris-virginica
6.7,3.3,5.7,2.1,Iris-virginica
7.2,3.2,6.0,1.8,Iris-virginica
6.2,2.8,4.8,1.8,Iris-virginica
6.1,3.0,4.9,1.8,Iris-virginica
6.4,2.8,5.6,2.1,Iris-virginica
7.2,3.0,5.8,1.6,Iris-virginica
7.4,2.8,6.1,1.9,Iris-virginica
7.9,3.8,6.4,2.0,Iris-virginica
6.4,2.8,5.6,2.2,Iris-virginica
6.3,2.8,5.1,1.5,Iris-virginica
6.1,2.6,5.6,1.4,Iris-virginica
7.7,3.0,6.1,2.3,Iris-virginica
6.3,3.4,5.6,2.4,Iris-virginica
6.4,3.1,5.5,1.8,Iris-virginica
6.0,3.0,4.8,1.8,Iris-virginica
6.9,3.1,5.4,2.1,Iris-virginica
6.7,3.1,5.6,2.4,Iris-virginica
6.9,3.1,5.1,2.3,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
6.8,3.2,5.9,2.3,Iris-virginica
6.7,3.3,5.7,2.5,Iris-virginica
6.7,3.0,5.2,2.3,Iris-virginica
6.3,2.5,5.0,1.9,Iris-virginica
6.5,3.0,5.2,2.0,Iris-virginica
6.2,3.4,5.4,2.3,Iris-virginica
5.9,3.0,5.1,1.8,Iris-virginica
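The script in the next subsection maps these string labels to a {+1, -1} one-hot encoding for the hinge loss. The encoding step alone can be sketched as follows (a few inline class indices stand in for the parsed label column):

```python
import numpy as np

# Map class indices to a {+1, -1} one-hot encoding, as the full
# script does for the three Iris classes.
labels = np.array([0, 1, 2, 1])   # stand-in for the parsed label column
y_onehot = np.eye(3)[labels]      # ordinary 0/1 one-hot rows
y_onehot[y_onehot == 0] = -1      # replace 0 with -1 for the hinge loss
print(y_onehot[0])  # [ 1. -1. -1.]
```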
3.2. Python Implementation
import numpy as np
import tensorflow as tf
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

batchsz = 150

# Hand-coded derivative of the loss with respect to x
def differential_x(x, y, w, b):
    """
    :param x: feature matrix, k * n (2 * 150)
    :param y: label matrix, c * n (3 * 150)
    :param w: weight matrix, k * c (2 * 3)
    :param b: bias vector, c * 1 (3 * 1)
    :return: dL/dx, k * n
    """
    n = np.size(x, 1)  # n: number of samples (150)
    k = np.size(x, 0)  # k: number of features per sample (2)
    c = len(b)         # c: number of classes
    e = np.ones([n, 1], dtype=float)     # all-ones column vector
    ksi = np.zeros([c, n], dtype=float)  # same shape as the labels, 3 * 150; holds 1 - yi * (w.T @ xi + b)
    lam = 1.  # penalty coefficient
    dj2 = np.zeros([k, n], dtype=float)  # gradient block, k * n
    for i in range(n):  # handle the samples one at a time
        xi = np.reshape(x[:, i], [k, 1])  # i-th sample
        yi = np.reshape(y[:, i], [c, 1])  # i-th label
        # ksii has shape [3, 2] @ [2, 1] => [3, 1]: one entry per class
        ksii = (w.T @ xi + b) * yi  # w has shape k * c, here 2 * 3
        # Two cases to distinguish.
        loss_index = np.reshape(ksii < 1, [c, ])  # True where ksii < 1, e.g. [False True True]
        # ksi stores 1 - yi * (w.T @ xi + b)
        ksi[:, i] = 1 - ksii[:, 0]
        # Case 1: yi * (w.T @ xi + b) < 1 for every class; skip this sample
        if 0 not in loss_index:
            continue
        # Case 2: yi * (w.T @ xi + b) >= 1 for at least one class
        else:
            yc = yi[loss_index, :]  # classes with ksii < 1 (the True entries)
            bc = b[loss_index]      # the corresponding biases
            wc = w[:, loss_index]   # the corresponding hyperplane normals
            dj2[:, i] = 2 * lam * ((wc @ wc.T @ xi) + (wc @ bc) - wc @ yc)[:, 0]
    ksi[ksi < 0] = 0
    j1 = np.trace(x.T @ x) - (1 / n) * (e.T @ x.T @ x @ e) + (n - 1) / lam
    dj1 = 2 * x - (2 / n) * x @ e @ e.T
    j2 = np.trace(w.T @ w) + np.sum(np.square(ksi))
    dl_dx = dj1 * j2 + j1 * dj2
    # returned shape: [2, 150]
    return dl_dx
# Hand-coded derivative of the loss with respect to the trade-off coefficient c
def differential_c(x, y, w, b):
    """
    :param x: feature matrix, k * n (2 * 150)
    :param y: label matrix, c * n (3 * 150)
    :param w: weight matrix, k * c (2 * 3)
    :param b: bias vector, c * 1 (3 * 1)
    :return: dL/dc (a scalar)
    """
    n = np.size(x, 1)  # n: number of samples (150)
    k = np.size(x, 0)  # k: number of features per sample (2)
    c = len(b)         # c: number of classes
    e = np.ones([n, 1], dtype=float)  # all-ones column vector
    lam = 1.  # penalty coefficient
    j1 = np.trace(x.T @ x) - (1 / n) * (e.T @ x.T @ x @ e) + (n - 1) / lam
    dj1 = -(n - 1) / (lam ** 2)
    j2 = np.trace(w.T @ w) + lam * np.sum(np.square(np.maximum(np.zeros([c, n]), 1 - y * (w.T @ x + b))))
    dj2 = np.sum(np.square(np.maximum(np.zeros([c, n]), 1 - y * (w.T @ x + b))))
    dl_dc = (dj1 * j2 + j1 * dj2).take(0)
    return dl_dc
def iris_type(s):
    it = {b'Iris-setosa': 0, b'Iris-versicolor': 1, b'Iris-virginica': 2}
    return it[s]

def convert_to_one_hot(y, C):
    return np.eye(C)[y.reshape(-1)]
def main():
    data = np.loadtxt('./Iris.txt', dtype=float, delimiter=',', converters={4: iris_type})  # load the dataset
    x = data[:, :2]
    x = tf.cast(x, tf.int32).numpy()  # truncate the two features to integers
    y = data[:, 4]
    y = y.astype(int)
    y_onehot = convert_to_one_hot(y, 3)
    y_onehot[y_onehot == 0] = -1  # encode the labels as {+1, -1}
    x1 = np.transpose(x)  # k * n; k: feature dimension, n: number of samples
    y1_onehot = np.transpose(y_onehot)  # c * n; c: number of classes, n: number of samples
    w = np.array([[1, 1, 1],  # k * c; k: feature dimension, c: number of classes
                  [1, 1, 1]])
    b = np.array([[1],  # c * 1; c: number of classes
                  [1],
                  [1]])
    # hand-derived gradient with respect to x
    dx = differential_x(x1, y1_onehot, w, b)
    print("Manual gradient w.r.t. x:\n", dx.T)
    # hand-derived gradient with respect to c
    dc = differential_c(x1, y1_onehot, w, b)
    x = tf.cast(tf.convert_to_tensor(x), tf.float32)
    # TensorFlow automatic differentiation with respect to x and c
    c = tf.Variable(tf.constant(1.))
    with tf.GradientTape() as tape:
        tape.watch([x, c])
        # bias b, tiled from 3 * 1 up to 150 * 3
        b1 = tf.cast(tf.tile(b, [50, 3]), tf.float32)
        ##################################### Version 3: formula derivation 3 #####################################
        center = tf.reduce_mean(x, 0)
        dist = tf.reduce_sum(tf.square(x - center), 1)
        d_2 = tf.reduce_sum(dist) + (1 / c) * (150 - 1)
        out = x @ w + b1
        ############################################### SVM part ##################################################
        regularization_loss = tf.cast(tf.reduce_sum(tf.square(w)), tf.float32)
        # hinge terms, one column per class (per hyperplane normal)
        w_square = tf.square(tf.maximum(tf.zeros([150, 3]), 1 - y_onehot * out))
        # sum over all samples and all classes
        hinge_loss = tf.reduce_sum(w_square)
        with tf.name_scope('loss'):
            loss = d_2 * (0.5 * regularization_loss + c * hinge_loss)
    dloss_dx, dloss_dc = tape.gradient(loss, [x, c])
    print("TensorFlow autodiff gradient w.r.t. x:\n", dloss_dx.numpy())
    print("Manual gradient w.r.t. c:\n", dc)
    print("TensorFlow autodiff gradient w.r.t. c:\n", dloss_dc.numpy())

if __name__ == '__main__':
    main()
3.3. Comparison of Results
C:\Anaconda3\envs\tf2\python.exe E:/Codes/MyCodes/TF1/svm_test/testautodiff.py
Manual gradient w.r.t. x:
[[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-74998.82666667 -29906.18666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-10500.66666667 95527.97333333]
[-10500.66666667 95527.97333333]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-10500.66666667 95527.97333333]
[-10500.66666667 95527.97333333]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-74998.82666667 -29906.18666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[112558.72 35779.36 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-74998.82666667 -29906.18666667]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-11688.05333333 33404.58666667]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-11688.05333333 33404.58666667]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-11688.05333333 33404.58666667]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-11688.05333333 33404.58666667]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[-11688.05333333 33404.58666667]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-11688.05333333 33404.58666667]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[112558.72 35779.36 ]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[112558.72 35779.36 ]
[-74998.82666667 -29906.18666667]
[111371.33333333 -26344.02666667]
[ 49247.94666667 -27531.41333333]
[112558.72 35779.36 ]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[112558.72 35779.36 ]
[111371.33333333 -26344.02666667]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[111371.33333333 -26344.02666667]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[112558.72 35779.36 ]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[112558.72 35779.36 ]
[111371.33333333 -26344.02666667]
[112558.72 35779.36 ]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[112558.72 35779.36 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[-11688.05333333 33404.58666667]]
TensorFlow autodiff gradient w.r.t. x:
[[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-74990.51 -29902.307]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-10498.345 95519.85 ]
[-10498.345 95519.85 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-10498.345 95519.85 ]
[-10498.345 95519.85 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-74990.51 -29902.307]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[112549.04 35777.242]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-74990.51 -29902.307]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-11685.731 33402.47 ]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-11685.731 33402.47 ]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-11685.731 33402.47 ]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-11685.731 33402.47 ]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[-11685.731 33402.47 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-11685.731 33402.47 ]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[112549.04 35777.242]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[112549.04 35777.242]
[-74990.51 -29902.307]
[111361.65 -26340.146]
[ 49244.27 -27527.533]
[112549.04 35777.242]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[112549.04 35777.242]
[111361.65 -26340.146]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[111361.65 -26340.146]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[112549.04 35777.242]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[112549.04 35777.242]
[111361.65 -26340.146]
[112549.04 35777.242]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[112549.04 35777.242]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[-11685.731 33402.47 ]]
Manual gradient w.r.t. c:
4502811.159999991
TensorFlow autodiff gradient w.r.t. c:
4503259.0
Process finished with exit code 0
Summary: the experiment shows that TensorFlow's automatically differentiated gradients agree closely with the hand-derived ones; the small residual discrepancies are most likely because the TensorFlow computation runs in float32 while the NumPy code runs in float64.
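To quantify "closely": taking the two scalar gradients with respect to c printed above, the relative difference is on the order of 1e-4, which is consistent with single-precision rounding.

```python
# The two dL/dc values reported in the run above.
manual = 4502811.159999991   # hand-derived, computed in float64
autodiff = 4503259.0         # TensorFlow, computed in float32
rel_err = abs(autodiff - manual) / abs(autodiff)
print(rel_err)  # roughly 1e-4
```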