Comparing TensorFlow 2.0 Automatic Differentiation Against Manual Derivatives
1. The Model
- This article compares the gradients produced by TensorFlow's automatic differentiation with the same gradients derived and implemented by hand.
2. Deriving the Gradients
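Reading the loss off the implementation in Section 3.2, it is the product of a data-scatter term and a squared-hinge SVM term. The symbols $J_1$ and $J_2$ below are introduced here for exposition; this is a reconstruction from the code, not a verbatim copy of the original derivation:

$$L(X, c) = J_1 \cdot J_2, \qquad J_1 = \sum_{i=1}^{n} \lVert x_i - \bar{x} \rVert^2 + \frac{n - 1}{c}, \qquad J_2 = \frac{1}{2} \lVert W \rVert_F^2 + c \sum_{i=1}^{n} \sum_{j=1}^{3} \max\!\left(0,\; 1 - y_{ji} (w_j^\top x_i + b_j)\right)^2$$

Here $X \in \mathbb{R}^{2 \times n}$ holds the $n = 150$ samples, $\bar{x}$ is the sample mean, $W \in \mathbb{R}^{2 \times 3}$ and $b \in \mathbb{R}^{3}$ define the three one-vs-rest hyperplanes, $y_{ji} \in \{-1, +1\}$, and $c$ is the trade-off coefficient (set to 1 in the experiment). The hand-computed gradients then follow from the product rule:

$$\frac{\partial L}{\partial X} = \frac{\partial J_1}{\partial X} J_2 + J_1 \frac{\partial J_2}{\partial X}, \qquad \frac{\partial L}{\partial c} = -\frac{n - 1}{c^2}\, J_2 + J_1 \sum_{i,j} \max\!\left(0,\; 1 - y_{ji} (w_j^\top x_i + b_j)\right)^2$$

with $\partial J_1 / \partial X = 2X - \frac{2}{n} X e e^\top$ ($e$ the all-ones vector) and, for each sample $x_i$, $\partial J_2 / \partial x_i = 2c \sum_j \left(w_j w_j^\top x_i + b_j w_j - y_{ji} w_j\right)$, the sum running over the classes $j$ whose margin is violated, i.e. $y_{ji}(w_j^\top x_i + b_j) < 1$. One detail worth noting: the NumPy implementation uses $\operatorname{tr}(W^\top W)$ for the regularizer where the TensorFlow loss uses $\frac{1}{2} \sum W^2$, a constant offset that shows up in the comparison of Section 3.3.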
3. Experimental Analysis
3.1. The Iris Dataset (Iris.txt)
Each of the 150 rows below lists four features (sepal length, sepal width, petal length, petal width, in cm) followed by the class name; the code in Section 3.2 uses only the first two features.
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5.0,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
4.6,3.4,1.4,0.3,Iris-setosa
5.0,3.4,1.5,0.2,Iris-setosa
4.4,2.9,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.4,3.7,1.5,0.2,Iris-setosa
4.8,3.4,1.6,0.2,Iris-setosa
4.8,3.0,1.4,0.1,Iris-setosa
4.3,3.0,1.1,0.1,Iris-setosa
5.8,4.0,1.2,0.2,Iris-setosa
5.7,4.4,1.5,0.4,Iris-setosa
5.4,3.9,1.3,0.4,Iris-setosa
5.1,3.5,1.4,0.3,Iris-setosa
5.7,3.8,1.7,0.3,Iris-setosa
5.1,3.8,1.5,0.3,Iris-setosa
5.4,3.4,1.7,0.2,Iris-setosa
5.1,3.7,1.5,0.4,Iris-setosa
4.6,3.6,1.0,0.2,Iris-setosa
5.1,3.3,1.7,0.5,Iris-setosa
4.8,3.4,1.9,0.2,Iris-setosa
5.0,3.0,1.6,0.2,Iris-setosa
5.0,3.4,1.6,0.4,Iris-setosa
5.2,3.5,1.5,0.2,Iris-setosa
5.2,3.4,1.4,0.2,Iris-setosa
4.7,3.2,1.6,0.2,Iris-setosa
4.8,3.1,1.6,0.2,Iris-setosa
5.4,3.4,1.5,0.4,Iris-setosa
5.2,4.1,1.5,0.1,Iris-setosa
5.5,4.2,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.0,3.2,1.2,0.2,Iris-setosa
5.5,3.5,1.3,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
4.4,3.0,1.3,0.2,Iris-setosa
5.1,3.4,1.5,0.2,Iris-setosa
5.0,3.5,1.3,0.3,Iris-setosa
4.5,2.3,1.3,0.3,Iris-setosa
4.4,3.2,1.3,0.2,Iris-setosa
5.0,3.5,1.6,0.6,Iris-setosa
5.1,3.8,1.9,0.4,Iris-setosa
4.8,3.0,1.4,0.3,Iris-setosa
5.1,3.8,1.6,0.2,Iris-setosa
4.6,3.2,1.4,0.2,Iris-setosa
5.3,3.7,1.5,0.2,Iris-setosa
5.0,3.3,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
6.9,3.1,4.9,1.5,Iris-versicolor
5.5,2.3,4.0,1.3,Iris-versicolor
6.5,2.8,4.6,1.5,Iris-versicolor
5.7,2.8,4.5,1.3,Iris-versicolor
6.3,3.3,4.7,1.6,Iris-versicolor
4.9,2.4,3.3,1.0,Iris-versicolor
6.6,2.9,4.6,1.3,Iris-versicolor
5.2,2.7,3.9,1.4,Iris-versicolor
5.0,2.0,3.5,1.0,Iris-versicolor
5.9,3.0,4.2,1.5,Iris-versicolor
6.0,2.2,4.0,1.0,Iris-versicolor
6.1,2.9,4.7,1.4,Iris-versicolor
5.6,2.9,3.6,1.3,Iris-versicolor
6.7,3.1,4.4,1.4,Iris-versicolor
5.6,3.0,4.5,1.5,Iris-versicolor
5.8,2.7,4.1,1.0,Iris-versicolor
6.2,2.2,4.5,1.5,Iris-versicolor
5.6,2.5,3.9,1.1,Iris-versicolor
5.9,3.2,4.8,1.8,Iris-versicolor
6.1,2.8,4.0,1.3,Iris-versicolor
6.3,2.5,4.9,1.5,Iris-versicolor
6.1,2.8,4.7,1.2,Iris-versicolor
6.4,2.9,4.3,1.3,Iris-versicolor
6.6,3.0,4.4,1.4,Iris-versicolor
6.8,2.8,4.8,1.4,Iris-versicolor
6.7,3.0,5.0,1.7,Iris-versicolor
6.0,2.9,4.5,1.5,Iris-versicolor
5.7,2.6,3.5,1.0,Iris-versicolor
5.5,2.4,3.8,1.1,Iris-versicolor
5.5,2.4,3.7,1.0,Iris-versicolor
5.8,2.7,3.9,1.2,Iris-versicolor
6.0,2.7,5.1,1.6,Iris-versicolor
5.4,3.0,4.5,1.5,Iris-versicolor
6.0,3.4,4.5,1.6,Iris-versicolor
6.7,3.1,4.7,1.5,Iris-versicolor
6.3,2.3,4.4,1.3,Iris-versicolor
5.6,3.0,4.1,1.3,Iris-versicolor
5.5,2.5,4.0,1.3,Iris-versicolor
5.5,2.6,4.4,1.2,Iris-versicolor
6.1,3.0,4.6,1.4,Iris-versicolor
5.8,2.6,4.0,1.2,Iris-versicolor
5.0,2.3,3.3,1.0,Iris-versicolor
5.6,2.7,4.2,1.3,Iris-versicolor
5.7,3.0,4.2,1.2,Iris-versicolor
5.7,2.9,4.2,1.3,Iris-versicolor
6.2,2.9,4.3,1.3,Iris-versicolor
5.1,2.5,3.0,1.1,Iris-versicolor
5.7,2.8,4.1,1.3,Iris-versicolor
6.3,3.3,6.0,2.5,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
7.1,3.0,5.9,2.1,Iris-virginica
6.3,2.9,5.6,1.8,Iris-virginica
6.5,3.0,5.8,2.2,Iris-virginica
7.6,3.0,6.6,2.1,Iris-virginica
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
6.7,2.5,5.8,1.8,Iris-virginica
7.2,3.6,6.1,2.5,Iris-virginica
6.5,3.2,5.1,2.0,Iris-virginica
6.4,2.7,5.3,1.9,Iris-virginica
6.8,3.0,5.5,2.1,Iris-virginica
5.7,2.5,5.0,2.0,Iris-virginica
5.8,2.8,5.1,2.4,Iris-virginica
6.4,3.2,5.3,2.3,Iris-virginica
6.5,3.0,5.5,1.8,Iris-virginica
7.7,3.8,6.7,2.2,Iris-virginica
7.7,2.6,6.9,2.3,Iris-virginica
6.0,2.2,5.0,1.5,Iris-virginica
6.9,3.2,5.7,2.3,Iris-virginica
5.6,2.8,4.9,2.0,Iris-virginica
7.7,2.8,6.7,2.0,Iris-virginica
6.3,2.7,4.9,1.8,Iris-virginica
6.7,3.3,5.7,2.1,Iris-virginica
7.2,3.2,6.0,1.8,Iris-virginica
6.2,2.8,4.8,1.8,Iris-virginica
6.1,3.0,4.9,1.8,Iris-virginica
6.4,2.8,5.6,2.1,Iris-virginica
7.2,3.0,5.8,1.6,Iris-virginica
7.4,2.8,6.1,1.9,Iris-virginica
7.9,3.8,6.4,2.0,Iris-virginica
6.4,2.8,5.6,2.2,Iris-virginica
6.3,2.8,5.1,1.5,Iris-virginica
6.1,2.6,5.6,1.4,Iris-virginica
7.7,3.0,6.1,2.3,Iris-virginica
6.3,3.4,5.6,2.4,Iris-virginica
6.4,3.1,5.5,1.8,Iris-virginica
6.0,3.0,4.8,1.8,Iris-virginica
6.9,3.1,5.4,2.1,Iris-virginica
6.7,3.1,5.6,2.4,Iris-virginica
6.9,3.1,5.1,2.3,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
6.8,3.2,5.9,2.3,Iris-virginica
6.7,3.3,5.7,2.5,Iris-virginica
6.7,3.0,5.2,2.3,Iris-virginica
6.3,2.5,5.0,1.9,Iris-virginica
6.5,3.0,5.2,2.0,Iris-virginica
6.2,3.4,5.4,2.3,Iris-virginica
5.9,3.0,5.1,1.8,Iris-virginica
3.2. Python Implementation
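The script below hinges on tf.GradientTape. In TensorFlow 2, the tape records operations on tf.Variable objects automatically, while plain tensors (such as the feature matrix x below) must be registered explicitly with tape.watch, otherwise tape.gradient returns None for them. A minimal standalone illustration of this behavior (a toy snippet, not part of the experiment):

import tensorflow as tf

x = tf.constant(3.0)                    # a plain tensor, not a tf.Variable
with tf.GradientTape() as tape:
    tape.watch(x)                       # required for non-Variable tensors
    y = x * x
print(tape.gradient(y, x).numpy())      # 6.0 = dy/dx at x = 3

The full script: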
import numpy as np
import tensorflow as tf
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'


# Hand-written derivative of the loss with respect to x
def differential_x(x, y, w, b):
    """
    :param x: feature matrix, k * n (here 2 * 150)
    :param y: one-hot label matrix, c * n (here 3 * 150)
    :param w: weight matrix, k * c (here 2 * 3)
    :param b: bias vector, c * 1 (here 3 * 1)
    :return: dL/dx, k * n
    """
    n = np.size(x, 1)  # n: number of samples, 150
    k = np.size(x, 0)  # k: number of features per sample, 2
    c = len(b)  # c: number of classes
    e = np.ones([n, 1], dtype=np.float64)  # all-ones column vector
    # same shape as the labels, 3 * 150; holds 1 - yi * (w.T @ xi + b)
    ksi = np.zeros([c, n], dtype=np.float64)
    lam = 1.  # trade-off coefficient
    dj2 = np.zeros([k, n], dtype=np.float64)  # the derivative has shape k * n
    for i in range(n):  # handle the samples one at a time
        xi = np.reshape(x[:, i], [k, 1])  # the i-th sample
        yi = np.reshape(y[:, i], [c, 1])  # the i-th label
        # ksii: [3, 2] @ [2, 1] => [3, 1], one margin per class
        ksii = (w.T @ xi + b) * yi
        # True marks the classes with yi * (w.T @ xi + b) < 1, e.g. [False True True]
        loss_index = np.reshape(ksii < 1, [c, ])
        # ksi holds 1 - yi * (w.T @ xi + b)
        ksi[:, i] = 1 - ksii[:, 0]
        # Case 1: yi * (w.T @ xi + b) >= 1 for every class, so this sample's
        # hinge loss is 0 and it contributes nothing to the gradient.
        if not np.any(loss_index):
            continue  # move on to the next sample
        # Case 2: at least one class violates the margin, so the hinge loss
        # is non-zero and only the violating classes contribute.
        else:
            yc = yi[loss_index, :]  # labels of the violating classes
            bc = b[loss_index]  # the corresponding biases
            wc = w[:, loss_index]  # the corresponding hyperplane normals
            dj2[:, i] = 2 * lam * ((wc @ wc.T @ xi) + (wc @ bc) - wc @ yc)[:, 0]
    ksi[ksi < 0] = 0
    j1 = np.trace(x.T @ x) - (1 / n) * (e.T @ x.T @ x @ e) + (n - 1) / lam
    dj1 = 2 * x - (2 / n) * x @ e @ e.T
    j2 = np.trace(w.T @ w) + np.sum(np.square(ksi))
    # product rule: L = j1 * j2
    dl_dx = dj1 * j2 + j1 * dj2
    # returned shape: [2, 150]
    return dl_dx


# Hand-written derivative of the loss with respect to the trade-off coefficient c
def differential_c(x, y, w, b):
    """
    :param x: feature matrix, k * n (here 2 * 150)
    :param y: one-hot label matrix, c * n (here 3 * 150)
    :param w: weight matrix, k * c (here 2 * 3)
    :param b: bias vector, c * 1 (here 3 * 1)
    :return: dL/dc, a scalar
    """
    n = np.size(x, 1)  # n: number of samples, 150
    c = len(b)  # c: number of classes
    e = np.ones([n, 1], dtype=np.float64)  # all-ones column vector
    lam = 1.  # trade-off coefficient
    j1 = np.trace(x.T @ x) - (1 / n) * (e.T @ x.T @ x @ e) + (n - 1) / lam
    dj1 = -(n - 1) / (lam ** 2)  # d(j1)/d(lam)
    j2 = np.trace(w.T @ w) + lam * np.sum(np.square(np.maximum(np.zeros([c, n]), 1 - y * (w.T @ x + b))))
    dj2 = np.sum(np.square(np.maximum(np.zeros([c, n]), 1 - y * (w.T @ x + b))))  # d(j2)/d(lam)
    # product rule; j1 is a 1 * 1 array, so take the scalar
    dl_dc = (dj1 * j2 + j1 * dj2).take(0)
    return dl_dc


def iris_type(s):
    # map the class name (read as bytes) to an integer label
    it = {b'Iris-setosa': 0, b'Iris-versicolor': 1, b'Iris-virginica': 2}
    return it[s]


def convert_to_one_hot(y, C):
    return np.eye(C)[y.reshape(-1)]


def main():
    # load the dataset; column 4 is the class name
    data = np.loadtxt('./Iris.txt', dtype=float, delimiter=',', converters={4: iris_type})
    x = data[:, :2]  # keep only the first two features
    x = tf.cast(x, tf.int32).numpy()  # truncate the features to integers
    y = data[:, 4]
    y = y.astype(np.int32)
    y_onehot = convert_to_one_hot(y, 3)
    y_onehot[y_onehot == 0] = -1  # +1 / -1 one-hot encoding
    x1 = np.transpose(x)  # k * n; k: feature dim, n: number of samples
    y1_onehot = np.transpose(y_onehot)  # c * n; c: number of classes
    w = np.array([[1, 1, 1],  # k * c; k: feature dim, c: number of classes
                  [1, 1, 1]])
    b = np.array([[1],  # c * 1; c: number of classes
                  [1],
                  [1]])
    # hand-derived gradient with respect to x
    dx = differential_x(x1, y1_onehot, w, b)
    print("Manual derivative w.r.t. x:\n", dx.T)
    # hand-derived gradient with respect to c
    dc = differential_c(x1, y1_onehot, w, b)
    x = tf.cast(tf.convert_to_tensor(x), tf.float32)
    # TensorFlow automatic differentiation
    c = tf.Variable(tf.constant(1.))
    with tf.GradientTape() as tape:
        tape.watch([x, c])  # x is a plain tensor, so it must be watched explicitly
        # bias b, tiled from [3, 1] up to [150, 3]
        b1 = tf.cast(tf.tile(b, [50, 3]), tf.float32)
        # scatter term: total squared distance to the mean, plus (n - 1) / c
        center = tf.reduce_mean(x, 0)
        dist = tf.reduce_sum(tf.square(x - center), 1)
        d_2 = tf.reduce_sum(dist) + (1 / c) * (150 - 1)
        out = x @ w + b1
        # SVM part
        regularization_loss = tf.cast(tf.reduce_sum(tf.square(w)), tf.float32)
        # squared hinge loss per sample and per class
        w_square = tf.square(tf.maximum(tf.zeros([150, 3]), 1 - y_onehot * out))
        # summed over all samples and all classes
        hinge_loss = tf.reduce_sum(w_square)
        with tf.name_scope('loss'):
            loss = d_2 * (0.5 * regularization_loss + c * hinge_loss)
    dloss_dx, dloss_dc = tape.gradient(loss, [x, c])
    print("TensorFlow autodiff result w.r.t. x:\n", dloss_dx.numpy())
    print('Manual derivative w.r.t. c:\n', dc)
    print("TensorFlow autodiff result w.r.t. c:\n", dloss_dc.numpy())


if __name__ == '__main__':
    main()
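Beyond eyeballing the printed matrices in Section 3.3, the agreement can be quantified programmatically. A sketch of what could be appended at the end of main() (dx, dc, dloss_dx and dloss_dc are the variables computed above; the rtol=1e-3 tolerance is an assumption chosen to match float32 precision):

    # hypothetical check at the end of main(): quantify the agreement
    print("x gradients close:", np.allclose(dx.T, dloss_dx.numpy(), rtol=1e-3))
    print("relative error in dL/dc:", abs(dc - float(dloss_dc.numpy())) / abs(dc))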
3.3. Comparison of Results
C:\Anaconda3\envs\tf2\python.exe E:/Codes/MyCodes/TF1/svm_test/testautodiff.py
Manual derivative w.r.t. x:
[[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-74998.82666667 -29906.18666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-10500.66666667 95527.97333333]
[-10500.66666667 95527.97333333]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-10500.66666667 95527.97333333]
[-10500.66666667 95527.97333333]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-74998.82666667 -29906.18666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-73811.44 32217.2 ]
[-11688.05333333 33404.58666667]
[-11688.05333333 33404.58666667]
[112558.72 35779.36 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-74998.82666667 -29906.18666667]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-11688.05333333 33404.58666667]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-11688.05333333 33404.58666667]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-11688.05333333 33404.58666667]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-11688.05333333 33404.58666667]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[-11688.05333333 33404.58666667]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[-11688.05333333 33404.58666667]
[-12875.44 -28718.8 ]
[ 49247.94666667 -27531.41333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[112558.72 35779.36 ]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[112558.72 35779.36 ]
[-74998.82666667 -29906.18666667]
[111371.33333333 -26344.02666667]
[ 49247.94666667 -27531.41333333]
[112558.72 35779.36 ]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[112558.72 35779.36 ]
[111371.33333333 -26344.02666667]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[111371.33333333 -26344.02666667]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[112558.72 35779.36 ]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[112558.72 35779.36 ]
[111371.33333333 -26344.02666667]
[112558.72 35779.36 ]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[ 49247.94666667 -27531.41333333]
[112558.72 35779.36 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[-12875.44 -28718.8 ]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[ 49247.94666667 -27531.41333333]
[ 50435.33333333 34591.97333333]
[ 50435.33333333 34591.97333333]
[-11688.05333333 33404.58666667]]
TensorFlow autodiff result w.r.t. x:
[[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-74990.51 -29902.307]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-10498.345 95519.85 ]
[-10498.345 95519.85 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-10498.345 95519.85 ]
[-10498.345 95519.85 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-74990.51 -29902.307]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-73803.12 32215.08 ]
[-11685.731 33402.47 ]
[-11685.731 33402.47 ]
[112549.04 35777.242]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-74990.51 -29902.307]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-11685.731 33402.47 ]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-11685.731 33402.47 ]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-11685.731 33402.47 ]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-11685.731 33402.47 ]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[-11685.731 33402.47 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[-11685.731 33402.47 ]
[-12873.118 -28714.92 ]
[ 49244.27 -27527.533]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[112549.04 35777.242]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[112549.04 35777.242]
[-74990.51 -29902.307]
[111361.65 -26340.146]
[ 49244.27 -27527.533]
[112549.04 35777.242]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[112549.04 35777.242]
[111361.65 -26340.146]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[111361.65 -26340.146]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[112549.04 35777.242]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[112549.04 35777.242]
[111361.65 -26340.146]
[112549.04 35777.242]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[ 49244.27 -27527.533]
[112549.04 35777.242]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[-12873.118 -28714.92 ]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[ 49244.27 -27527.533]
[ 50431.656 34589.855]
[ 50431.656 34589.855]
[-11685.731 33402.47 ]]
Manual derivative w.r.t. c:
4502811.159999991
TensorFlow autodiff result w.r.t. c:
4503259.0
Process finished with exit code 0
Summary: the experiment shows that TensorFlow's automatic differentiation agrees closely with the hand-derived gradients. The results are not bit-identical, for two reasons: TensorFlow computes in float32 while the NumPy code works in float64, and the two implementations weight the regularizer differently (tr(WᵀW) in NumPy versus 0.5 · Σw² in the TensorFlow loss). For dL/dc the second effect alone predicts a gap of 149 × 3 = 447 with the all-ones W used here, which accounts for almost all of the observed 4503259.0 − 4502811.16 ≈ 447.8; the remainder is float32 rounding.