tensorflow.control_dependencies( tasklist ) registers the ops in tasklist as prerequisite operations. Typical usage looks like this:

with tensorflow.control_dependencies( tasklist ):
    trainOp = tensorflow.no_op()

When this trainOp is executed (i.e., tensorflow.Session().run( trainOp )), the ops in tasklist are run first.
However, you cannot simply bind some other op to the name inside the with block; if you do, the ops in tasklist are no longer executed. For example, suppose task1 is an op that is not in tasklist and is unrelated to it. Binding task1 directly to trainOp inside the with block will not trigger the ops in tasklist:

with tensorflow.control_dependencies( tasklist ):
    trainOp = task1
One workaround is to put the newly added op together with the no_op in a list, so that the ops in tasklist still get executed:

with tensorflow.control_dependencies( tasklist ):
    trainOp = tensorflow.no_op()
    trainOp = [trainOp, task1]
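The behavior above becomes clearer once you note that control_dependencies only attaches to ops that are *constructed* while the with block is active; a plain Python assignment such as trainOp = task1 merely rebinds a name to an op that already exists and creates nothing new. The following pure-Python sketch (a hypothetical DependencyScope class, no TensorFlow involved) illustrates that distinction:

```python
class DependencyScope:
    """Toy stand-in for tf.control_dependencies: it tags every
    object constructed while the scope is active."""
    current = None

    def __enter__(self):
        DependencyScope.current = self
        self.tagged = []
        return self

    def __exit__(self, *exc):
        DependencyScope.current = None
        return False


class Op:
    """Toy stand-in for a TensorFlow op."""
    def __init__(self, name):
        self.name = name
        scope = DependencyScope.current
        if scope is not None:
            scope.tagged.append(self)  # tagging happens at construction time


task1 = Op("task1")          # created OUTSIDE any scope

with DependencyScope() as scope:
    train_op = task1         # mere name rebinding: nothing is constructed
    no_op = Op("no_op")      # constructed inside the scope: gets tagged

print([op.name for op in scope.tagged])  # → ['no_op']
```

Only no_op ends up tagged; rebinding train_op to the pre-existing task1 has no effect on the scope, exactly like assigning an existing op inside control_dependencies.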
Below is a piece of verification code:
import tensorflow as tf

a = tf.Variable(2)
selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
selfSubtraction = tf.assign_sub(selfSub, 2)
b = tf.multiply(a, selfAdd)

with tf.control_dependencies([selfAddition]):
    train_op = tf.no_op()                   # statement 1 (explained below)
    print(train_op)
    train_op = [train_op, selfSubtraction]  # statement 2 (explained below)
    print(train_op)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    for i in range(20):
        sess.run(train_op)                  # statement 3 (explained below)
        print("selfAdd:", sess.run(selfAdd))
    ra = sess.run(selfAdd)
    rb = sess.run(b)
    print('@end selfAdd:', ra)
    rs = sess.run(selfSub)
    print('@end selfSub:', rs)
    print('b:', rb)
selfAddition is an op that adds to the variable selfAdd, and selfSubtraction is an op that subtracts from the variable selfSub. Statement 1 creates a train_op that does nothing by itself, but carries the prerequisite op named in the control_dependencies call, i.e. selfAddition. If you want to run other ops in addition to that prerequisite, you can follow statement 2 and append them to a list. Then each execution of statement 3 runs both selfAddition and selfSubtraction.
Having to add the extra op after the no_op seems to contradict the usual understanding, and I am not sure whether something is wrong here. Below are my test code and its output:
import tensorflow as tf

a = tf.Variable(2)
selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
selfSubtraction = tf.assign_sub(selfSub, 2)
b = tf.multiply(a, selfAdd)

with tf.control_dependencies([selfAddition]):
    train_op = selfSubtraction
    print(train_op)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    for i in range(20):
        sess.run(train_op)
        print("selfAdd:", sess.run(selfAdd))
    ra = sess.run(selfAdd)
    rb = sess.run(b)
    print('@end selfAdd:', ra)
    rs = sess.run(selfSub)
    print('@end selfSub:', rs)
    print('b:', rb)
Tensor("AssignSub:0", shape=(), dtype=int32_ref)
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
selfAdd: 0
@end selfAdd: 0
@end selfSub: -40
b: 0
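The output above actually matches TensorFlow's documented semantics: control_dependencies applies only to ops *constructed within* the context. The line train_op = selfSubtraction binds a name to an op that was already created outside the block, so no dependency is attached and selfAddition never runs. The usual fix is to create a new op inside the block, for example with tf.group. A minimal sketch (written against tf.compat.v1 so it also runs under TensorFlow 2.x; under 1.x you can drop the first two lines and use tf directly):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

selfAdd = tf.Variable(0)
selfAddition = tf.assign_add(selfAdd, 3)
selfSub = tf.Variable(0)
selfSubtraction = tf.assign_sub(selfSub, 2)

with tf.control_dependencies([selfAddition]):
    # tf.group constructs a NEW op inside the block, so the control
    # dependency on selfAddition is actually attached to it.
    train_op = tf.group(selfSubtraction)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(20):
        sess.run(train_op)
    final_add = sess.run(selfAdd)  # 20 runs * +3 = 60
    final_sub = sess.run(selfSub)  # 20 runs * -2 = -40

print('selfAdd:', final_add)
print('selfSub:', final_sub)
```

With this version both variables change: selfAdd reaches 60 and selfSub reaches -40, unlike the test above where selfAdd stayed at 0. Using tf.identity(selfSubtraction) inside the block would work the same way for a single op.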
Reference:
https://www.cnblogs.com/qjoanven/p/7736025.html