ValueError: Shape must be rank 0 but is rank 1 for 'Adam/update_weight/ApplyAdam' (op: 'ApplyAdam')

Original code:

self.lr = tf.placeholder(shape=[1], dtype=tf.float32, name="learning_rate")
...
...

optimizer = tf.train.AdamOptimizer(learning_rate=self.lr)
self.trainops = optimizer.minimize(self.cost)

Error message:

InvalidArgumentError: Shape must be rank 0 but is rank 1 for 'Adam/update_weight/ApplyAdam' (op: 'ApplyAdam') with input shapes: [39,2], [39,2], [39,2], [], [], [1], [], [], [], [39,2].
During handling of the above exception, another exception occurred:
ValueError                                Traceback (most recent call last)
<ipython-input-149-f2a8d1d94ea1> in __init__(self)
     29 
     30         optimizer = tf.train.AdamOptimizer(learning_rate=self.lr)
---> 31         self.trainops = optimizer.minimize(self.cost)

Cause: the custom placeholder self.lr was declared with shape=[1], which makes it a vector (a rank-1 tensor), but the learning_rate argument of tf.train.AdamOptimizer only accepts a scalar (a rank-0 tensor).
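The rank distinction is easy to see with NumPy (used here purely for illustration; TensorFlow shapes follow the same convention, where `()` is rank 0 and `(1,)` is rank 1):

```python
import numpy as np

# A scalar (rank-0 tensor) has shape (); a one-element vector has shape (1,).
scalar_lr = np.array(0.001)    # shape () -> rank 0, what AdamOptimizer expects
vector_lr = np.array([0.001])  # shape (1,) -> rank 1, what triggered the error

print(scalar_lr.ndim, scalar_lr.shape)  # 0 ()
print(vector_lr.ndim, vector_lr.shape)  # 1 (1,)
```

Even though both hold a single value, they are different ranks, and the `ApplyAdam` op validates that the learning rate is rank 0.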

Fix: drop the shape argument so the placeholder defaults to a scalar:

self.lr = tf.placeholder(dtype=tf.float32, name="learning_rate")
...
...

optimizer = tf.train.AdamOptimizer(learning_rate=self.lr)
self.trainops = optimizer.minimize(self.cost)

When running the graph, feed a plain Python float for this placeholder (e.g. feed_dict={self.lr: 0.001}), not a one-element list.

Reference: https://stackoverflow.com/questions/45733581/tensorflow-optimizer-minimize-valueerror-shape-must-be-rank-0-but-is-rank
