TensorFlow: getting the total number of parameters in a model

Number of parameters:

np.sum([np.prod(v.get_shape().as_list()) for v in tf.trainable_variables()])

Number of floating-point operations (FLOPs):

tf.contrib.tfprof.model_analyzer.print_model_analysis(tf.get_default_graph(), tfprof_options=tf.contrib.tfprof.model_analyzer.FLOAT_OPS_OPTIONS)
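For context, here is a minimal sketch of both snippets in use. It assumes TF 1.x graph mode, and the tiny dense layer is just a made-up example:

import numpy as np
import tensorflow as tf  # TF 1.x assumed; these APIs are graph-mode only

# Toy graph: one dense layer, 10 inputs -> 5 units (10*5 weights + 5 biases = 55 params)
x = tf.placeholder(tf.float32, [None, 10])
y = tf.layers.dense(x, 5)

# Total trainable parameters
print(np.sum([np.prod(v.get_shape().as_list()) for v in tf.trainable_variables()]))  # 55

# FLOPs report (older tf.contrib.tfprof API from the line above; prints a per-op breakdown)
tf.contrib.tfprof.model_analyzer.print_model_analysis(
    tf.get_default_graph(),
    tfprof_options=tf.contrib.tfprof.model_analyzer.FLOAT_OPS_OPTIONS)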



 

TensorFlow: getting the total number of parameters in a model

 

from functools import reduce
from operator import mul

import tensorflow as tf  # TF 1.x

def get_num_params():
    """Return the total number of trainable parameters in the default graph."""
    num_params = 0
    for variable in tf.trainable_variables():
        shape = variable.get_shape()  # static shape: a list of tf.Dimension
        num_params += reduce(mul, [dim.value for dim in shape], 1)
    return num_params

1. Computing the parameter count
Call this function from the training code; it returns the total number of trainable parameters in the network.

def count():
    total_parameters = 0
    for variable in tf.trainable_variables():
        # shape is an array of tf.Dimension
        shape = variable.get_shape()
        variable_parameters = 1
        for dim in shape:
            variable_parameters *= dim.value
        total_parameters += variable_parameters
    return total_parameters

2. Computing FLOPs

def count_flops(graph):
    flops = tf.profiler.profile(graph, options=tf.profiler.ProfileOptionBuilder.float_operation())
    print('FLOPs: {}'.format(flops.total_float_ops))

Both of these methods (parameter count and FLOPs) require import tensorflow as tf at the top of the file before they can be used.
Parts of the above are based on: tensorflow參數及FLOPs估算 (TensorFlow parameter and FLOPs estimation).
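Putting the two together, a minimal sketch that builds a tiny graph and calls the count() and count_flops() functions defined above (TF 1.x assumed; the conv layer sizes are made up):

import tensorflow as tf  # TF 1.x assumed

# Toy graph: one 3x3 conv, 1 input channel, 16 filters
# -> 3*3*1*16 weights + 16 biases = 160 trainable parameters
x = tf.placeholder(tf.float32, [1, 28, 28, 1])
y = tf.layers.conv2d(x, filters=16, kernel_size=3)

print(count())                        # 160
count_flops(tf.get_default_graph())   # prints the profiler's total float-op count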
Copyright notice: this is an original article by the CSDN blogger 「w_xiaomu」, released under the CC 4.0 BY-SA license; please include a link to the original source and this notice when reposting.
Original link: https://blog.csdn.net/w_xiaomu/article/details/95599310

The following collects seven methods excerpted from various sources; they differ only in small details, and all of them produce the same result.

import numpy as np            # used by count2
import tensorflow as tf       # TF 1.x
from functools import reduce  # used by count4 under Python 3

def count1():
    total_parameters = 0
    for variable in tf.trainable_variables():
        # shape is an array of tf.Dimension
        shape = variable.get_shape()
        # print(shape)
        # print(len(shape))
        variable_parameters = 1
        for dim in shape:
            # print(dim)
            variable_parameters *= dim.value
        # print(variable_parameters)
        total_parameters += variable_parameters
    print(total_parameters)
def count2():
    print(np.sum([np.prod(v.get_shape().as_list()) for v in tf.trainable_variables()]))
def get_nb_params_shape(shape):
    '''
    Computes the total number of params for a given shape.
    Works for any rank, e.g. [D,F] or [W,H,C]: returns D*F or W*H*C respectively.
    '''
    nb_params = 1
    for dim in shape:
        nb_params = nb_params*int(dim)
    return nb_params

def count3():
    tot_nb_params = 0
    for trainable_variable in tf.trainable_variables():
        shape = trainable_variable.get_shape()  # e.g [D,F] or [W,H,C]
        current_nb_params = get_nb_params_shape(shape)
        tot_nb_params = tot_nb_params + current_nb_params
    print(tot_nb_params)
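As a quick sanity check of the get_nb_params_shape helper used by count3 (the shape values below are made up):

# A 3x3 conv kernel with 64 input and 128 output channels:
print(get_nb_params_shape([3, 3, 64, 128]))  # 3*3*64*128 = 73728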
def count4():
    size = lambda v: reduce(lambda x, y: x * y, v.get_shape().as_list())
    n = sum(size(v) for v in tf.trainable_variables())
    # print("Model size: %dK" % (n / 1000,))
    print(n)
def count5():
    total_parameters = 0
    # iterating over all variables
    for variable in tf.trainable_variables():
        local_parameters = 1
        shape = variable.get_shape()  # getting shape of a variable
        for i in shape:
            local_parameters *= i.value  # multiplying dimension values
        total_parameters += local_parameters
    print(total_parameters)
def count6():
    total_parameters = 0
    for variable in tf.trainable_variables():
        variable_parameters = 1
        for dim in variable.get_shape():
            variable_parameters *= dim.value
        total_parameters += variable_parameters

    print("Total number of trainable parameters: %d" % total_parameters)
def count7():
    from functools import reduce
    from operator import mul
    num_params = 0
    for variable in tf.trainable_variables():
        shape = variable.get_shape()
        num_params += reduce(mul, [dim.value for dim in shape], 1)
    print(num_params)
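Since all seven functions walk the same tf.trainable_variables() collection, they should all print the same number. A minimal sanity check (TF 1.x assumed; the two dense layers below use made-up sizes):

import tensorflow as tf  # TF 1.x assumed

x = tf.placeholder(tf.float32, [None, 32])
h = tf.layers.dense(x, 64)   # 32*64 + 64 = 2112 parameters
y = tf.layers.dense(h, 10)   # 64*10 + 10 =  650 parameters

for fn in (count1, count2, count3, count4, count5, count6, count7):
    fn()                     # each prints 2762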

Sources:
1. How to count total number of trainable parameters in a tensorflow model?
2. What is the best way to count the total number of parameters in a model in TensorFlow?
3. Number of CNN learnable parameters - Python / TensorFlow
4. tensorflow 獲取模型所有參數總和數量 (TensorFlow: getting the total number of parameters in a model)
