The standard way to save and restore variables in TensorFlow is to use a tf.train.Saver object. By default it saves all of the variables in your graph (i.e. the result of tf.all_variables()), but you can save a subset of them by passing the optional var_list argument to the tf.train.Saver constructor:
weights = {
    'wc1_0': tf.Variable(tf.random_normal([5, 5, 3, 64])),
    'wc1_1': tf.Variable(tf.random_normal([5, 5, 3, 64]))
}
biases = {
    'bc1_0': tf.Variable(tf.constant(0.0, shape=[64])),
    'bc1_1': tf.Variable(tf.constant(0.0, shape=[64]))
}
Please note: if you pass a dictionary to the tf.train.Saver constructor (for example, the weights and/or biases dictionaries from your question), TensorFlow will use the dictionary key (for example, 'wc1_0') as the name for the corresponding variable in any checkpoint files that it creates or consumes.
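For example, continuing from the snippet above, a Saver built from the two dictionaries would store the variables under the keys 'wc1_0', 'wc1_1', 'bc1_0' and 'bc1_1', whatever their tf.Variable.name happens to be. A minimal sketch (the checkpoint path is just an illustration):

# The dictionary keys become the names under which the variables
# are stored in the checkpoint file.
var_dict = dict(weights)
var_dict.update(biases)
saver = tf.train.Saver(var_list=var_dict)

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    saver.save(sess, '/tmp/model.ckpt')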
By default, or if you pass a list of tf.Variable objects to the tf.train.Saver constructor, TensorFlow will use the tf.Variable.name property of each variable instead.
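For comparison, a sketch of the list form, where each variable ends up in the checkpoint under its own tf.Variable.name (e.g. 'Variable', 'Variable_1', ... unless a name= argument was given when it was created):

# Passing a list: the dictionary keys play no role here, and each
# variable is saved under its tf.Variable.name instead.
saver = tf.train.Saver(var_list=list(weights.values()) + list(biases.values()))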
Passing a dictionary gives you the ability to share checkpoints between models that assign different Variable.name properties to each variable. This detail only matters if you want to use the resulting checkpoints with a different model.
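As a rough sketch of that scenario (the variable name 'conv1/weights' and the checkpoint path are made up for illustration), a second model can reuse the checkpoint written above by choosing dictionary keys that match the names stored in the checkpoint:

# The second model names its variable differently ...
conv1_weights = tf.Variable(tf.random_normal([5, 5, 3, 64]),
                            name='conv1/weights')

# ... but it can still read the old checkpoint, because the dictionary key
# 'wc1_0' matches the name used when the checkpoint was written.
restorer = tf.train.Saver(var_list={'wc1_0': conv1_weights})

with tf.Session() as sess:
    restorer.restore(sess, '/tmp/model.ckpt')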