How to get epoch num information from tf.train.string_input_producer

If you are reading files using string_input_producer, for example

 filename_queue = tf.train.string_input_producer(files, num_epochs=num_epochs, shuffle=shuffle) 

how can I get the current epoch number during training? (I want to display this information while training runs.) I tried the following:

 sess.run(tf.get_default_graph().get_tensor_by_name('input_train/input_producer/limit_epochs/epochs:0')) 

always returns the final epoch count (num_epochs), and

 sess.run(tf.get_default_graph().get_tensor_by_name('input_train/input_producer/limit_epochs/CountUpTo:0')) 

increments by 1 each time it is evaluated.

Neither gives the current epoch during training.

Another thing: if I resume training from an existing model, can I recover the epoch information from it?

1 answer

I think the correct approach here is to define a global_step variable that you pass to your optimizer (or you can increment it manually).

TensorFlow Mechanics 101 gives an example:

 global_step = tf.Variable(0, name='global_step', trainable=False)
 train_op = optimizer.minimize(loss, global_step=global_step) 

Now global_step will increase every time train_op runs. Since you know the size of your dataset and your batch size, you can work out which epoch you are in; a sketch follows below.
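For illustration, here is a minimal sketch of that bookkeeping. The dataset_size and batch_size values are placeholders for your own numbers, and the loop assumes the usual queue-runner setup that string_input_producer requires (local variables initialized, queue runners started):

 import tensorflow as tf

 # Placeholder values -- substitute your own dataset size and batch size.
 dataset_size = 50000
 batch_size = 128
 steps_per_epoch = dataset_size // batch_size

 with tf.Session() as sess:
     sess.run(tf.global_variables_initializer())
     sess.run(tf.local_variables_initializer())  # needed when num_epochs is set
     coord = tf.train.Coordinator()
     threads = tf.train.start_queue_runners(sess=sess, coord=coord)
     try:
         while not coord.should_stop():
             _, step = sess.run([train_op, global_step])
             epoch = step // steps_per_epoch  # completed passes over the data
             if step % steps_per_epoch == 0:
                 print('step %d, epoch %d' % (step, epoch))
     except tf.errors.OutOfRangeError:
         pass  # input queue exhausted after num_epochs passes
     finally:
         coord.request_stop()
         coord.join(threads)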

When you save the model using tf.train.Saver(), the global_step variable will also be saved. When you restore the model, you can simply call global_step.eval() to get back the step value you left off at.
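As a sketch of resuming, assuming checkpoints were written to a hypothetical 'checkpoints/' directory and steps_per_epoch is known as above:

 saver = tf.train.Saver()

 with tf.Session() as sess:
     # 'checkpoints/' is a placeholder for your own checkpoint directory.
     ckpt = tf.train.latest_checkpoint('checkpoints/')
     saver.restore(sess, ckpt)
     resumed_step = global_step.eval(session=sess)
     resumed_epoch = resumed_step // steps_per_epoch
     print('resuming from step %d (epoch %d)' % (resumed_step, resumed_epoch))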

Hope this helps!
