In the tutorial below, the learning rate is defined as:
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step, 100000, 0.96, staircase=True)
starter_learning_rate can be changed after the desired number of epochs by defining a function such as:
def initial_learning_rate(epoch):
    if 0 <= epoch < 100:
        return 0.1
    if 100 <= epoch < 200:
        return 0.05
    if 200 <= epoch < 500:
        return 0.001
    return 0.001  # fallback so epochs >= 500 don't return None
You can then set starter_learning_rate inside a for loop (iterating over epochs) as follows:
for epoch in range(epochs):
    starter_learning_rate = initial_learning_rate(epoch)
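One caveat: in graph-mode TensorFlow 1.x, passing a plain Python float to tf.train.exponential_decay bakes that value into the graph, so reassigning the Python variable inside the loop has no effect by itself. Below is a minimal sketch of one way around this, feeding the per-epoch starter rate through a placeholder; the toy loss, the placeholder name starter_lr, and epochs = 500 are assumptions for illustration, not part of the original tutorial:

import tensorflow as tf

# Feed the per-epoch starter rate through a placeholder so the graph
# picks up the new value on every run call.
starter_lr = tf.placeholder(tf.float32, shape=[], name="starter_lr")
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(
    starter_lr, global_step, 100000, 0.96, staircase=True)

# Toy loss purely for illustration (hypothetical model).
w = tf.Variable(5.0)
loss = tf.square(w)
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(
    loss, global_step=global_step)

epochs = 500  # assumed to match the ranges in initial_learning_rate
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(epochs):
        # A real model would also loop over batches here.
        sess.run(train_op,
                 feed_dict={starter_lr: initial_learning_rate(epoch)})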
Note that the global_step variable is not reset when starter_learning_rate changes; the decayed rate is still computed as:
decayed_learning_rate = starter_learning_rate * decay_rate ^ (global_step / decay_steps)
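As a quick sanity check of this formula (the constants come from the exponential_decay call above; global_step = 250000 is an arbitrary value chosen for illustration):

starter_learning_rate = 0.1
decay_rate = 0.96
decay_steps = 100000
global_step = 250000  # arbitrary step for the example

# staircase=True floors the exponent to an integer division
decayed = starter_learning_rate * decay_rate ** (global_step // decay_steps)
print(decayed)  # 0.1 * 0.96**2 = 0.09216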