Tensorflow `set_random_seed` not working

Calling tf.set_random_seed(SEED) does not seem to have any effect, as far as I can tell ...

For example, running the code below several times inside an IPython notebook produces a different output each time:

    import tensorflow as tf

    tf.set_random_seed(42)
    sess = tf.InteractiveSession()
    a = tf.constant([1, 2, 3, 4, 5])
    tf.initialize_all_variables().run()
    a_shuf = tf.random_shuffle(a)
    print(a.eval())
    print(a_shuf.eval())
    sess.close()

If I set the seed explicitly: a_shuf = tf.random_shuffle(a, seed=42), the result is the same after each run. But why should I have to set the seed here if I already call tf.set_random_seed(42)?


Equivalent code using numpy just works:

    import numpy as np

    np.random.seed(42)
    a = [1, 2, 3, 4, 5]
    np.random.shuffle(a)
    print(a)
1 answer

This only sets the graph-level seed. If you execute this snippet several times in a row, the graph changes, and the shuffle ops get different operation-level seeds each time. The details are described in the doc string for set_random_seed.

To get a deterministic a_shuf, you can either

  • call tf.reset_default_graph() between runs, or
  • set the operation-level seed explicitly: a_shuf = tf.random_shuffle(a, seed=42)
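A minimal sketch of the first option: resetting the graph before setting the graph-level seed makes the shuffle land at the same position in a fresh graph each run, so its derived operation-level seed is the same and the output repeats. (This uses tf.compat.v1 so it also runs on TF 2.x; on TF 1.x the same calls exist directly under tf, as in the question.)

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def shuffled_once():
    tf.reset_default_graph()       # fresh graph each run
    tf.set_random_seed(42)         # graph-level seed only
    a = tf.constant([1, 2, 3, 4, 5])
    a_shuf = tf.random_shuffle(a)  # no operation-level seed needed now
    with tf.Session() as sess:
        return sess.run(a_shuf)

first = shuffled_once()
second = shuffled_once()
print(first, second)  # identical permutations across runs
```

Without the reset_default_graph() call, each invocation adds a new shuffle op to the same default graph, and each op is assigned a different operation-level seed, which is exactly the behavior seen in the question.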
