Based on this example of yours,
```python
import tensorflow as tf

v = tf.Variable(0)
c = tf.constant(3)
add = tf.add(v, c)
update = tf.assign(v, add)
mul = tf.mul(add, update)  # renamed tf.multiply in TF >= 1.0

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())  # tf.global_variables_initializer() in TF >= 1.0
    res = sess.run([mul, mul])
    print(res)
```
Output: [9, 9]
You get [9, 9], and that is actually what we asked for. Think of it this way:
At runtime, once mul is taken from the list, the runtime looks up its definition and finds tf.mul(add, update). It now needs the value of add, which leads to tf.add(v, c); that in turn connects to the values of v and c, so add evaluates to 3.
OK, now it needs the update value, which is defined as tf.assign(v, add). It already has add (just computed as 3) and v, so it assigns 3 to v, and 3 is also the value of update.
Now it has values for both add and update, each equal to 3, so the multiplication yields 9 for mul.
Based on the result, I think that for the next element (operation) in the list it simply returns the mul value just computed. I'm not sure whether it repeats the steps, returns the same (cached?) value it just computed for mul because it knows a result is already available, or runs these operations in parallel (one per list element). Maybe @mrry or @YaroslavBulatov can comment on this part?
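The walkthrough above can be sketched as a memoized graph traversal in plain Python. This is purely illustrative: the Graph class and its evaluation order are my assumptions for the sketch, not real TensorFlow internals.

```python
# Illustrative sketch (NOT TensorFlow internals): a single "run" call
# evaluates each op at most once, caching results within that call.
class Graph:
    def __init__(self):
        self.v = 0  # the tf.Variable's persistent state
        self.c = 3  # the tf.constant

    def run(self, fetches):
        cache = {}  # per-run memoization: each op executes once per run

        def evaluate(op):
            if op in cache:
                return cache[op]
            if op == "add":                # tf.add(v, c)
                result = self.v + self.c
            elif op == "update":           # tf.assign(v, add)
                self.v = evaluate("add")   # side effect: mutate v
                result = self.v
            elif op == "mul":              # tf.mul(add, update)
                result = evaluate("add") * evaluate("update")
            cache[op] = result
            return result

        return [evaluate(op) for op in fetches]

g = Graph()
print(g.run(["mul", "mul"]))  # [9, 9]: mul executed once, returned twice
```

Note that in this sketch add is evaluated (and cached) before the assignment mutates v, which matches the 9 we observed; in a real graph the scheduler decides that ordering.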
Quote from @mrry comment:
When you call sess.run([x, y, z]) once, TensorFlow executes each op that those tensors depend on exactly once (assuming there is no tf.while_loop() in your graph). If a tensor appears twice in the list (like mul in your example), TensorFlow will execute it once and return two copies of the result. To run the assignment more than once, you must either call sess.run() multiple times or use tf.while_loop() to put a loop into the graph.
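To see what "call sess.run() several times" would do here, this plain-Python sketch (not TensorFlow code) traces the arithmetic of three separate run calls on the same graph, with v starting at 0 and c = 3:

```python
# Each separate sess.run(update) call re-executes add and applies the
# assignment once, so v grows by c on every call (illustrative sketch).
v, c = 0, 3
results = []
for _ in range(3):   # three separate sess.run(update) calls
    add = v + c      # tf.add(v, c) re-executes on each call
    v = add          # tf.assign(v, add) applies on each call
    results.append(v)
print(results)       # [3, 6, 9]: the assignment runs once per call
```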