It is true that a variable can be used anywhere a tensor can, but the key differences between them are that a variable maintains its state across multiple calls to run(), and that its value can be updated by backpropagation (it can also be saved, restored, etc., according to the documentation).
These differences mean that you should think of a variable as representing your model's learnable parameters (for example, the weights and biases of a neural network), while a tensor represents the data fed to your model and the intermediate representations of that data as it passes through the model.
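As a minimal sketch of this distinction (assuming TensorFlow 2's eager API; the loss, learning rate, and values here are illustrative, not from the original answer): the variable below holds mutable state that a gradient step can update in place, while the plain tensor is immutable input data.

```python
import tensorflow as tf

# A Variable holds mutable state (e.g. a model weight) that persists
# between calls and can be updated during backpropagation.
w = tf.Variable(3.0)

# A plain tensor represents data fed to the model; it is immutable.
x = tf.constant(2.0)

with tf.GradientTape() as tape:
    loss = (w * x - 10.0) ** 2   # toy squared-error loss

grad = tape.gradient(loss, w)    # the gradient flows to the Variable
w.assign_sub(0.1 * grad)         # in-place update: only Variables support this
```

Trying `x.assign_sub(...)` would fail, since a constant tensor has no state to update; this is exactly the parameters-versus-data split described above.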
Avishkar Bhoopchand