If you build a neural network in which every layer has the same number of nodes, every weight matrix is nonsingular, and the activation function is invertible (for example, leaky ReLU), then the network as a whole is invertible.
Such a network is just a composition of matrix multiplications, bias additions, and activation functions. To invert it, apply the inverse of each operation in reverse order: take the output, apply the inverse activation, subtract the last bias, multiply by the inverse of the last weight matrix, then apply the inverse activation again, subtract the previous bias, multiply by the inverse of the second-to-last weight matrix, and so on.
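A minimal sketch of this idea in NumPy, assuming a two-layer network of width 3 with leaky ReLU (slope 0.1 for negative inputs; random Gaussian weight matrices are nonsingular with probability 1). All names here are illustrative, not from any library:

```python
import numpy as np

def leaky_relu(x, alpha=0.1):
    # Invertible activation: strictly increasing everywhere.
    return np.where(x > 0, x, alpha * x)

def leaky_relu_inv(y, alpha=0.1):
    # Exact inverse of leaky_relu: divide negative part by alpha.
    return np.where(y > 0, y, y / alpha)

rng = np.random.default_rng(0)
# Two layers of the same width; random matrices are almost surely nonsingular.
Ws = [rng.standard_normal((3, 3)) for _ in range(2)]
bs = [rng.standard_normal(3) for _ in range(2)]

def forward(x):
    for W, b in zip(Ws, bs):
        x = leaky_relu(W @ x + b)
    return x

def inverse(y):
    # Undo each layer in reverse order: inverse activation,
    # subtract the bias, then solve against the weight matrix.
    for W, b in zip(reversed(Ws), reversed(bs)):
        y = np.linalg.solve(W, leaky_relu_inv(y) - b)
    return y

x = rng.standard_normal(3)
print(np.allclose(inverse(forward(x)), x))  # True
```

Using `np.linalg.solve` instead of explicitly forming the inverse matrix is numerically preferable, but either way this only works because every layer preserves dimension.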
— zenna