I am looking to backpropagate gradients through a singular value decomposition for regularization purposes. PyTorch does not currently support backpropagation through the SVD.
I know that I could write my own custom function that operates on a Variable: it would accept the Variable's .data tensor, apply torch.svd to it, wrap a Variable around the resulting singular values and return it in the forward pass, and in the backward pass apply the corresponding Jacobian matrix to the incoming gradients.
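The custom-function route described above can be sketched with `torch.autograd.Function` (in recent PyTorch versions the Variable/Tensor distinction is gone and `torch.linalg.svd` supersedes `torch.svd`, so this is a sketch under those assumptions, not the internal PyTorch implementation). It backpropagates only through the singular values, using the identity that the gradient of s_i with respect to A is u_i v_i^T:

```python
import torch

class SingularValues(torch.autograd.Function):
    """Returns the singular values S of A = U diag(S) V^T,
    with a hand-written backward pass."""

    @staticmethod
    def forward(ctx, A):
        # Economy-size SVD; U and Vh are needed only for backward.
        U, S, Vh = torch.linalg.svd(A, full_matrices=False)
        ctx.save_for_backward(U, Vh)
        return S

    @staticmethod
    def backward(ctx, grad_S):
        U, Vh = ctx.saved_tensors
        # d s_i / d A = u_i v_i^T  =>  grad_A = U diag(grad_S) V^T
        return U @ torch.diag_embed(grad_S) @ Vh


# Example: a nuclear-norm-style regularizer on the singular values.
A = torch.randn(4, 3, dtype=torch.double, requires_grad=True)
loss = SingularValues.apply(A).sum()
loss.backward()  # A.grad is now U @ Vh
```

Note that this formula assumes distinct singular values; gradients through the full (U, S, V) triple require the considerably more involved SVD Jacobian.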
However, I was wondering whether there is a more elegant (and possibly faster) solution, where I could directly address the "Type Variable doesn't implement stateless method svd" error, call LAPACK myself, etc.?
If someone could point me to the appropriate steps and the source files I should look at, I would be very grateful. I suspect these steps apply similarly to other linear algebra operations that currently have no associated backward method.
Many thanks