This is not a solution, just a more mathematical description of what you are trying to achieve (without judging whether it is the right approach):
Since you round all numbers to x decimal places, we can treat them as integers (just multiply them by 10^x).
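For example, a one-line sketch in Python (the value 12.345 and x = 3 are invented for illustration; using round() rather than int() guards against the float product landing a hair below the intended integer):

```
x = 3                      # number of decimal places you round to
a = round(12.345 * 10**x)  # 12345, now an exact integer
```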
Now you are trying to solve the following problem:
Given the matrix

A11+Adj11  A12+Adj12  ...  A1n+Adj1n
A21+Adj21  A22+Adj22  ...  A2n+Adj2n
A31+Adj31  A32+Adj32  ...  A3n+Adj3n
   ...        ...     ...     ...
Am1+Adjm1  Am2+Adjm2  ...  Amn+Adjmn
Where A11..Amn are constant integers,
find the integers Adj11 ... Adjmn that
Minimize Sum(abs(Adjxy))
(or, if you prefer: Minimize Sum((Adjxy)^2))
Subject to:
- for each row m: Adjm1+Adjm2+...+Adjmn = -(Am1+Am2+...+Amn)
- for each col n: Adj1n+Adj2n+...+Adjmn = -(A1n+A2n+...+Amn)
This is an integer programming problem with m*n variables and m+n constraints. The function you are trying to minimize is not linear, although the absolute value can be linearized in the standard way, as sketched below.
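If you want to try it with an off-the-shelf solver anyway, the usual trick for the abs objective is to introduce auxiliary variables txy >= 0 with txy >= Adjxy and txy >= -Adjxy and minimize Sum(txy); at the optimum, txy = abs(Adjxy), and the whole model becomes a plain integer linear program. Here is a minimal sketch using Python with the PuLP library; the 3x3 matrix A is invented for illustration, and in your problem it would hold your scaled-up rounded values:

```
import pulp

# A: rounded values already scaled to integers by 10^x.
# This 3x3 matrix is a made-up example; its row and column sums play the
# role of the residues you want the adjustments to cancel.
A = [[ 3, -1,  2],
     [-2,  4, -1],
     [ 1, -2,  0]]
m, n = len(A), len(A[0])

prob = pulp.LpProblem("rounding_adjustment", pulp.LpMinimize)

# Adjxy: the integer adjustment for cell (x, y).
adj = [[pulp.LpVariable(f"adj_{i}_{j}", cat="Integer") for j in range(n)]
       for i in range(m)]
# txy: auxiliary variable that equals abs(adj) at the optimum.
t = [[pulp.LpVariable(f"t_{i}_{j}", lowBound=0) for j in range(n)]
     for i in range(m)]

# Objective: Minimize Sum(abs(Adjxy)), linearized via t >= adj and t >= -adj.
prob += pulp.lpSum(t[i][j] for i in range(m) for j in range(n))
for i in range(m):
    for j in range(n):
        prob += t[i][j] >= adj[i][j]
        prob += t[i][j] >= -adj[i][j]

# Row constraints: Adjm1+Adjm2+...+Adjmn = -(Am1+Am2+...+Amn).
for i in range(m):
    prob += pulp.lpSum(adj[i]) == -sum(A[i])

# Column constraints: Adj1n+Adj2n+...+Adjmn = -(A1n+A2n+...+Amn).
for j in range(n):
    prob += pulp.lpSum(adj[i][j] for i in range(m)) == -sum(A[i][j] for i in range(m))

prob.solve()
print([[int(adj[i][j].value()) for j in range(n)] for i in range(m)])
```

By default PuLP calls its bundled CBC solver; any MILP solver it supports should handle a model of this shape the same way.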
I am afraid that this problem is far from trivial. I believe that you should post it at https://math.stackexchange.com/