From memory, this turns into an eigenvector problem. The distance from a point to your plane is proportional to Ax + By + Cz + D - one way to see this is to note that (A, B, C) is normal to the plane. The constant D is a nuisance, but you can get rid of it by shifting your coordinates so the centroid of the points is at the origin; then the best-fitting plane passes through the origin.
Then you find that you want to minimize SUM_i (X_i . A)^2, where A is a 3-vector. Of course, you can make this arbitrarily small by multiplying all the components of A by some small scalar, so you minimize it subject to a constraint, for example ||A||^2 = 1, which takes care of the proportionality by making A a unit vector. Since (X_i . A)^2 = A' (X_i' X_i) A, you want to minimize A' (SUM_i X_i' X_i) A. So, I think you need the eigenvector of SUM_i X_i' X_i with the smallest eigenvalue.
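A minimal sketch of the procedure in NumPy, assuming the points come as an (n, 3) array (the function name `fit_plane` is just illustrative): center the points to eliminate D, build the scatter matrix SUM_i X_i' X_i, and take the eigenvector of the smallest eigenvalue.

```python
import numpy as np

def fit_plane(points):
    """Total-least-squares plane fit: returns (normal, D) such that
    the plane is normal . x + D = 0 in the original coordinates."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    X = pts - centroid                 # shifting to the centroid removes D
    scatter = X.T @ X                  # SUM_i X_i' X_i, a 3x3 matrix
    eigvals, eigvecs = np.linalg.eigh(scatter)   # eigenvalues ascending
    normal = eigvecs[:, 0]             # eigenvector of smallest eigenvalue
    D = -normal @ centroid             # recover D for the original frame
    return normal, D
```

The unit-norm constraint ||A||^2 = 1 is automatic here, since `eigh` returns orthonormal eigenvectors; note the sign of the normal is arbitrary.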
One of the reasons this is often not used in statistics is that your answer changes if you rescale the units of one coordinate without rescaling the others by the same factor.
Thinking about it, you can check that this all works out at http://en.wikipedia.org/wiki/Total_least_squares