The difference between SAD and SSD

Can someone tell me what the difference is between SAD and SSD? I am talking about the sum of absolute differences and the sum of squared differences.

Why does SSD square each difference before summing? Instead of raising each difference to the second power, I could simply take its absolute value.

What is the squaring good for?
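
For concreteness, here is a minimal sketch of the two metrics as I understand them (NumPy and the function names are my own choices, not any standard API):

    import numpy as np

    def sad(a, b):
        # Sum of absolute differences (L1): sum of |a_i - b_i|.
        d = np.asarray(a, dtype=np.float64) - np.asarray(b, dtype=np.float64)
        return np.abs(d).sum()

    def ssd(a, b):
        # Sum of squared differences (squared L2): sum of (a_i - b_i)^2.
        d = np.asarray(a, dtype=np.float64) - np.asarray(b, dtype=np.float64)
        return (d * d).sum()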

1 answer

Squaring is used to penalize large differences much more heavily. If you have a fairly large error (difference) and you square it, the result becomes far larger relative to the small errors. An optimization method based on the squared value will therefore "try" to get rid of the largest differences (outliers) first.
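
Here is a small numeric sketch of that effect (the error values below are invented for illustration):

    import numpy as np

    errors = np.array([1.0, 1.0, 1.0, 10.0])  # one outlier among small errors

    sad = np.abs(errors).sum()   # 1 + 1 + 1 + 10  = 13
    ssd = (errors ** 2).sum()    # 1 + 1 + 1 + 100 = 103

    # The outlier contributes 10/13 ~ 77% of SAD,
    # but 100/103 ~ 97% of SSD: squaring lets it dominate.
    print(sad, ssd)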

It is also known that squared-error methods are optimal (in the maximum-likelihood sense) when the noise is Gaussian, while absolute-value methods are optimal for Laplacian noise. Equivalently, the value that minimizes the SSD to a set of samples is their mean, while the value that minimizes the SAD is their median, which is why SAD is more robust to outliers.
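
A small sketch of that last point, using an invented sample with one outlier: minimizing SSD over a grid of candidate estimates recovers the mean, while minimizing SAD recovers the median.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # one outlier
    c = np.linspace(0.0, 100.0, 100001)        # candidate estimates

    ssd_cost = ((x[:, None] - c[None, :]) ** 2).sum(axis=0)
    sad_cost = np.abs(x[:, None] - c[None, :]).sum(axis=0)

    print(c[ssd_cost.argmin()], x.mean())       # 22.0 -- pulled toward the outlier
    print(c[sad_cost.argmin()], np.median(x))   # 3.0  -- ignores the outlier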
