Image sharpness accuracy

Is there any solid metric for image sharpness or blur? I have many images captured with various saturation parameters and from different optical systems, and I want to show the user something like a “quality” of focus. To pick the most focused image I use the metric obtained with the Sobel-Tenengrad operator (the sum of high-contrast pixels), but the problem is that different objects produce completely different metric ranges (depending on unknown parameters of the image intensity and of the optical system). I need a metric that can say an image is poorly focused without comparison against a reference image, i.e. label it as a “bad” or “good” focused image on its own.
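For reference, a minimal sketch of the Sobel-Tenengrad measure described above, assuming a grayscale image loaded with OpenCV; the `threshold` parameter and the function name are my own illustrative choices:

```python
import cv2
import numpy as np

def tenengrad(gray, threshold=0.0):
    """Sobel-Tenengrad focus measure: sum of squared gradient magnitudes
    above a threshold (the default threshold of 0 is an assumption)."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    mag2 = gx ** 2 + gy ** 2
    return float(np.sum(mag2[mag2 > threshold]))

# Example: rank candidate images and keep the one with the highest score.
# paths = ["shot_1.png", "shot_2.png"]
# scores = [tenengrad(cv2.imread(p, cv2.IMREAD_GRAYSCALE)) for p in paths]
# best = paths[int(np.argmax(scores))]
```

As noted above, the raw score is only comparable between images of the same scene taken with the same optics.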

Tags: algorithm, image, metric

You can estimate the acutance of the image by computing the average magnitude of a gradient filter over the whole image.

See https://stackoverflow.com/a/312625/16/161365/ ... for a similar question.
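A minimal sketch of that idea, assuming a grayscale image as a NumPy array (the function name is mine):

```python
import numpy as np

def acutance(gray):
    """Rough acutance estimate: mean gradient magnitude of the image."""
    gy, gx = np.gradient(gray.astype(np.float64))
    return float(np.mean(np.hypot(gx, gy)))
```

Like the Tenengrad score, this still varies with scene content, so it is most useful for comparing images of the same subject.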


Autofocus is an interesting problem in itself, and evaluating sharpness from arbitrary images is another level of difficulty.

For sharpness estimation, I suggest this article from Cornell. Their conclusion was that the variance metric provided the best focus estimate for a given image. And it doesn’t hurt that it is really easy to compute!
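A minimal sketch of the variance metric, assuming a grayscale image as a NumPy array (the function name is my own):

```python
import numpy as np

def variance_focus(gray):
    """Grey-level variance of the image; higher usually means sharper."""
    g = gray.astype(np.float64)
    return float(np.mean((g - g.mean()) ** 2))  # equivalent to g.var()
```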

To get a consistent metric across different images, you need a way to normalize. The metric could be in units of variance per pixel. You could take advantage of the fact that lack of focus gives an upper bound on the variance, so look for clustering near the maximum of the local variance.
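One possible reading of that suggestion, as a hedged sketch: compute a per-pixel local variance map and normalize the global score by the largest local variance observed, so the result is roughly comparable between images. The window size and the normalization scheme are my assumptions, not part of the answer:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance_map(gray, size=7):
    """Per-pixel variance over a size x size window (window size assumed)."""
    g = gray.astype(np.float64)
    mean = uniform_filter(g, size)
    mean_sq = uniform_filter(g * g, size)
    return np.maximum(mean_sq - mean * mean, 0.0)

def normalized_focus(gray, size=7):
    """Mean local variance divided by its peak, giving a score in [0, 1]."""
    var_map = local_variance_map(gray, size)
    peak = var_map.max()
    return float(var_map.mean() / peak) if peak > 0 else 0.0
```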


You need a no-reference (blind) sharpness metric, for example:

