Sharpness Detection

I am looking for a framework that helps determine the sharpness of a photo. I read this post, which describes the methodology, but I would rather use an existing library than get my hands dirty.

Apple's Core Image documentation says:

Core Image can analyze the quality of an image and provide a set of filters with optimal settings for adjusting such things as hue, contrast, and tone color, and for correcting for flash artifacts such as red eye. It does all this with one method call on your part.

How can I perform an "image quality analysis"? I would like to see sample code.

3 answers

We did this with the GPUImage framework, calculating brightness and sharpness; here are some snippets that may help you:

-(BOOL)calculateBrightness:(UIImage *)image {
    float result = 0;
    int i = 0;
    for (int y = 0; y < image.size.height; y++) {
        for (int x = 0; x < image.size.width; x++) {
            UIColor *color = [self colorAt:image atX:x andY:y];
            const CGFloat *colors = CGColorGetComponents(color.CGColor);
            float r = colors[0];
            float g = colors[1];
            float b = colors[2];
            // Rec. 601 luma
            result += 0.299 * r + 0.587 * g + 0.114 * b;
            i++;
        }
    }
    float brightness = result / (float)i;
    NSLog(@"Image Brightness : %f", brightness);
    if (brightness > 0.8 || brightness < 0.3) {
        return NO;
    }
    return YES;
}
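The per-pixel `UIColor` round-trip above is the expensive part. A minimal C sketch of the same Rec. 601 luma average over a raw RGBA byte buffer (an assumption on my part: it presumes the caller has already extracted the bitmap, e.g. via `CGBitmapContextCreate`; the function name is mine):

```c
#include <stddef.h>
#include <stdint.h>

/* Mean Rec. 601 luma of an RGBA8888 buffer, normalised to 0..1.
   Sketch only: iterates raw bytes instead of building a UIColor
   per pixel, which is what makes the loop above so slow. */
static double mean_luma_rgba8888(const uint8_t *pixels, size_t width, size_t height)
{
    double sum = 0.0;
    for (size_t i = 0; i < width * height; i++) {
        const uint8_t *p = pixels + i * 4;               /* R, G, B, A */
        sum += 0.299 * p[0] + 0.587 * p[1] + 0.114 * p[2];
    }
    return sum / (255.0 * (double)(width * height));
}
```

The same 0.8 / 0.3 thresholds from the method above could then be applied to the returned value.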

-(BOOL)calculateSharpness:(UIImage *)image {
    GPUImageCannyEdgeDetectionFilter *filter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
    BinaryImageDistanceTransform *binImagTrans = [[BinaryImageDistanceTransform alloc] init];
    NSArray *resultArray = [binImagTrans twoDimDistanceTransform:
                               [self getBinaryImageAsArray:[filter imageByFilteringImage:image]]];
    if (resultArray == nil) {
        return NO;
    }
    int sum = 0;
    for (int x = 0; x < resultArray.count; x++) {
        NSMutableArray *col = resultArray[x];
        sum += (int)[col valueForKeyPath:@"@max.intValue"];
    }
    // Values under analysis
    NSLog(@"Image Sharpness : %i", sum);
    // tested - bad sharpness is below ca. 26250000
    if (sum < 26250000) {
        return NO;
    }
    return YES;
}
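A much cheaper focus measure than Canny plus a distance transform is the variance of the Laplacian — a standard focus metric, not the code above. Blurry images give a low value. A sketch in plain C over an 8-bit grayscale buffer (threshold and names are my own):

```c
#include <stddef.h>
#include <stdint.h>

/* Variance-of-Laplacian focus measure on an 8-bit grayscale image.
   Assumes w, h >= 3. Higher values mean sharper edges. */
static double laplacian_variance(const uint8_t *gray, size_t w, size_t h)
{
    size_t n = (w - 2) * (h - 2);
    double sum = 0.0, sum_sq = 0.0;
    for (size_t y = 1; y + 1 < h; y++) {
        for (size_t x = 1; x + 1 < w; x++) {
            /* 4-neighbour Laplacian kernel: 0 1 0 / 1 -4 1 / 0 1 0 */
            double v = gray[y * w + (x - 1)] + gray[y * w + (x + 1)]
                     + gray[(y - 1) * w + x] + gray[(y + 1) * w + x]
                     - 4.0 * gray[y * w + x];
            sum += v;
            sum_sq += v * v;
        }
    }
    double mean = sum / (double)n;
    return sum_sq / (double)n - mean * mean;   /* variance */
}
```

A flat image scores exactly 0; any real threshold would have to be calibrated against your camera, just like the 26250000 above.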

But this approach is very slow: it takes approx. 40 seconds for a single image from an iPad camera.


Perhaps the best way to do this is with a polar edge coherence metric:

Baroncini, V., et al. "The polar edge coherence: a quasi blind metric for video quality assessment." EUSIPCO 2009, Glasgow (2009): 564-568.

It works just as well for images as it does for video. It directly measures the sharpness of edges. If you apply a sharpening filter, you can compare the values before and after, and if you oversharpen, the values start to fall again. It requires several convolutions with complex-valued kernels, as described in the paper.


I do not think Core Image will help you. You can use its auto-enhance feature to get a set of suggested filters and values, but there is nothing for sharpness (edge contrast), only overall image contrast. Full list here.

There is Apple's vDSP API, which can perform fast Fourier transforms:

The vDSP API provides mathematical functions for applications such as speech, sound, audio and video processing, diagnostic medical imaging, radar signal processing, seismic analysis and scientific data processing.

You can use it to analyze your image.

For a conceptual overview, see Apple's Using Fourier Transforms guide, and search for vDSP tutorials. There are also related Q&As here on Stack Overflow.

