Adaptive Threshold CIKernel / CIFilter iOS

I have been searching everywhere for a kernel that performs adaptive thresholding on iOS. Unfortunately, I do not understand the kernel language or its logic. Below is a routine I found that applies a fixed threshold value ( https://gist.github.com/xhruso00/a3f8a9c8ae7e33b8b23d ):

 static NSString * const kKernelSource = @"kernel vec4 thresholdKernel(sampler image)\n"
     "{\n"
     "    float inputThreshold = 0.05;\n"
     "    float pass = 1.0;\n"
     "    float fail = 0.0;\n"
     "    const vec4 vec_Y = vec4( 0.299, 0.587, 0.114, 0.0 );\n"
     "    vec4 src = unpremultiply( sample(image, samplerCoord(image)) );\n"
     "    float Y = dot( src, vec_Y );\n"
     "    src.rgb = vec3( compare( Y - inputThreshold, fail, pass) );\n"
     "    return premultiply(src);\n"
     "}";

Is it possible to rewrite this into the core of an adaptive threshold? The image I supply it is already converted to black and white and already blurred. Are there any resources you could point me to? I would like to stick with Core Image, as my whole stack is built around it.
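For context, adaptive (local) thresholding compares each pixel's luminance against the mean luminance of its neighbourhood rather than a single global cutoff. A minimal CPU-side sketch of the per-pixel rule in Swift (the function and parameter names here are illustrative, not part of any Core Image API):

```swift
// Per-pixel adaptive threshold decision: a pixel becomes white (1.0)
// when its luma is at least the local neighbourhood mean (e.g. the
// output of a box blur), allowing a small bias to absorb noise.
func adaptiveThreshold(luma: Double, localMean: Double, bias: Double) -> Double {
    return (luma + bias) >= localMean ? 1.0 : 0.0
}

// A dark letter stroke on a locally bright patch of page stays black,
// even where a single global threshold would fail in shaded regions.
let letter = adaptiveThreshold(luma: 0.30, localMean: 0.60, bias: 0.001) // 0.0 (black)
let paper  = adaptiveThreshold(luma: 0.65, localMean: 0.60, bias: 0.001) // 1.0 (white)
```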

Edit: the best example of what I am trying to achieve is GPUImage's GPUImageAdaptiveThresholdFilter - https://github.com/BradLarson/GPUImage/blob/c5f0914152419437869c35e29858773b1a06083c/framework/Source/GPUImageAdaptiveThresholdFilter.m

2 answers

Simon's filter is the right approach to achieve the desired effect; however, you need to change a couple of things.

First of all, swap the order of imageLuma and thresholdLuma, since we want the black letters to remain black, not the other way around. In addition, you should add a small constant (I chose 0.001) to remove noise.

 var thresholdKernel = CIColorKernel(string:
     "kernel vec4 thresholdFilter(__sample image, __sample threshold)" +
     "{" +
     "   float imageLuma = dot(image.rgb, vec3(0.2126, 0.7152, 0.0722));" +
     "   float thresholdLuma = dot(threshold.rgb, vec3(0.2126, 0.7152, 0.0722));" +
     "   return vec4(vec3(step(thresholdLuma, imageLuma + 0.001)), 1.0);" +
     "}"
 )

 override var outputImage: CIImage! {
     guard let inputImage = inputImage,
         let thresholdKernel = thresholdKernel else {
         return nil
     }

     // The blur radius acts as the block size of the adaptive threshold.
     let blurred = inputImage.applyingFilter("CIBoxBlur",
                                             withInputParameters: [kCIInputRadiusKey: 5])
     let extent = inputImage.extent
     let arguments = [inputImage, blurred]
     return thresholdKernel.apply(withExtent: extent, arguments: arguments)
 }

And this is what you get, using only Apple Core Image, without the need to install any external libraries :)

[image: thresholded result]

Of course, you can play around a bit with the constant and block size values.
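If you want to tune those values without editing the GLSL string by hand, one option (a sketch of my own, not part of the answer above) is to interpolate the noise constant when building the kernel source; the blur radius is already a Swift-side parameter passed to CIBoxBlur:

```swift
// Build the threshold kernel source with the noise constant injected,
// so it can be tuned from Swift without hand-editing the GLSL.
// `bias` is the small constant added to imageLuma to suppress noise.
func thresholdKernelSource(bias: Double) -> String {
    return "kernel vec4 thresholdFilter(__sample image, __sample threshold)" +
           "{" +
           "   float imageLuma = dot(image.rgb, vec3(0.2126, 0.7152, 0.0722));" +
           "   float thresholdLuma = dot(threshold.rgb, vec3(0.2126, 0.7152, 0.0722));" +
           "   return vec4(vec3(step(thresholdLuma, imageLuma + \(bias))), 1.0);" +
           "}"
}
```

You would then pass the result to CIColorKernel(string:) as before, rebuilding the kernel whenever the constant changes.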


Here is how it looks: I used Core Image's CIBoxBlur (although dedicated convolution filters may be faster) and passed its output to my existing threshold filter.

 class AdaptiveThresholdFilter: CIFilter {
     var inputImage: CIImage?

     var thresholdKernel = CIColorKernel(string:
         "kernel vec4 thresholdFilter(__sample image, __sample threshold)" +
         "{" +
         "   float imageLuma = dot(image.rgb, vec3(0.2126, 0.7152, 0.0722));" +
         "   float thresholdLuma = dot(threshold.rgb, vec3(0.2126, 0.7152, 0.0722));" +
         "   return vec4(vec3(step(imageLuma, thresholdLuma)), 1.0);" +
         "}"
     )

     override var outputImage: CIImage! {
         guard let inputImage = inputImage,
             thresholdKernel = thresholdKernel else {
             return nil
         }

         let blurred = inputImage.imageByApplyingFilter("CIBoxBlur",
                                                        withInputParameters: [kCIInputRadiusKey: 9])
         let extent = inputImage.extent
         let arguments = [inputImage, blurred]
         return thresholdKernel.applyWithExtent(extent, arguments: arguments)
     }
 }

I found this image of a shaded page, and using this code:

 let page = CIImage(image: UIImage(named: "son1.gif")!)
 let filter = AdaptiveThresholdFilter()
 filter.inputImage = page
 let final = filter.outputImage

I got this result:

[image: adaptive threshold result]

Hooray!

Simon

