CIAreaHistogram inputScale factor

I am creating an application that uses the CIAreaHistogram Core Image filter. For testing, I use an inputCount value (number of buckets) of 10 and an inputScale value of 1.
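For reference, a minimal sketch of the filter setup (the helper and variable names here are just placeholders, not code from my actual project):

    #import <QuartzCore/QuartzCore.h>

    // Sketch of the CIAreaHistogram setup described above; helper and
    // variable names are placeholders.
    static CIImage *HistogramImage(CIImage *sourceImage)
    {
        CIFilter *histogram = [CIFilter filterWithName:@"CIAreaHistogram"];
        [histogram setValue:sourceImage forKey:kCIInputImageKey];
        [histogram setValue:[CIVector vectorWithCGRect:sourceImage.extent]
                     forKey:@"inputExtent"];
        [histogram setValue:@10 forKey:@"inputCount"];   // 10 buckets for testing
        [histogram setValue:@1.0 forKey:@"inputScale"];
        return histogram.outputImage;                     // an inputCount x 1 image
    }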

I get a CIImage for the histogram itself, which I then run through a custom kernel (see the end of the post) to set the alpha values to 1 (since otherwise the alpha values from the histogram calculation are premultiplied), and then convert to an NSBitmapImageRep.

Then I scan the bitmap rep's buffer and print the RGB values (skipping the alpha values). However, when I do this, the R, G, and B values across the 10 buckets do not each add up to 255.
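The scan loop looks roughly like this (a sketch assuming a packed 8-bit-per-channel RGBA buffer; the names are placeholders):

    #import <AppKit/AppKit.h>

    // Rough sketch of the buffer scan, assuming an 8-bit RGBA pixel layout.
    static void LogHistogramBuckets(NSBitmapImageRep *bitmapRep, NSInteger bucketCount)
    {
        unsigned char *buf = [bitmapRep bitmapData];
        NSInteger bytesPerPixel = [bitmapRep bitsPerPixel] / 8;
        for (NSInteger i = 0; i < bucketCount; i++) {
            unsigned char *pixel = buf + i * bytesPerPixel;
            NSLog(@"RGB: %d %d %d", pixel[0], pixel[1], pixel[2]); // alpha skipped
        }
    }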

For example, with a completely black image, I apply the histogram filter and then my custom kernel and get the following output:

    RGB: 255 255 255
    RGB: 0 0 0
    RGB: 0 0 0
    RGB: 0 0 0
    RGB: 0 0 0
    RGB: 0 0 0
    RGB: 0 0 0
    RGB: 0 0 0
    RGB: 0 0 0
    RGB: 0 0 0

This is what I expect, since all the pixels are black, so everything falls into the first bucket. However, if I run the same algorithm with a color image, I get the following:

    RGB: 98 76 81
    RGB: 164 97 87
    RGB: 136 161 69
    RGB: 100 156 135
    RGB: 80 85 185
    RGB: 43 34 45
    RGB: 31 19 8
    RGB: 19 7 3
    RGB: 12 5 2
    RGB: 16 11 11

Add up the values for R, G, and B: they do not sum to 255. This causes problems because I need to compare two of these histograms, and my algorithm expects the sums to be between 0 and 255. I could obviously scale these values, but I want to avoid that extra step for performance reasons.

I noticed something interesting that might shed some light on why this is happening. In my custom kernel, I just set the alpha value to 1. I tried a second kernel (see the end of the post) that sets all pixels to red, so obviously the green and blue values should be zero. However, I get this result when checking the values in the resulting image:

    RGB: 255 43 25

But I just set G and B to zero! This seems to be part of the problem, and it points to color management. Since I set the values explicitly in the kernel, there is only one block of code where this could happen: the conversion from the filter's CIImage to an NSBitmapImageRep:

    NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCIImage:kernelOutput];
    unsigned char *buf = [bitmapRep bitmapData];

As soon as I set the pixels to RGB 255 0 0, execute these lines, and read the buffer, the RGB values come out as 255 43 25. I also tried setting the color space of the original CGImageRef, on which the whole workflow is based, to kCGColorSpaceGenericRGB, thinking the color profile might be interfering, but to no avail.
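That color-space experiment was along these lines (a sketch; originalImage stands in for the CGImageRef my workflow starts from):

    #import <CoreGraphics/CoreGraphics.h>

    // Sketch of the color-space attempt described above; the caller owns the
    // returned image. originalImage is a placeholder for the workflow's CGImageRef.
    static CGImageRef CreateGenericRGBCopy(CGImageRef originalImage)
    {
        CGColorSpaceRef genericRGB = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
        CGImageRef copy = CGImageCreateCopyWithColorSpace(originalImage, genericRGB);
        CGColorSpaceRelease(genericRGB);
        return copy;
    }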

Can someone tell me why the CIFilter kernel behaves this way, and how I can solve it?

As mentioned above, here are copies of the CIFilter kernel functions I use. First, the one that sets the alpha value to 1:

    kernel vec4 adjustHistogram(sampler src)
    {
        vec4 pix = sample(src, destCoord());
        pix.a = 1.0;
        return pix;
    }

And next, the one that sets all pixels to RGB 255 0 0 but ends up as 255 43 25 when converted to an NSBitmapImageRep:

    kernel vec4 adjustHistogram(sampler src)
    {
        vec4 pix = sample(src, destCoord());
        pix.r = 1.0;
        pix.g = 0.0;
        pix.b = 0.0;
        pix.a = 1.0;
        return pix;
    }

Thanks in advance for your help.

1 answer

You only need one line of code to create and display a histogram when using a custom Core Image filter (or whenever you create a new CIImage object or replace an existing one):

    return [CIFilter filterWithName:@"CIHistogramDisplayFilter"
                      keysAndValues:kCIInputImageKey, self.inputImage,
                                    @"inputHeight", @100.0,
                                    @"inputHighLimit", @1.0,
                                    @"inputLowLimit", @0.0, nil].outputImage;
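Chained onto the histogram data, that might look like this (a sketch; the helper name and parameter values are illustrative, not from the code above):

    #import <QuartzCore/QuartzCore.h>

    // Sketch of chaining CIAreaHistogram into CIHistogramDisplayFilter.
    static CIImage *HistogramDisplayImage(CIImage *sourceImage)
    {
        CIImage *histogramData =
            [CIFilter filterWithName:@"CIAreaHistogram"
                       keysAndValues:kCIInputImageKey, sourceImage,
                                     @"inputExtent", [CIVector vectorWithCGRect:sourceImage.extent],
                                     @"inputCount", @256,
                                     @"inputScale", @1.0, nil].outputImage;

        return [CIFilter filterWithName:@"CIHistogramDisplayFilter"
                          keysAndValues:kCIInputImageKey, histogramData,
                                        @"inputHeight", @100.0,
                                        @"inputHighLimit", @1.0,
                                        @"inputLowLimit", @0.0, nil].outputImage;
    }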
