I am creating a raster image context using CGBitmapContextCreate with the kCGImageAlphaPremultipliedFirst option.
I made a 5 × 5 test image containing some primary colors (pure red, green, blue, white, black) and some mixed colors (e.g., purple), in combination with various alpha values. Whenever the alpha component is not 255, the color values come out wrong.
I found that I can approximately recover the color with something like:
almostCorrectRed = (wrongRed * 255) / alphaValue;
almostCorrectGreen = (wrongGreen * 255) / alphaValue;
almostCorrectBlue = (wrongBlue * 255) / alphaValue;
(multiplying before dividing, so that integer division doesn't truncate 255 / alphaValue down to 1).
But the problem is that my results are sometimes off by 3 or even more. For example, I get a value of 242 instead of 245 for green, and I'm 100% sure it should be exactly 245; the alpha there is 128.
For the same color at full opacity in the PNG, I read alpha = 255 and green = 245, as it should be.
If alpha is 0, then red, green and blue are all 0 as well. In that case the data is lost entirely, and I cannot determine the pixel's original color.
How can I avoid or undo this alpha premultiplication altogether, so that I can modify the pixels in my image based on the true RGB values, as they were when the image was created in Photoshop? How do I restore the original values of R, G, B and A?