I am trying to implement image capture in my iOS application, but the resulting CGImage keeps coming out with distorted colors. Here is the camera preview, showing the correct colors:

The cola is red, all is well.
When I run my snapshot code, I get the following:

The cola is blue... where did that come from?
I tried tweaking some of the parameters, but then I don't get any image at all. Here is my snapshot code:
int bitsPerComponent = 8;
int bitsPerPixel = 32;
int bytesPerRow = [cameraVideo bufRowBytes];
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, [cameraVideo bufDataPtr], [cameraVideo bufWidth] * [cameraVideo bufHeight] * 4, NULL);
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipLast;
CGColorRenderingIntent renderingIntent = kCGRenderingIntentPerceptual;
CGImageRef imageRef = CGImageCreate([cameraVideo bufWidth], [cameraVideo bufHeight], bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
CGColorSpaceRelease(colorSpaceRef);
I'm at a loss, so if anyone can see what I'm doing wrong, please let me know.
Fixed
The camera delivers its pixel buffer in a 32-bit ARGB or BGRA layout, but the original bitmapInfo (kCGImageAlphaNoneSkipLast with the default big-endian byte order) told Core Graphics to read the bytes as RGBX, which swaps the red and blue channels. The fix is to choose a CGBitmapInfo that matches the buffer's actual pixel format. Here is the final code:
if (cameraVideo.ARPixelFormat == kCVPixelFormatType_32ARGB) {
    bitmapInfo = kCGBitmapByteOrder32Big | kCGImageAlphaNoneSkipFirst;
} else {
    bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst;
}
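For completeness, here is a minimal sketch of the whole snapshot path with that fix applied. It assumes cameraVideo exposes the same accessors as in the question (bufDataPtr, bufWidth, bufHeight, bufRowBytes, ARPixelFormat); the snapshotImage method name is mine, and I size the provider as bytesPerRow * height rather than width * height * 4, since the row stride may include padding beyond width * 4.

#import <UIKit/UIKit.h>
#import <CoreVideo/CoreVideo.h>

// Sketch only: cameraVideo and its buffer accessors are assumed to match the question.
- (UIImage *)snapshotImage
{
    size_t width       = [cameraVideo bufWidth];
    size_t height      = [cameraVideo bufHeight];
    size_t bytesPerRow = [cameraVideo bufRowBytes];

    // NULL release callback: the raw buffer must stay valid for the
    // lifetime of the CGImage (and any UIImage wrapping it).
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL,
                                                              [cameraVideo bufDataPtr],
                                                              bytesPerRow * height,
                                                              NULL);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Match CGBitmapInfo to the buffer's actual byte order. 32BGRA
    // (the common iOS camera format) is XRGB read little-endian, so it
    // needs kCGBitmapByteOrder32Little; 32ARGB is big-endian XRGB.
    CGBitmapInfo bitmapInfo;
    if (cameraVideo.ARPixelFormat == kCVPixelFormatType_32ARGB) {
        bitmapInfo = kCGBitmapByteOrder32Big | kCGImageAlphaNoneSkipFirst;
    } else { // kCVPixelFormatType_32BGRA
        bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst;
    }

    CGImageRef imageRef = CGImageCreate(width, height,
                                        8,            // bits per component
                                        32,           // bits per pixel
                                        bytesPerRow,
                                        colorSpace,
                                        bitmapInfo,
                                        provider,
                                        NULL,         // decode array
                                        NO,           // should interpolate
                                        kCGRenderingIntentPerceptual);

    UIImage *image = [UIImage imageWithCGImage:imageRef];

    // Core Foundation objects are not managed by ARC; release them explicitly.
    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);

    return image;
}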