One very simple approach is to use the decode array of a CGImageRef, but this can only remap the displayed range (no gamma adjustment, etc.).
const CGFloat decode[6] = {blackPoint, whitePoint, blackPoint, whitePoint, blackPoint, whitePoint};

decodedImage = CGImageCreate(CGImageGetWidth(origImage),
                             CGImageGetHeight(origImage),
                             CGImageGetBitsPerComponent(origImage),
                             CGImageGetBitsPerPixel(origImage),
                             CGImageGetBytesPerRow(origImage),
                             CGImageGetColorSpace(origImage),
                             CGImageGetBitmapInfo(origImage),
                             CGImageGetDataProvider(origImage),
                             decode,
                             YES,
                             CGImageGetRenderingIntent(origImage));
Here whitePoint is a float between 0.0 and 1.0 that determines which brightness is displayed as pure white in the output, and blackPoint, likewise a float, determines which brightness is displayed as pure black.
The elements of the decode array correspond to the components of the color space, so this code will only work for RGB images. You can set the components to different white and black values to create simple color grading.
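For example, per-channel values in the decode array might look like the sketch below (the specific numbers are purely illustrative, not computed from an image):

// Each pair gives the black/white decode values for one component
// of an RGB image: red, green, blue. The numbers are only examples.
const CGFloat gradeDecode[6] = {
    0.05, 0.95,   // red
    0.00, 1.00,   // green
    0.10, 1.00    // blue
};
// Pass gradeDecode as the decode parameter of the CGImageCreate call above.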
You can calculate whitePoint and blackPoint with the following function (it works on overall brightness only, without per-channel color correction):
void CalculateAutocorretionValues(CGImageRef image, CGFloat *whitePoint, CGFloat *blackPoint)
{
    // Draw the image into a small 100x100 RGB bitmap to sample its pixels.
    UInt8 *imageData = malloc(100 * 100 * 4);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(imageData, 100, 100, 8, 4 * 100,
                                             colorSpace, kCGImageAlphaNoneSkipLast);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(ctx, CGRectMake(0, 0, 100, 100), image);

    // Build a brightness histogram from the averaged RGB values.
    int histogramm[256];
    bzero(histogramm, 256 * sizeof(int));
    for (int i = 0; i < 100 * 100 * 4; i += 4) {
        UInt8 value = (imageData[i] + imageData[i+1] + imageData[i+2]) / 3;
        histogramm[value]++;
    }
    CGContextRelease(ctx);
    free(imageData);

    int black = 0;
    int counter = 0;