How to change pixel color on the fly in iPhone camera preview window?

I use UIImagePickerController to take photos on the iPhone, and I would like to customize the photo on the fly. It looks like UIImagePickerController lets me customize the shape of the photo on the fly, but I cannot find a way to change its color on the fly, for example converting the entire image to black and white.

Thanks.

3 answers

The best way to do this is with an AVCaptureSession object. I am doing exactly what you are talking about in my free "Live Effects Cam" application.

There are several code samples on the Internet that can help you implement this. Here is an example snippet:

- (void) activateCameraFeed
{
    videoSettings = nil;

#if USE_32BGRA
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA];
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil];
#endif

    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL);

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES];
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    [captureVideoOutput setVideoSettings:videoSettings];
    [captureVideoOutput setMinFrameDuration:kCMTimeZero];

    // AVCaptureVideoDataOutput uses dispatch_retain() & dispatch_release(),
    // so we can dispatch_release() our reference now
    dispatch_release(videoDataOutputQueue);

    if ( useFrontCamera )
    {
        currentCameraDeviceIndex = frontCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationLeftMirrored;
    }
    else
    {
        currentCameraDeviceIndex = backCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationRight;
    }

    selectedCamera = [[AVCaptureDevice devices] objectAtIndex:(NSUInteger)currentCameraDeviceIndex];

    captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:selectedCamera error:nil];

    captureSession = [[AVCaptureSession alloc] init];

    [captureSession beginConfiguration];
    [self setCaptureConfiguration];
    [captureSession addInput:captureVideoInput];
    [captureSession addOutput:captureVideoOutput];
    [captureSession commitConfiguration];

    [captureSession startRunning];
}


// AVCaptureVideoDataOutputSampleBufferDelegate
// AVCaptureAudioDataOutputSampleBufferDelegate
//
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    if ( captureOutput == captureVideoOutput )
    {
        [self performImageCaptureFrom:sampleBuffer];
    }

    [pool drain];
}


- (void) performImageCaptureFrom:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer;

    if ( CMSampleBufferGetNumSamples(sampleBuffer) != 1 )
        return;
    if ( !CMSampleBufferIsValid(sampleBuffer) )
        return;
    if ( !CMSampleBufferDataIsReady(sampleBuffer) )
        return;

    imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if ( CVPixelBufferGetPixelFormatType(imageBuffer) != kCVPixelFormatType_32BGRA )
        return;

    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Work on a copy of the frame so the original buffer can be unlocked.
    size_t bufferSize = bytesPerRow * height;
    uint8_t *tempAddress = malloc( bufferSize );
    memcpy( tempAddress, baseAddress, bufferSize );
    baseAddress = tempAddress;

    //
    // Apply effects to the pixels stored in (uint32_t *)baseAddress
    //
    // example: grayScale( (uint32_t *)baseAddress, width, height );
    // example: sepia( (uint32_t *)baseAddress, width, height );
    //

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = nil;
    if ( cameraDeviceSetting != CameraDeviceSetting640x480 )    // not an iPhone4 or iTouch 5th gen
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    else
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage( newContext );
    CGColorSpaceRelease( colorSpace );
    CGContextRelease( newContext );
    free( tempAddress );

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    if ( newImage == nil )
    {
        return;
    }

    // To be able to display the CGImageRef newImage in your UI you will need
    // to do it like this, because you are running on a different thread here:
    //
    [self performSelectorOnMainThread:@selector(newCameraImageNotification:) withObject:(id)newImage waitUntilDone:YES];
}
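The grayScale() routine referenced in the comments above might look something like this. This is only a minimal sketch, not the code from my app: it assumes 32BGRA pixel data and that every byte in the buffer belongs to a pixel (i.e. bytesPerRow == width * 4; if the buffer has row padding, iterate row by row instead):

void grayScale( uint32_t *pixels, size_t width, size_t height )
{
    // Replace each pixel's B, G and R channels with its luminance,
    // using integer Rec. 601 weights (~0.299 R + 0.587 G + 0.114 B).
    for ( size_t i = 0; i < width * height; i++ )
    {
        uint32_t p = pixels[i];
        uint32_t b = p & 0xFF;
        uint32_t g = ( p >> 8 ) & 0xFF;
        uint32_t r = ( p >> 16 ) & 0xFF;
        uint32_t y = ( r * 77 + g * 151 + b * 28 ) >> 8;
        pixels[i] = ( p & 0xFF000000 ) | ( y << 16 ) | ( y << 8 ) | y;
    }
}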

You can overlay another image on top of the photo and change the blend mode to achieve the black-and-white effect.

Check out Apple's QuartzDemo sample, specifically its demonstration of blending modes.
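As a rough sketch of the idea applied to a still image: drawing the photo and then filling it with a flat gray in kCGBlendModeColor keeps the photo's luminosity but replaces its hue and saturation, which desaturates it. The method name here is just for illustration:

- (UIImage *)grayscaleVersionOfImage:(UIImage *)image
{
    CGRect bounds = CGRectMake( 0, 0, image.size.width, image.size.height );

    UIGraphicsBeginImageContextWithOptions( bounds.size, YES, image.scale );

    // Draw the original photo first.
    [image drawInRect:bounds];

    // Fill with flat gray in kCGBlendModeColor to strip the color
    // while keeping the brightness of each pixel.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode( context, kCGBlendModeColor );
    [[UIColor grayColor] setFill];
    CGContextFillRect( context, bounds );

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}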


Another way to do this is to convert each frame using AVFoundation. I don't have much experience with this, but the WWDC 2010 video "Session 409 - Using the Camera with AVFoundation" and its sample projects should go a long way toward solving your problem.

That is, of course, only if you are using iOS 4 classes.
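As a rough sketch of what that per-frame conversion boils down to, assuming an AVCaptureVideoDataOutput configured for 32BGRA frames (wired up as in the first answer):

// Called for every video frame by AVCaptureVideoDataOutputSampleBufferDelegate.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    uint32_t *pixels = (uint32_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t pixelCount = ( CVPixelBufferGetBytesPerRow(imageBuffer) / 4 )
                      * CVPixelBufferGetHeight(imageBuffer);

    // Desaturate every pixel in place; averaging the three channels is
    // the simplest possible black-and-white conversion.
    for ( size_t i = 0; i < pixelCount; i++ )
    {
        uint32_t p = pixels[i];
        uint32_t gray = ( (p & 0xFF) + ((p >> 8) & 0xFF) + ((p >> 16) & 0xFF) ) / 3;
        pixels[i] = ( p & 0xFF000000 ) | ( gray << 16 ) | ( gray << 8 ) | gray;
    }

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Then build a CGImage from the modified buffer and hand it to the
    // main thread for display, as shown in the first answer.
}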

