The best way to do this is with an AVCaptureSession object. I am doing exactly what you are talking about in my free "Live Effects Cam" application.
There are several code examples on the Internet that will help you implement this. Here is a snippet that might get you started:
- (void) activateCameraFeed
{
    videoSettings = nil;

#if USE_32BGRA
    // Request 32-bit BGRA pixel buffers so the frames are easy to process directly.
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA];
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil];
#endif

    // Serial queue on which the sample buffer delegate callbacks are delivered.
    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL);

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES];
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    [captureVideoOutput setVideoSettings:videoSettings];
    [captureVideoOutput setMinFrameDuration:kCMTimeZero]; // kCMTimeZero = no cap on the frame rate

    dispatch_release(videoDataOutputQueue); // the output retains the queue

    // ... (remainder of the original method omitted in this excerpt)
}
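Since the snippet above is truncated, here is a rough, self-contained sketch of the rest of a typical setup: create the session, add the back camera as input, attach an AVCaptureVideoDataOutput that delivers BGRA frames, and receive them in the sample buffer delegate callback. The class name FrameCaptureController, the queue label, and the captureSession property are my own placeholders (assuming ARC), not code from Live Effects Cam:

#import <AVFoundation/AVFoundation.h>

@interface FrameCaptureController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *captureSession;
- (void)startCaptureSession;
@end

@implementation FrameCaptureController

- (void)startCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Back camera as the input.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (cameraInput && [session canAddInput:cameraInput]) {
        [session addInput:cameraInput];
    }

    // Video data output that delivers BGRA frames to a serial queue.
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoOutput.alwaysDiscardsLateVideoFrames = YES;
    videoOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey :
                                   @(kCVPixelFormatType_32BGRA) };
    dispatch_queue_t frameQueue = dispatch_queue_create("com.example.frameQueue", NULL);
    [videoOutput setSampleBufferDelegate:self queue:frameQueue];
    if ([session canAddOutput:videoOutput]) {
        [session addOutput:videoOutput];
    }

    // Begin delivering frames to the delegate callback below.
    [session startRunning];
    self.captureSession = session;
}

// Called once per captured frame on frameQueue; do per-frame processing here.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    NSLog(@"Got frame %zu x %zu", CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer));
    // ... process pixelBuffer (e.g. wrap it in a CIImage or copy its bytes) ...
}

@end

Note that on current iOS versions you also need an NSCameraUsageDescription entry in Info.plist before the camera can be accessed.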