I'm going crazy with this one - I've looked everywhere and tried anything and everything I can think of.
I'm making an iPhone app that uses AVFoundation - specifically AVCaptureSession - to capture video with the iPhone camera. I need to superimpose my own image on top of the video stream and have it included in the recording.
So far I have the AVCaptureSession set up and working: I can display the preview, access each frame, save it as a UIImage, and draw the overlay image onto it. I then convert this new UIImage to a CVPixelBufferRef. And to check that the pixel buffer works, I converted it back to a UIImage and it displays the image fine.
The problem starts when I try to convert the CVPixelBufferRef into a CMSampleBufferRef so I can append it to the AVAssetWriterInput. CMSampleBufferCreateForImageBuffer always gives me back NULL when I try to create the sample buffer.
Here is my captureOutput:didOutputSampleBuffer:fromConnection: delegate method:
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *botImage = [self imageFromSampleBuffer:sampleBuffer];
        UIImage *wheel = [self imageFromView:wheelView];
        UIImage *finalImage = [self overlaidImage:botImage :wheel];
        //[previewImage setImage:finalImage]; <- works -- the images are being merged into one UIImage

        CVPixelBufferRef pixelBuffer = NULL;
        CGImageRef cgImage = CGImageCreateCopy(finalImage.CGImage);
        CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

        int status = CVPixelBufferCreateWithBytes(NULL,
                                                  self.view.bounds.size.width,
                                                  self.view.bounds.size.height,
                                                  kCVPixelFormatType_32BGRA,
                                                  (void *)CFDataGetBytePtr(image),
                                                  CGImageGetBytesPerRow(cgImage),
                                                  NULL, 0, NULL,
                                                  &pixelBuffer);
        if (status == 0) {
            OSStatus result = 0;

            CMVideoFormatDescriptionRef videoInfo = NULL;
            result = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
            NSParameterAssert(result == 0 && videoInfo != NULL);

            CMSampleBufferRef myBuffer = NULL;
            result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                        pixelBuffer, true, NULL, NULL,
                                                        videoInfo, NULL, &myBuffer);
            NSParameterAssert(result == 0 && myBuffer != NULL); // always NULL :S

            NSLog(@"Trying to append");

            if (!CMSampleBufferDataIsReady(myBuffer)) {
                NSLog(@"sampleBuffer data is not ready");
                return;
            }

            if (![assetWriterInput isReadyForMoreMediaData]) {
                NSLog(@"Not ready for data :(");
                return;
            }

            if (![assetWriterInput appendSampleBuffer:myBuffer]) {
                NSLog(@"Failed to append pixel buffer");
            }
        }
    }
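For reference, one likely culprit in the code above is the NULL passed as the sampleTiming argument: CMSampleBufferCreateForImageBuffer requires a non-NULL CMSampleTimingInfo pointer and fails (leaving the out-buffer NULL) without one. A minimal sketch of that call with timing taken from the incoming frame - this is a guess at the fix, not verified against this exact project:

```objective-c
// Build timing info from the source frame that the composited
// pixel buffer replaces, then pass its address (must not be NULL).
CMSampleTimingInfo timing;
timing.duration              = CMSampleBufferGetDuration(sampleBuffer);
timing.presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
timing.decodeTimeStamp       = CMSampleBufferGetDecodeTimeStamp(sampleBuffer);

CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo);

CMSampleBufferRef myBuffer = NULL;
OSStatus result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                     pixelBuffer,
                                                     true,        // data is ready
                                                     NULL, NULL,  // no makeDataReady callback
                                                     videoInfo,
                                                     &timing,     // non-NULL timing info
                                                     &myBuffer);
```

A second thing worth checking: the width/height passed to CVPixelBufferCreateWithBytes come from self.view.bounds, while bytesPerRow comes from the CGImage; if those dimensions differ, the buffer geometry is inconsistent, so CGImageGetWidth(cgImage)/CGImageGetHeight(cgImage) would be the safer pair.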
Another solution I keep hearing about is to use AVAssetWriterInputPixelBufferAdaptor, which would eliminate the need for the messy CMSampleBufferRef wrapping altogether. However, I have scoured forums and the Apple developer docs and can't find a clear description or example of how to set it up or how to use it. If anyone has a working example they could show me, or can help me with the problem above, please do - I've been working on this non-stop for a week and I'm at the end of my rope.
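In case it helps frame the question, here is my rough understanding of how the adaptor would slot in - a sketch only, assuming an already-configured AVAssetWriterInput (the names assetWriterInput, pixelBufferAdaptor, and the 640x480 dimensions are placeholders):

```objective-c
// One-time setup, before [assetWriter startWriting]: wrap the input
// in an adaptor that accepts CVPixelBufferRefs directly.
NSDictionary *attributes = @{
    (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
    (NSString *)kCVPixelBufferWidthKey           : @(640),
    (NSString *)kCVPixelBufferHeightKey          : @(480)
};
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterInput
                                   sourcePixelBufferAttributes:attributes];

// Per frame, inside captureOutput:didOutputSampleBuffer:fromConnection:
// append the composited pixel buffer stamped with the source frame's
// presentation time - no CMSampleBufferRef wrapper needed.
CMTime frameTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if ([assetWriterInput isReadyForMoreMediaData]) {
    if (![pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                          withPresentationTime:frameTime]) {
        NSLog(@"Failed to append pixel buffer");
    }
}
```

If this is roughly right, my remaining questions are about the sourcePixelBufferAttributes and whether the adaptor's own pixel buffer pool should be used instead of buffers I create myself.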
Let me know if you need any other information.
Thanks in advance,
Michael