AVFoundation: Add Text to CMSampleBufferRef Video Frame

I am creating an application using AVFoundation.

Inside the - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection delegate method, just before I call [assetWriterInput appendSampleBuffer:sampleBuffer], I manipulate the pixels in the sample buffer (using its pixel buffer to apply an effect).
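
For context, here is roughly how that delegate method is structured (the effect code itself is omitted; assetWriterInput is my AVAssetWriterInput):

 - (void)captureOutput:(AVCaptureOutput *)captureOutput
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection
 {
     CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
     CVPixelBufferLockBaseAddress(pixelBuffer, 0);
     // ... manipulate the pixel data in place here ...
     CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

     if (assetWriterInput.readyForMoreMediaData) {
         [assetWriterInput appendSampleBuffer:sampleBuffer];
     }
 }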

The client also wants me to insert text (a timestamp and a frame counter) into the frames, but so far I have not found a way to do this.

I tried converting the sample buffer to a UIImage, drawing the text onto the image, and converting the image back to a sample buffer, but then

 CMSampleBufferDataIsReady(sampleBuffer)

fails.

Here are my UIImage category methods:

 + (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
 {
     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
     CVPixelBufferLockBaseAddress(imageBuffer, 0);

     // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage.
     uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
     size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
     size_t width = CVPixelBufferGetWidth(imageBuffer);
     size_t height = CVPixelBufferGetHeight(imageBuffer);
     CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
     CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
     CGImageRef newImage = CGBitmapContextCreateImage(newContext);
     CGContextRelease(newContext);
     CGColorSpaceRelease(colorSpace);
     CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

     UIImage *newUIImage = [UIImage imageWithCGImage:newImage];
     CGImageRelease(newImage);
     return newUIImage;
 }

and

 - (CMSampleBufferRef)cmSampleBuffer
 {
     CGImageRef image = self.CGImage;
     NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                              [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                              nil];

     // Create an ARGB pixel buffer and draw the image into it.
     CVPixelBufferRef pxbuffer = NULL;
     CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                           self.size.width, self.size.height,
                                           kCVPixelFormatType_32ARGB,
                                           (__bridge CFDictionaryRef)options, &pxbuffer);
     NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

     CVPixelBufferLockBaseAddress(pxbuffer, 0);
     void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
     NSParameterAssert(pxdata != NULL);

     CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
     CGContextRef context = CGBitmapContextCreate(pxdata, self.size.width, self.size.height,
                                                  8, 4 * self.size.width, rgbColorSpace,
                                                  kCGImageAlphaNoneSkipFirst);
     NSParameterAssert(context);
     CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
     CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
     CGColorSpaceRelease(rgbColorSpace);
     CGContextRelease(context);
     CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

     // Wrap the pixel buffer in a sample buffer; note that videoInfo is still NULL
     // and no timing info is supplied.
     CMVideoFormatDescriptionRef videoInfo = NULL;
     CMSampleBufferRef sampleBuffer = NULL;
     CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pxbuffer, true,
                                        NULL, NULL, videoInfo, NULL, &sampleBuffer);
     return sampleBuffer;
 }
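
I suspect the NULL videoInfo and missing timing info in cmSampleBuffer may be related to the failure; a sketch of what supplying them might look like (not my working code; presentationTime is a placeholder for the original buffer's timestamp):

 // Hypothetical sketch: create a real format description and timing info
 // instead of passing NULL to CMSampleBufferCreateForImageBuffer.
 CMVideoFormatDescriptionRef videoInfo = NULL;
 CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pxbuffer, &videoInfo);

 CMSampleTimingInfo timing;
 timing.duration = CMTimeMake(1, 30);             // assumed 30 fps
 timing.presentationTimeStamp = presentationTime; // placeholder timestamp
 timing.decodeTimeStamp = kCMTimeInvalid;

 CMSampleBufferRef sampleBuffer = NULL;
 CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pxbuffer, true,
                                    NULL, NULL, videoInfo, &timing, &sampleBuffer);
 CFRelease(videoInfo);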

Any ideas?

EDIT:

I changed my code based on Tony's answer (thanks!). This code works:

 CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
 CVPixelBufferLockBaseAddress(pixelBuffer, 0);

 EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
 CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext
                                                  options:@{kCIContextWorkingColorSpace : [NSNull null]}];

 // Render the timestamp/frame-counter text into a CIImage...
 UIFont *font = [UIFont fontWithName:@"Helvetica" size:40];
 NSDictionary *attributes = @{NSFontAttributeName : font,
                              NSForegroundColorAttributeName : [UIColor lightTextColor]};
 UIImage *img = [UIImage imageFromText:@"01 - 13/02/2014 15:18:21:654" withAttributes:attributes];
 CIImage *filteredImage = [[CIImage alloc] initWithCGImage:img.CGImage];

 // ...and composite it directly into the frame's pixel buffer.
 CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
 [ciContext render:filteredImage
   toCVPixelBuffer:pixelBuffer
            bounds:[filteredImage extent]
        colorSpace:colorSpace];
 CGColorSpaceRelease(colorSpace);

 CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
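
(imageFromText:withAttributes: is a small UIImage category helper of mine; a minimal version, assuming the iOS 7 string drawing API, could look like this:)

 + (UIImage *)imageFromText:(NSString *)text withAttributes:(NSDictionary *)attributes
 {
     // Measure the text, then draw it into a transparent image context.
     CGSize size = [text sizeWithAttributes:attributes];
     UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
     [text drawAtPoint:CGPointZero withAttributes:attributes];
     UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
     UIGraphicsEndImageContext();
     return image;
 }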
1 answer

Take a look at Apple's CIFunHouse sample; you can use this API to draw directly into the buffer:

 - (void)render:(CIImage *)image toCVPixelBuffer:(CVPixelBufferRef)buffer bounds:(CGRect)r colorSpace:(CGColorSpaceRef)cs

You can download it from the WWDC 2013 sample code.

Create the context:

 _eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
 _ciContext = [CIContext contextWithEAGLContext:_eaglContext
                                        options:@{kCIContextWorkingColorSpace : [NSNull null]}];
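Create these once (for example in your setup code) and reuse them for every frame; building an EAGLContext and CIContext per frame is expensive.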

Now draw the image:

 CVPixelBufferRef renderedOutputPixelBuffer = NULL;
 CVReturn err = CVPixelBufferPoolCreatePixelBuffer(nil,
                                                   self.pixelBufferAdaptor.pixelBufferPool,
                                                   &renderedOutputPixelBuffer);

 CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
 [_ciContext render:filteredImage
    toCVPixelBuffer:renderedOutputPixelBuffer
             bounds:[filteredImage extent]
         colorSpace:colorSpace];
 CGColorSpaceRelease(colorSpace);
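
After rendering you would hand the buffer to your writer and release it; roughly like this (presentationTime being whatever timestamp you track per frame):

 if ([self.pixelBufferAdaptor appendPixelBuffer:renderedOutputPixelBuffer
                           withPresentationTime:presentationTime]) {
     // Frame was queued for writing.
 }
 CVPixelBufferRelease(renderedOutputPixelBuffer);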