How do I convert a CMSampleBufferRef to a bitmap?

I am playing with the AVScreenShack sample project from Apple's website (an Xcode project), which captures the desktop and displays the capture in a window in near real time.

I modified the project a bit and implemented this delegate method:

 - (void)captureOutput:(AVCaptureOutput *)captureOutput
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection
 {
     ...
 }
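
For context, the output that delivers these sample buffers is set up roughly like this (a sketch of my modification; the captureSession identifier comes from the sample project, and the queue name and BGRA pixel format are my own choices):

 AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

 // Ask for uncompressed BGRA frames so they are easy to turn into a bitmap later.
 videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                        @(kCVPixelFormatType_32BGRA) };

 // Deliver frames to the delegate on a serial background queue.
 dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
 [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];

 if ([captureSession canAddOutput:videoDataOutput]) {
     [captureSession addOutput:videoDataOutput];
 }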

My question is: How do I convert an instance of CMSampleBufferRef to CGImageRef?

Thanks.

1 answer

Here's how you can create a UIImage from a CMSampleBufferRef. This worked for me in the completion handler of AVCaptureStillImageOutput's captureStillImageAsynchronouslyFromConnection:completionHandler:.

 // CMSampleBufferRef imageDataSampleBuffer; (from the completion handler)
 CMBlockBufferRef buff = CMSampleBufferGetDataBuffer(imageDataSampleBuffer);
 size_t len = CMBlockBufferGetDataLength(buff);

 // Get a pointer to the buffer's data.
 char *data = NULL;
 CMBlockBufferGetDataPointer(buff, 0, NULL, &len, &data);

 // Wrap the bytes in NSData and decode them into an image.
 NSData *d = [[NSData alloc] initWithBytes:data length:len];
 UIImage *img = [[UIImage alloc] initWithData:d];

It looks like the data coming out of CMBlockBufferGetDataPointer is JPEG data.

UPDATE: To fully answer your question, you can read the CGImage property of img from my code to get a CGImageRef.
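
Since the delegate in your question receives uncompressed video frames rather than JPEG still-image data, a more direct route is to get the frame's CVImageBufferRef out of the sample buffer and draw it through a bitmap context. Here is a minimal sketch, assuming the video data output is configured for kCVPixelFormatType_32BGRA as in your setup:

 #import <AVFoundation/AVFoundation.h>
 #import <CoreVideo/CoreVideo.h>

 // Returns a retained CGImageRef; the caller must call CGImageRelease.
 - (CGImageRef)newCGImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
 {
     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
     CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

     void *baseAddress  = CVPixelBufferGetBaseAddress(imageBuffer);
     size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
     size_t width       = CVPixelBufferGetWidth(imageBuffer);
     size_t height      = CVPixelBufferGetHeight(imageBuffer);

     // BGRA pixels: little-endian 32-bit with premultiplied alpha first.
     CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
     CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                  bytesPerRow, colorSpace,
                                                  kCGBitmapByteOrder32Little |
                                                  kCGImageAlphaPremultipliedFirst);
     CGImageRef cgImage = CGBitmapContextCreateImage(context);

     CGContextRelease(context);
     CGColorSpaceRelease(colorSpace);
     CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

     return cgImage;
 }

CGBitmapContextCreateImage copies the pixels, so the returned CGImageRef stays valid after the sample buffer is released. If your output delivers a different pixel format, the bitmap-info flags have to match it.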
