iSight Capture Using AVFoundation on Mac

I previously used this code to capture a single image from a Mac iSight camera using QTKit:

    - (NSError *)takePicture
    {
        BOOL success;
        NSError *error;

        captureSession = [QTCaptureSession new];

        QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
        success = [device open:&error];
        if (!success) {
            return error;
        }

        QTCaptureDeviceInput *captureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
        success = [captureSession addInput:captureDeviceInput error:&error];
        if (!success) {
            return error;
        }

        QTCaptureDecompressedVideoOutput *captureVideoOutput = [QTCaptureDecompressedVideoOutput new];
        [captureVideoOutput setDelegate:self];
        success = [captureSession addOutput:captureVideoOutput error:&error];
        if (!success) {
            return error;
        }

        [captureSession startRunning];
        return nil;
    }

    - (void)captureOutput:(QTCaptureOutput *)captureOutput
      didOutputVideoFrame:(CVImageBufferRef)imageBuffer
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
    {
        CVBufferRetain(imageBuffer);
        if (imageBuffer) {
            [captureSession removeOutput:captureOutput];
            [captureSession stopRunning];

            NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
            _result = [[NSImage alloc] initWithSize:[imageRep size]];
            [_result addRepresentation:imageRep];

            CVBufferRelease(imageBuffer);
            _done = YES;
        }
    }

However, today I discovered that QTKit is deprecated and that AVFoundation should be used instead. Can someone help me convert this code to its AVFoundation equivalent? Many of the methods seem to have the same names, but the API surface is much larger, and I'm completely lost here. Any help?

objective-c cocoa avfoundation macos

OK, I found a solution! Here it is:

    - (void)takePicture
    {
        NSError *error;

        AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!input) {
            _error = error;
            _done = YES;
            return;
        }

        AVCaptureStillImageOutput *output = [AVCaptureStillImageOutput new];
        // Note: the original used the legacy QuickTime constant k32BGRAPixelFormat;
        // the CoreVideo constant kCVPixelFormatType_32BGRA is the idiomatic choice here.
        [output setOutputSettings:@{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];

        captureSession = [AVCaptureSession new];
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
        [captureSession addInput:input];
        [captureSession addOutput:output];
        [captureSession startRunning];

        AVCaptureConnection *connection = [output connectionWithMediaType:AVMediaTypeVideo];
        [output captureStillImageAsynchronouslyFromConnection:connection
                                            completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
            if (error) {
                _error = error;
                _result = nil;
            } else {
                CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                if (imageBuffer) {
                    CVBufferRetain(imageBuffer);

                    NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
                    _result = [[NSImage alloc] initWithSize:[imageRep size]];
                    [_result addRepresentation:imageRep];

                    CVBufferRelease(imageBuffer);
                }
            }
            _done = YES;
        }];
    }
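One caveat: AVCaptureStillImageOutput was itself later deprecated (in macOS 10.15) in favor of AVCapturePhotoOutput, which uses a delegate instead of a completion block. Below is a minimal, untested sketch of that newer API; the class name PhotoTaker and the session/input setup are assumptions (configure them as in the answer above), and running it requires a real camera plus camera permission:

    #import <Cocoa/Cocoa.h>
    #import <AVFoundation/AVFoundation.h>

    // Sketch only: assumes an AVCaptureSession with a video device input has
    // already been configured, as in the AVCaptureStillImageOutput version.
    @interface PhotoTaker : NSObject <AVCapturePhotoCaptureDelegate>
    @property (strong) AVCaptureSession *session;
    @property (strong) AVCapturePhotoOutput *photoOutput;
    @end

    @implementation PhotoTaker

    - (void)takePicture
    {
        self.photoOutput = [AVCapturePhotoOutput new];
        if ([self.session canAddOutput:self.photoOutput]) {
            [self.session addOutput:self.photoOutput];
        }
        [self.session startRunning];

        // Triggers the delegate callback below once a frame is captured.
        [self.photoOutput capturePhotoWithSettings:[AVCapturePhotoSettings photoSettings]
                                          delegate:self];
    }

    // Delegate callback: the captured frame arrives as an AVCapturePhoto.
    - (void)captureOutput:(AVCapturePhotoOutput *)output
    didFinishProcessingPhoto:(AVCapturePhoto *)photo
                    error:(NSError *)error
    {
        if (error) {
            return;
        }
        NSData *data = [photo fileDataRepresentation]; // encoded image bytes
        NSImage *image = [[NSImage alloc] initWithData:data];
        // ... use image, then stop the session when done.
        [self.session stopRunning];
    }

    @end

The overall shape is the same as the accepted answer; only the output class and the callback mechanism change.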

Hope this helps anyone who runs into the same task.
