Swift class adopting AVCaptureVideoDataOutputSampleBufferDelegate, but captureOutput is never called

I am trying to capture video frames from the camera and display the image stream on a UIImageView in real time. I made my view controller adopt AVCaptureVideoDataOutputSampleBufferDelegate and implemented captureOutput, but captureOutput is never called.

Here is my code:

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    @IBOutlet weak var imageView: UIImageView!
    @IBOutlet var cameraView: UIView!

    var selectedImage :UIImage!

    let captureSession = AVCaptureSession()
    var captureDevice : AVCaptureDevice?
    var videoCaptureOutput : AVCaptureVideoDataOutput!

     override func viewDidLoad() {
        super.viewDidLoad()

        captureSession.sessionPreset = AVCaptureSessionPresetLow
        self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        if(captureDevice != nil){
            beginSession()
        }
    }


    func beginSession() {

        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), { 
            self.videoCaptureOutput = AVCaptureVideoDataOutput()
            self.videoCaptureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey:kCVPixelFormatType_32BGRA]
            self.videoCaptureOutput.alwaysDiscardsLateVideoFrames = true

            self.captureSession.addOutput(self.videoCaptureOutput)

            var err : NSError? = nil
            self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
            self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
            if err != nil {
                println("error: \(err?.localizedDescription)")
            }
            var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
            previewLayer?.frame = self.cameraView.layer.bounds
            previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
            dispatch_async(dispatch_get_main_queue(), { // 2
                // 3
                self.cameraView.layer.addSublayer(previewLayer)
                self.captureSession.startRunning()

            });
        });
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

        println("capture delegation called")

        var imageProcessor = ImageProcessor()
        imageView.image = imageProcessor.imageFromSampleBuffer(sampleBuffer)
    }
}

As you can see, I am trying to process each frame and display it on the image view in real time; assume my ImageProcessor() works fine.

Any help would be greatly appreciated, thanks.

1 answer

You never register a sample buffer delegate on videoCaptureOutput. Call videoCaptureOutput.setSampleBufferDelegate(self, queue: queue) after creating the output; without it, AVFoundation has no delegate to deliver frames to, so captureOutput is never called.
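A minimal sketch of the fix, using the asker's property names and the same pre-Swift 3 AVFoundation API as the question (the queue name "videoQueue" is an arbitrary choice):

```swift
import AVFoundation

func beginSession() {
    self.videoCaptureOutput = AVCaptureVideoDataOutput()
    self.videoCaptureOutput.alwaysDiscardsLateVideoFrames = true

    // The missing piece: register self as the sample buffer delegate.
    // Frames are delivered on this serial queue, so captureOutput now fires.
    let queue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL)
    self.videoCaptureOutput.setSampleBufferDelegate(self, queue: queue)

    self.captureSession.addOutput(self.videoCaptureOutput)
    // ... add the device input and start the session as before ...
}
```

Note that because the delegate is called on that background queue, any UIKit work inside captureOutput (such as setting imageView.image) should be dispatched back to the main queue with dispatch_async(dispatch_get_main_queue()) { ... }.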
