I'm trying to build something that captures a video feed on a Mac in real time and records it as pre-segmented MP4 fragments. It works for some cameras, but not for others.
The setup is as follows.
An AVCaptureVideoDataOutput has an AVCaptureVideoDataOutputSampleBufferDelegate:
    dispatch_queue_t sampleQueue = dispatch_queue_create("samples", NULL);
    [videoOutput setSampleBufferDelegate: delegate queue: sampleQueue];
The output is also attached to an AVCaptureSession, which records to a file and drives a preview, and all of that works fine.
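For reference, the session wiring is roughly like this (device, delegate and sampleQueue are placeholders, and error handling is trimmed):

    // Rough sketch of the capture session setup; names are placeholders.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice: device
                                                                        error: &error];
    if(input && [session canAddInput: input])
        [session addInput: input];

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoOutput setSampleBufferDelegate: delegate queue: sampleQueue];
    if([session canAddOutput: videoOutput])
        [session addOutput: videoOutput];

    [session startRunning];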
In the delegate I have an AVAssetWriterInput, configured as follows:
    NSDictionary *videoFormat = [NSDictionary dictionaryWithObjectsAndKeys:
        // Format options
        AVVideoCodecH264, AVVideoCodecKey,                        // H.264
        [NSNumber numberWithInt: width], AVVideoWidthKey,
        [NSNumber numberWithInt: height], AVVideoHeightKey,
        // Encoder options
        [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt: theQuality*1024], AVVideoAverageBitRateKey,  // e.g. 256 kbps
            [NSNumber numberWithInt: 30], AVVideoMaxKeyFrameIntervalKey,          // at least one keyframe every 30 frames
            nil], AVVideoCompressionPropertiesKey,
        nil];

    video = [[AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                outputSettings: videoFormat] retain];
    [video setExpectsMediaDataInRealTime: YES];
The delegate also creates an AVAssetWriter to write the file:
    writer = [[AVAssetWriter assetWriterWithURL: url
                                       fileType: AVFileTypeMPEG4
                                          error: &error] retain];
    [writer setShouldOptimizeForNetworkUse: YES];
    [writer addInput: video];
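For completeness, a rough sketch of how the writing session gets started when the first buffer arrives in the callback (simplified here; the status check stands in for my actual first-buffer handling):

    // Simplified sketch: start the writer once, at the first buffer's timestamp.
    if(writer.status == AVAssetWriterStatusUnknown) {
        [writer startWriting];
        [writer startSessionAtSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }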
Now, inside the delegate's captureOutput:didOutputSampleBuffer:fromConnection: callback, where the sample buffers arrive, I do this:
    if([video isReadyForMoreMediaData])
        [video appendSampleBuffer: sampleBuffer];
Excellent! This works for one of my cameras. Now I plug in the BlackMagic Intensity and use that instead.
On this line:

    [video appendSampleBuffer: sampleBuffer]
I get this error:
    *** -[AVAssetWriterInput appendSampleBuffer:] Input buffer must be in an uncompressed format when outputSettings is not nil
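So it looks as if the Intensity hands me frames that are already compressed. This is the kind of check I can drop into the callback to see what the device actually delivers (my own debugging sketch, nothing from the docs):

    // Debugging sketch: inspect what the capture output actually delivers.
    CMFormatDescriptionRef desc = CMSampleBufferGetFormatDescription(sampleBuffer);
    FourCharCode subType = CMFormatDescriptionGetMediaSubType(desc);
    NSLog(@"media subtype: %c%c%c%c",
          (char)(subType >> 24), (char)(subType >> 16), (char)(subType >> 8), (char)subType);

    // For uncompressed frames this returns a pixel buffer; for compressed ones it is NULL.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    NSLog(@"pixel buffer: %p", pixelBuffer);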
I've played with the videoOutput settings, but to no avail. The docs do seem to say the input buffers have to be uncompressed. I've tried things like:
    // not all at the same time, of course
    videoOutput.videoSettings = nil;
    videoOutput.videoSettings = [NSDictionary dictionaryWithObject: @"avc1" forKey: AVVideoCodecKey];
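If the fix is to force the capture output to deliver uncompressed pixel buffers, I assume it would look something like this (the pixel format here is just an example, and I don't know whether the Intensity supports it):

    // Sketch: ask the capture output for uncompressed pixel buffers.
    // kCVPixelFormatType_32BGRA is only an example format.
    videoOutput.videoSettings = [NSDictionary
        dictionaryWithObject: [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]
                      forKey: (id)kCVPixelBufferPixelFormatTypeKey];

but so far I haven't found a combination that makes the Intensity happy.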
The best part is that if I drop the compression settings entirely (nil outputSettings on the writer input), it tells me that passthrough is not supported. Sweet.
Unfortunately, searching for this error message turns up nothing. If anyone can point me toward a solution, I'd love to know what I'm doing wrong.