iOS / iPhone API burst mode photography

I am trying to capture high-resolution photos (AVCaptureSessionPresetPhoto) on an iPhone 5s. I tried using the following code:

    dispatch_semaphore_t sync = dispatch_semaphore_create(0);
    while ([self isBurstModeEnabled] == YES) {
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            if (imageSampleBuffer != NULL) {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                NSString *videoThumbPath = [NSString stringWithFormat:@"%@/img%d.png", burstFolderPath, index];
                [imageData writeToFile:videoThumbPath atomically:YES];
                if (0 == index) {
                    [self NSLogPrint:[NSString stringWithFormat:@"Created photo at %@", videoThumbPath]];
                }
            }
            dispatch_semaphore_signal(sync);
        }];
        dispatch_semaphore_wait(sync, DISPATCH_TIME_FOREVER);
    }

With this code I get about 2 shots per second, nowhere near the burst mode performance of the native camera application. What am I doing wrong? I also tried the code above without the semaphore, but then I got strange behavior: some photos went missing (img0.png, img1.png and img3.png would be present, but img2.png would be absent). Without the semaphore the performance is better, but still not at the level of the native application (in my tests the camera application takes about 8.4 photos per second).

2 answers

I do not think captureStillImageAsynchronouslyFromConnection:completionHandler: is what Apple uses for its burst mode.

Instead, Apple* captures full-resolution video frames (which the 5s supports). Here's how:

Set the AVCaptureDevice's activeFormat to the full resolution of the sensor, then capture and process frames (around 10 per second) in the AVCaptureVideoDataOutputSampleBufferDelegate method captureOutput:didOutputSampleBuffer:fromConnection:, playing a shutter sound for each captured frame.
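A rough sketch of that setup, assuming an already-running capture session; captureSession, ciContext, burstModeEnabled and the queue name are illustrative placeholders, not part of the original answer:

    #import <AVFoundation/AVFoundation.h>
    #import <AudioToolbox/AudioToolbox.h>
    #import <CoreImage/CoreImage.h>
    #import <UIKit/UIKit.h>

    // Assumed properties on the capturing class (illustrative names):
    //   AVCaptureSession *captureSession;  CIContext *ciContext;  BOOL burstModeEnabled;

    - (void)configureFullResolutionBurst
    {
        AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        // Pick the device format with the largest sensor dimensions.
        AVCaptureDeviceFormat *bestFormat = nil;
        int32_t bestPixels = 0;
        for (AVCaptureDeviceFormat *format in device.formats) {
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            if (dims.width * dims.height > bestPixels) {
                bestPixels = dims.width * dims.height;
                bestFormat = format;
            }
        }

        NSError *error = nil;
        if (bestFormat && [device lockForConfiguration:&error]) {
            device.activeFormat = bestFormat;
            device.activeVideoMinFrameDuration = CMTimeMake(1, 10); // ~10 fps burst rate
            device.activeVideoMaxFrameDuration = CMTimeMake(1, 10);
            [device unlockForConfiguration];
        }

        // Route full-resolution frames to a delegate instead of using still image capture.
        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        [videoOutput setSampleBufferDelegate:self
                                       queue:dispatch_queue_create("burst.frames", DISPATCH_QUEUE_SERIAL)];
        if ([self.captureSession canAddOutput:videoOutput]) {
            [self.captureSession addOutput:videoOutput];
        }
    }

    // Each delivered frame becomes one burst photo.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (!self.burstModeEnabled) return;

        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CGImageRef cgImage = [self.ciContext createCGImage:frame fromRect:frame.extent];
        NSData *jpegData = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.85);
        CGImageRelease(cgImage);
        // Write jpegData to the burst folder here (move saving off the capture queue if it is slow).
        AudioServicesPlaySystemSound(1108); // commonly the shutter sound ID; verify on your target OS
    }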

For devices that do not support video at the full resolution of the sensor, and/or if you want to support anything older than iOS 7.x, you will need a fallback (lower-resolution images, or a slower burst mode).

Note that you cannot have multiple simultaneous calls to captureStillImageAsynchronouslyFromConnection:completionHandler: in flight without extremely unpredictable results. This is why you should trigger each iteration from the previous call's completionHandler (which, in essence, is what your semaphore accomplishes), as sketched below. In addition, you should switch away from PNG as the file format for the burst images - it is very slow to save and requires a large amount of system resources - 15 or 20 PNGs in flight can cause you serious grief!
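A minimal sketch of that chaining, reusing the names from the question; captureNextBurstImageWithIndex: is an illustrative method, not from the original code:

    // Kick off the burst once; each completed capture schedules the next one,
    // so there is never more than one still image request in flight.
    - (void)captureNextBurstImageWithIndex:(NSUInteger)index
    {
        if (![self isBurstModeEnabled]) {
            return;
        }
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            if (imageSampleBuffer != NULL) {
                // The buffer already contains JPEG data; save it as .jpg instead of re-wrapping it as PNG.
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                NSString *imagePath = [NSString stringWithFormat:@"%@/img%lu.jpg",
                                       burstFolderPath, (unsigned long)index];
                [imageData writeToFile:imagePath atomically:YES];
            }
            // Chain the next capture only after this one has finished.
            [self captureNextBurstImageWithIndex:index + 1];
        }];
    }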

* This is probably how it is done; of course, Apple could be using a private API to achieve the same end result.


Use this method for burst capture on iOS 8 and above:

    - (void)captureStillImageBracketAsynchronouslyFromConnection:(AVCaptureConnection *)connection
                                                 withSettingsArray:(NSArray *)settings
                                                 completionHandler:(void (^)(CMSampleBufferRef sampleBuffer,
                                                                             AVCaptureBracketedStillImageSettings *stillImageSettings,
                                                                             NSError *error))handler NS_AVAILABLE_IOS(8_0);

Documentation
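A brief usage sketch, assuming the stillImageOutput and videoConnection from the question and iOS 8 or later; the three identical auto-exposure settings are only an illustration, and the array length must not exceed maxBracketedCaptureStillImageCount:

    // Three captures at the current exposure bias; the completion handler runs once per image.
    NSArray *settings = @[
        [AVCaptureAutoExposureBracketedStillImageSettings autoExposureSettingsWithExposureTargetBias:AVCaptureExposureTargetBiasCurrent],
        [AVCaptureAutoExposureBracketedStillImageSettings autoExposureSettingsWithExposureTargetBias:AVCaptureExposureTargetBiasCurrent],
        [AVCaptureAutoExposureBracketedStillImageSettings autoExposureSettingsWithExposureTargetBias:AVCaptureExposureTargetBiasCurrent]
    ];

    [stillImageOutput prepareToCaptureStillImageBracketFromConnection:videoConnection
                                                     withSettingsArray:settings
                                                     completionHandler:^(BOOL prepared, NSError *prepareError) {
        if (!prepared) { return; }
        [stillImageOutput captureStillImageBracketAsynchronouslyFromConnection:videoConnection
                                                              withSettingsArray:settings
                                                              completionHandler:^(CMSampleBufferRef sampleBuffer,
                                                                                  AVCaptureBracketedStillImageSettings *stillImageSettings,
                                                                                  NSError *error) {
            if (sampleBuffer != NULL) {
                NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
                // Write jpegData to disk here, one file per bracketed image.
            }
        }];
    }];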

