I am writing an application that shows statistics about the lighting conditions seen by the iPhone camera. I capture an image every second and run the calculations on it.
To capture an image, I use the following method:
- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;

    // Find the first video connection on the still image output.
    for (AVCaptureConnection *connection in captureManager.stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    // Capture a still frame and keep it around as a UIImage.
    [captureManager.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                                 completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        latestImage = [[UIImage alloc] initWithData:imageData];
    }];
}
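(I trigger this with something like a repeating NSTimer; the exact scheduling isn't the problem, but for context:)

// Fire captureNow once a second (simplified).
[NSTimer scheduledTimerWithTimeInterval:1.0
                                 target:self
                               selector:@selector(captureNow)
                               userInfo:nil
                                repeats:YES];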
However, captureStillImageAsynchronouslyFromConnection:completionHandler: plays the shutter sound on the phone, which is not acceptable for my application, since it captures images constantly.
I have read that it is impossible to disable this sound effect. Instead, I want to capture frames from the phone's video input:
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
and hopefully turn them into UIImage objects.
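From the documentation, AVCaptureVideoDataOutput with a sample buffer delegate looks like the right direction. The sketch below is my untested guess: the BGRA pixel format, the queue name, and the CGBitmapContext conversion are assumptions I pieced together from the docs, and captureManager.session is the session from my code above.

// Untested sketch: add a video data output that delivers BGRA frames
// to a delegate on a background queue.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
                                 [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                         forKey:(id)kCVPixelBufferPixelFormatTypeKey];
dispatch_queue_t queue = dispatch_queue_create("videoFrameQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:queue];
[captureManager.session addOutput:videoOutput];

// Delegate callback: convert each CMSampleBufferRef into a UIImage.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Build a CGImage from the raw BGRA pixel data.
    void *baseAddress  = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // ... run the lighting calculations on image here ...
}

Since the delegate would fire for every frame, I assume I would also need to throttle it to roughly one frame per second rather than converting every frame, but I'm not sure this is right.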
How can I achieve this? I don't know much about how AVFoundation works - I downloaded some sample code and modified it for my own purposes.