iPhone camera: show focus rectangle

I am cloning Apple's camera app using AVCaptureSession, based on Apple's AVCam sample application. The problem is that the focus rectangle never appears on the video preview. I use the following code to adjust the focus, but no focus rectangle is displayed.

    AVCaptureDevice *device = [[self videoInput] device];
    if ([device isFocusModeSupported:focusMode] && [device focusMode] != focusMode) {
        NSError *error;
        printf(" setFocusMode \n");
        if ([device lockForConfiguration:&error]) {
            [device setFocusMode:focusMode];
            [device unlockForConfiguration];
        } else {
            id delegate = [self delegate];
            if ([delegate respondsToSelector:@selector(acquiringDeviceLockFailedWithError:)]) {
                [delegate acquiringDeviceLockFailedWithError:error];
            }
        }
    }

When I use UIImagePickerController, autofocus and tap-to-focus are supported by default, and the focus rectangle is shown. Is there no way to show the focus rectangle on the video preview layer when using AVCaptureSession?

+8
Tags: iphone, avcapturesession
3 answers

The focus animation is a completely custom animation that you have to create yourself. I currently have the same problem as you: I want to show a rectangle as feedback for the user after they tap the preview layer.

The first thing you want to do is implement tap-to-focus, probably wherever you set up the preview layer:

    UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                            action:@selector(tapToFocus:)];
    [tapGR setNumberOfTapsRequired:1];
    [tapGR setNumberOfTouchesRequired:1];
    [self.captureVideoPreviewView addGestureRecognizer:tapGR];

Now implement the tap-to-focus method itself:

    - (void)tapToFocus:(UITapGestureRecognizer *)singleTap {
        CGPoint touchPoint = [singleTap locationInView:self.captureVideoPreviewView];
        CGPoint convertedPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:touchPoint];
        AVCaptureDevice *currentDevice = currentInput.device;

        if ([currentDevice isFocusPointOfInterestSupported] &&
            [currentDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            NSError *error = nil;
            [currentDevice lockForConfiguration:&error];
            if (!error) {
                [currentDevice setFocusPointOfInterest:convertedPoint];
                [currentDevice setFocusMode:AVCaptureFocusModeAutoFocus];
                [currentDevice unlockForConfiguration];
            }
        }
    }

The last thing, which I have not implemented myself yet, is adding the focus animation to the preview layer, or rather to the view controller that holds the preview layer. I believe this can be done in tapToFocus:, where you already have the touch point: add an animated image view, or some other view, centered on the touch position, and remove it once the animation has finished, then take the image. A sketch of such an indicator view follows below.
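For illustration, here is a minimal sketch of such an indicator, shown in Swift for brevity (the same idea translates directly to Objective-C). The function name, the border styling, and previewContainerView are placeholders, not part of the original answer; call it from your tap handler with the touch point you already have.

    import UIKit

    /// Shows a temporary focus rectangle centered on the tap point, then fades it out.
    /// `previewContainerView` is whatever view hosts the capture preview layer.
    func showFocusRectangle(at touchPoint: CGPoint, in previewContainerView: UIView) {
        let side: CGFloat = 80
        let focusView = UIView(frame: CGRect(x: 0, y: 0, width: side, height: side))
        focusView.center = touchPoint
        focusView.backgroundColor = .clear
        focusView.layer.borderColor = UIColor.yellow.cgColor
        focusView.layer.borderWidth = 2
        previewContainerView.addSubview(focusView)

        // Start slightly enlarged, shrink to the final size, then fade out and remove.
        focusView.transform = CGAffineTransform(scaleX: 1.5, y: 1.5)
        UIView.animate(withDuration: 0.3, animations: {
            focusView.transform = .identity
        }, completion: { _ in
            UIView.animate(withDuration: 0.4, delay: 0.5, options: [], animations: {
                focusView.alpha = 0
            }, completion: { _ in
                focusView.removeFromSuperview()
            })
        })
    }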

+11

Swift implementation

Gesture:

    private func focusGesture() -> UITapGestureRecognizer {
        let tapRec = UITapGestureRecognizer(target: self, action: #selector(tapToFocus(_:)))
        tapRec.cancelsTouchesInView = false
        tapRec.numberOfTapsRequired = 1
        tapRec.numberOfTouchesRequired = 1
        return tapRec
    }

Action:

    private func tapToFocus(gesture: UITapGestureRecognizer) {
        let touchPoint: CGPoint = gesture.locationInView(self.previewView)
        let convertedPoint: CGPoint = previewLayer!.captureDevicePointOfInterestForPoint(touchPoint)
        let currentDevice: AVCaptureDevice = videoDeviceInput!.device

        if currentDevice.focusPointOfInterestSupported && currentDevice.isFocusModeSupported(AVCaptureFocusMode.AutoFocus) {
            do {
                try currentDevice.lockForConfiguration()
                currentDevice.focusPointOfInterest = convertedPoint
                currentDevice.focusMode = AVCaptureFocusMode.AutoFocus
                currentDevice.unlockForConfiguration()
            } catch {
            }
        }
    }
+2

Swift 3 implementation

    lazy var focusGesture: UITapGestureRecognizer = {
        let instance = UITapGestureRecognizer(target: self, action: #selector(tapToFocus(_:)))
        instance.cancelsTouchesInView = false
        instance.numberOfTapsRequired = 1
        instance.numberOfTouchesRequired = 1
        return instance
    }()

    func tapToFocus(_ gesture: UITapGestureRecognizer) {
        guard let previewLayer = previewLayer else {
            print("Expected a previewLayer")
            return
        }
        guard let device = device else {
            print("Expected a device")
            return
        }

        let touchPoint: CGPoint = gesture.location(in: cameraView)
        let convertedPoint: CGPoint = previewLayer.captureDevicePointOfInterest(for: touchPoint)

        if device.isFocusPointOfInterestSupported && device.isFocusModeSupported(AVCaptureFocusMode.autoFocus) {
            do {
                try device.lockForConfiguration()
                device.focusPointOfInterest = convertedPoint
                device.focusMode = AVCaptureFocusMode.autoFocus
                device.unlockForConfiguration()
            } catch {
                print("unable to focus")
            }
        }
    }
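One detail the snippet above leaves out is attaching the recognizer to the view that receives the taps. Assuming cameraView from the code above is that view, wiring it up could look like this (a sketch, not part of the original answer):

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach the lazily created tap-to-focus recognizer to the preview view.
        cameraView.addGestureRecognizer(focusGesture)
    }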
0
