I have a Unity / iOS application that captures a user's photo and displays it in a 3D environment. Now I would like to use CIFaceFeature to find eye positions, which requires dropping down to the native (Objective-C) level. My flow looks like this:
Unity → WebCamTexture (encode image and send to native; this is SLOW)
Obj-C → CIFaceFeature (find eye coordinates)
Unity → display eye positions
I have a working prototype, but it's slow because I capture the image in Unity (WebCamTexture) and then send it across to Obj-C to run the CIFaceFeature detection. It seems there should be a way to simply ask my Obj-C class to "inspect the active camera", which would be much, much faster than encoding and transmitting the image.
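Roughly what I'm picturing on the native side, as an untested sketch: run an AVCaptureSession directly in Obj-C and hand each frame to a CIDetector, so only the eye coordinates ever travel back to Unity. The "FaceTracker" / "OnEyePositions" names are just placeholders for my Unity-side receiver:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Unity's iOS trampoline for sending a string back to a GameObject.
extern void UnitySendMessage(const char *obj, const char *method, const char *msg);

@interface EyeFinder : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) CIDetector *detector;
@end

@implementation EyeFinder

- (void)start {
    self.session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("eyefinder.frames", DISPATCH_QUEUE_SERIAL)];
    [self.session addOutput:output];

    // Low accuracy should be enough for eye positions and keeps detection fast.
    self.detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                       context:nil
                                       options:@{ CIDetectorAccuracy : CIDetectorAccuracyLow }];
    [self.session startRunning];
}

// Called for every captured frame -- no encoding, no image copied back to Unity.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    for (CIFaceFeature *face in [self.detector featuresInImage:frame]) {
        if (face.hasLeftEyePosition && face.hasRightEyePosition) {
            NSString *msg = [NSString stringWithFormat:@"%f,%f,%f,%f",
                             face.leftEyePosition.x,  face.leftEyePosition.y,
                             face.rightEyePosition.x, face.rightEyePosition.y];
            // Placeholder names: send only the coordinates back to my Unity GameObject.
            UnitySendMessage("FaceTracker", "OnEyePositions", [msg UTF8String]);
        }
    }
}

@end
```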
So my question is, in a nutshell:
- Is it possible to ask, from Obj-C, "is there a camera currently capturing?"
- If so, how do I "snapshot" the image from the current session?
Thanks!
MarcT