Can I track objects in ARKit like in Vuforia?

I could not find any information if Apple ARKit supports 3D object tracking (or even image tracking), as Vuforia does. I do not want to place the 3D model anywhere in the world. Instead, I want to detect a specific 3D object and place AR objects in front and on top of this object.

A simple example: I want to track a specific toy car in the real world and add a spoiler on top of it in the AR scene.

Can someone tell me whether this is possible or not?

ios swift augmented-reality arkit vuforia ios12

Update for iOS 12: in "ARKit 2" (aka ARKit on iOS 12 or later) ...

  • Image detection extends to image tracking, so up to four images are not just detected once; they are updated live in every frame, even as they move relative to world space. So you can attach a recognizable 2D image to your toy and have virtual AR content follow the toy on screen.

  • There is also object detection: during development you can use an ARKit app to scan a real-world 3D object and produce a reference object file. You can then ship that file in your app and use it to recognize that object in the user's environment. This may fit your "toy car" case... but keep in mind that 3D object recognition is detection, not tracking: ARKit will not follow the toy car as it moves. (A minimal Swift sketch of both features follows below.)

See the WWDC18 session on ARKit 2 for details.
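
Here is a minimal Swift sketch of how that could look in practice. It assumes an ARSCNView-based app plus asset catalog groups named "ToyCars" (reference objects) and "AR Resources" (reference images); those names and the view controller are placeholders for your own project:

    import ARKit
    import SceneKit
    import UIKit

    class CarViewController: UIViewController, ARSCNViewDelegate {
        @IBOutlet var sceneView: ARSCNView!

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.delegate = self

            let configuration = ARWorldTrackingConfiguration()

            // Detect a scanned real-world object (one-shot detection, not tracking).
            if let objects = ARReferenceObject.referenceObjects(inGroupNamed: "ToyCars", bundle: nil) {
                configuration.detectionObjects = objects
            }

            // Track up to four known 2D images live, frame by frame.
            if let images = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) {
                configuration.detectionImages = images
                configuration.maximumNumberOfTrackedImages = 4
            }

            sceneView.session.run(configuration)
        }

        // ARKit adds an anchor when it recognizes a reference image or object.
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            if anchor is ARObjectAnchor || anchor is ARImageAnchor {
                // Attach your virtual content (the spoiler, say) to the anchor's node.
                let spoiler = SCNBox(width: 0.1, height: 0.02, length: 0.03, chamferRadius: 0)
                node.addChildNode(SCNNode(geometry: spoiler))
            }
        }
    }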


Update for iOS 11.3: "ARKit 1.5" (aka ARKit on iOS 11.3 or later) adds image detection. If you have a known image (for example, a poster or a playing card), you can include it in your Xcode project's asset catalog and/or load it from elsewhere at runtime as an ARReferenceImage, and add it to your session configuration's detectionImages. Then, when ARKit finds those images in the user's environment, it gives you ARImageAnchor objects telling you where they are.
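
For example, a minimal sketch of building such a configuration at runtime; the image name "poster", the 0.3 m physical width, and the helper class are placeholders for your own assets and code:

    import ARKit
    import UIKit

    final class ImageDetectionDelegate: NSObject, ARSessionDelegate {

        func makeConfiguration() -> ARWorldTrackingConfiguration {
            let configuration = ARWorldTrackingConfiguration()
            if let cgImage = UIImage(named: "poster")?.cgImage {
                // physicalWidth is the printed image's real-world width, in meters.
                let reference = ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.3)
                reference.name = "poster"
                configuration.detectionImages = [reference]
            }
            return configuration
        }

        // Called when ARKit finds one of the reference images.
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let imageAnchor as ARImageAnchor in anchors {
                print("Found", imageAnchor.referenceImage.name ?? "image", "at", imageAnchor.transform)
            }
        }
    }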

Note that this is not exactly the same as the "marker-based AR" you see in some other toolkits: ARKit only finds the reference image once and does not tell you how it moves over time. So it is good for "launching" AR content (for example, those promos where you point your phone at a Star Wars poster in a store and a character steps out of it), but not for, say, AR board games where virtual characters stay attached to physical game pieces.


Otherwise ...

You can access the camera image in every captured ARFrame, so if you have other software that can help with such tasks, you can use it alongside ARKit. For example, the Vision framework (also new in iOS 11) offers several building blocks for this: you can detect barcodes and find their four corners, and, after manually identifying a region of interest in the image, track its movement between frames.
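
A minimal sketch of that Vision route, run on the camera image of a captured ARFrame; the helper function is hypothetical, and you would call something like it from your ARSessionDelegate's session(_:didUpdate:), ideally off the main thread:

    import ARKit
    import Vision

    func detectBarcodes(in frame: ARFrame) {
        let request = VNDetectBarcodesRequest { request, _ in
            guard let barcodes = request.results as? [VNBarcodeObservation] else { return }
            for barcode in barcodes {
                // Corner points are in normalized image coordinates (0...1).
                print(barcode.payloadStringValue ?? "unknown payload",
                      barcode.topLeft, barcode.topRight, barcode.bottomRight, barcode.bottomLeft)
            }
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
    }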


Note: this is definitely a hack, but it adds continuous image tracking to the ARKit Unity plugin. The same idea can be applied to the native library.

Download the ARKit 1.5 beta plugin: https://bitbucket.org/Unity-Technologies/unity-arkit-plugin/branch/spring2018_update

In ARSessionNative.mm add this code block:

 extern "C" void SessionRemoveAllAnchors(void* nativeSession) { UnityARSession* session = (__bridge UnityARSession*)nativeSession; for (ARAnchor* a in session->_session.currentFrame.anchors) { [session->_session removeAnchor:a]; return; } } 

In UnityARSessionNativeInterface.cs, add this declaration next to SessionRemoveUserAnchor:

    // [DllImport] mirrors the other native method declarations in this file.
    [DllImport("__Internal")]
    private static extern void SessionRemoveAllAnchors(IntPtr nativeSession);

And add this method below RemoveUserAnchor:

    public void RemoveAllAnchors() {
    #if !UNITY_EDITOR
        SessionRemoveAllAnchors(m_NativeARSession);
    #endif
    }

Then call this from Update or a coroutine:

    UnityARSessionNativeInterface.GetARSessionNativeInterface().RemoveAllAnchors();

Once the anchor is removed, the image can be recognized again. It is not super smooth, but it definitely works.

Hope you found this helpful! Let me know if you need more help.


ARKit 2.0 for iOS 12 supports not only world (camera) tracking but also:

  • 3D object detection (scanned reference objects are detected, not continuously tracked)
  • Face tracking (see the sketch after this list)
  • Image tracking
  • Image detection
  • 3D object scanning
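
As a small illustration of the face tracking item, a minimal sketch; the helper function is hypothetical and assumes a device with a TrueDepth camera:

    import ARKit

    func runFaceTracking(on session: ARSession) {
        guard ARFaceTrackingConfiguration.isSupported else { return } // needs a TrueDepth camera
        session.run(ARFaceTrackingConfiguration())
        // ARKit then delivers ARFaceAnchor updates (face geometry and blend shapes)
        // through your ARSessionDelegate / ARSCNViewDelegate callbacks.
    }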

ARKit 3.0 for iOS 13 supports even more advanced features:

  • People Occlusion (real-time RGBAZ compositing; see the sketch after this list)
  • Body tracking (aka Motion Capture)
  • Multiple face tracking (up to 3 faces)
  • Simultaneous front and rear camera tracking
  • Collaborative sessions
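
For example, a minimal sketch of enabling People Occlusion; the helper function is hypothetical and assumes iOS 13 on an A12 or newer device:

    import ARKit

    func runWithPeopleOcclusion(on session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            // RGBAZ compositing: people in the camera feed can pass in front of virtual content.
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        session.run(configuration)
    }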
