Kinect 3D gesture recognition based on skeleton movements - What libraries exist?

What gesture recognition libraries (if any) exist for Kinect? Currently I use OpenNI to record skeleton movements, but I'm not sure how to go from that to triggering discrete actions.

My problem may be as simple as detecting a pose, or as complex as recognizing time-based movements (e.g., detecting when the user moves their arms in a circle), depending on how hard that turns out to be. The pose detection examples I've seen were very ad hoc - is that because the general algorithm is hard to get right?

+5
gesture-recognition kinect openni
4 answers

The NITE library (which sits on top of OpenNI) has classes for detecting swipes and other gestures, but I personally had problems using both the OpenNI core libraries and NITE from C# (I kept running into AccessViolationExceptions). If you are writing managed code, XnVNITE.net.dll is the assembly that contains the swipe detection. It can be found in the PrimeSense/NITE folder after installing NITE.

If you can do without skeleton and user recognition, there is also the ManagedNite.dll library, the redistributable library that comes with the PrimeSense NITE installation. ManagedNite.dll also has hand/gesture recognition, but no skeleton/user detection.

Otherwise, you can of course define your own time-based gestures. You should be able to detect whether a series of hand points moves in a straight line with a function like this:

static bool DetectSwipe(Point3D[] points)
{
    int LineSize = 10;   // number of points in the array to look at
    int MinXDelta = 300; // required horizontal distance
    int MaxYDelta = 100; // max amount of vertical variation allowed

    if (points.Length < LineSize)
        return false;

    float x1 = points[0].X;
    float y1 = points[0].Y;
    float x2 = points[LineSize - 1].X;
    float y2 = points[LineSize - 1].Y;

    // the hand must travel far enough horizontally...
    if (Math.Abs(x1 - x2) < MinXDelta)
        return false;

    // ...while ending at roughly the same height it started at
    if (Math.Abs(y1 - y2) > MaxYDelta)
        return false;

    // every intermediate point must also stay close to the starting height,
    // otherwise the movement was not a straight horizontal swipe
    for (int i = 1; i < LineSize - 1; i++)
    {
        if (Math.Abs(points[i].Y - y1) > MaxYDelta)
            return false;
    }

    return true;
}

You can extend this code to detect whether the swipe went to the right or to the left. I also did not include any timing in the example above - you would need to look at the timestamps of the first and last points and check whether the swipe completed within a certain period of time.
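As a rough sketch of how that timing and direction logic could sit on top of DetectSwipe (assuming the function above is visible to this class) - the SwipeDirection enum, the 500 ms window and the sample buffer below are my own assumptions, not anything that comes from OpenNI or NITE, and Point3D is simply the OpenNI hand point type:

using System;
using System.Collections.Generic;
using System.Linq;

enum SwipeDirection { None, Left, Right }

class TimedSwipeDetector
{
    struct Sample
    {
        public Point3D Point;  // hand point for one frame
        public DateTime Time;  // when that frame arrived
    }

    const int LineSize = 10;  // same window size as DetectSwipe
    static readonly TimeSpan MaxDuration = TimeSpan.FromMilliseconds(500);

    readonly Queue<Sample> samples = new Queue<Sample>();

    // Call once per frame with the current hand position.
    public SwipeDirection Update(Point3D handPoint)
    {
        samples.Enqueue(new Sample { Point = handPoint, Time = DateTime.UtcNow });
        while (samples.Count > LineSize)
            samples.Dequeue();
        if (samples.Count < LineSize)
            return SwipeDirection.None;

        Sample[] window = samples.ToArray(); // oldest sample first

        // the whole movement has to happen within the allowed time span
        if (window[LineSize - 1].Time - window[0].Time > MaxDuration)
            return SwipeDirection.None;

        Point3D[] points = window.Select(s => s.Point).ToArray();
        if (!DetectSwipe(points))
            return SwipeDirection.None;

        // the sign of the horizontal displacement tells left from right
        return points[LineSize - 1].X > points[0].X
            ? SwipeDirection.Right
            : SwipeDirection.Left;
    }
}

In a real handler you would probably also clear the buffer once a swipe is reported, so the same movement is not detected twice on consecutive frames.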

+5

Check this out: http://kinectrecognizer.codeplex.com/

It supports 3D tracking and fine-tuning of the recognition, and should also be easy to reuse.

+1

Softkinetic looks promising, but the SDK is not yet available.

0

I am working on standalone skeleton detection code for Kinect: http://code42tiger.blogspot.com

I plan to release it for free, but it is still a long way from perfect. If your requirement is only hand position tracking, you could write it yourself without using OpenNI or any other library. If you need some simple advice, read below.

1) Background removal (explained on my blog)
2) Blob detection (to choose which person to track, also explained on the blog)
3) Hand tracking (now that only the user is left in the data, you can find a hand by looking for the farthest point of the body - see the sketch below)
4) Tracking the hand position over time to detect gestures (some calculations that track the hand every few frames will give you the motion geometry)
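To make step 3 concrete, here is a minimal sketch of the farthest-point idea. This is not the blog author's code; it assumes userPoints already holds the user's pixels projected to real-world 3D coordinates, and Point3D is the OpenNI type (any struct with X/Y/Z would do):

using System;

static Point3D FindHandCandidate(Point3D[] userPoints)
{
    if (userPoints.Length == 0)
        throw new ArgumentException("userPoints must not be empty");

    // centroid of the user's points stands in for the body centre
    float cx = 0, cy = 0, cz = 0;
    foreach (Point3D p in userPoints)
    {
        cx += p.X; cy += p.Y; cz += p.Z;
    }
    cx /= userPoints.Length;
    cy /= userPoints.Length;
    cz /= userPoints.Length;

    // the point farthest from the centroid is usually an extended hand
    // (in practice you would restrict the search, e.g. to points above
    // the centroid or in front of the torso, so a foot does not win)
    Point3D best = userPoints[0];
    float bestDist = -1;
    foreach (Point3D p in userPoints)
    {
        float dx = p.X - cx, dy = p.Y - cy, dz = p.Z - cz;
        float dist = dx * dx + dy * dy + dz * dz;
        if (dist > bestDist)
        {
            bestDist = dist;
            best = p;
        }
    }
    return best;
}

Feeding the hand candidate from each frame into something like the swipe detector in the first answer would then cover step 4.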

This should work (if not perfectly) in about 75% of cases. As long as the user is not deliberately trying to break the algorithm, it should work fine for ordinary users.

0
