Rotation axis detection in a 3D point cloud

I am trying to automatically detect the rotation axis in a 3D point cloud.

In other words: if I took a small 3D point cloud, picked one rotation axis, and made several copies of the points at different rotation angles, I would end up with the larger point cloud.

The input to my algorithm is the larger point cloud, and the desired output is the single axis of symmetry. In the end I also want to compute the correspondence between points that are rotations of each other.

The larger point cloud has about 100,000 points, and the number of rotational copies that were made is unknown.

The rotation angles in my case have a constant delta, but do not necessarily cover 360 degrees. For example, I may have 0, 20, 40, 60. Or I may have 0, 90, 180, 270. But I will not have 0, 13, 78, 212 (or, if I do, I don't care about detecting it).

This looks like a computer vision problem, but it's hard for me to figure out how to find the axis accurately. The input will usually be very clean, close to float precision.

I don't have the original smaller point cloud that was rotated/copied to make the larger one. I know the data is synthetic with very little noise (it is usually the output of another program).

We cannot easily factor out the possible number of points in the smaller cloud because, as rightly noted, points lying on the axis are unfortunately not duplicated. If we knew which points were on the axis, we could look for possible factors, but then we would already have solved the problem.

-

Thank you all for your suggestions. It looks like my final algorithm will try to find cliques of mutually matching points using a k-NN metric. Each clique will give an axis. Then I can use RANSAC to fit the axis to the results from all the cliques.

+7
geometry computer-vision computational-geometry 3d
9 answers

Well, the following approach may be useful, but it depends on your specific data. It is based on the assumptions that the gap between neighbouring positions is large enough (20 degrees is probably fine) and that the small point cloud approximates a surface (the latter can be worked around). I suggest using local feature matching (a popular technique in computer vision).

First, for each point of the large cloud, compute local descriptors (the analogue of SIFT or SURF for images). The most popular one for point clouds is the spin image:

Johnson, A., and Hebert, M. (1999). Using spin images for efficient object recognition in cluttered 3D scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(5), 433-449. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.23.8816&rep=rep1&type=pdf

An extended modification is used here:

Endres, F., Plagemann, C., Stachniss, C., and Burgard, W. (2009). Unsupervised discovery of object classes from range data using latent Dirichlet allocation. In Robotics: Science and Systems. Seattle, USA.

If it turns out to be hard to compute, ask me how to reduce the dimensionality without losing much discriminative power; I have done it before.

Then you have to match the descriptors, i.e. find the nearest neighbours for each of them in the high-dimensional descriptor space. If the small cloud was rotated 3 times, there should be 3 good nearest neighbours. However, because of self-similarities in the cloud, the matches will most likely contain noise. You still need to find an axis that fits a large number of matches well (though not all of them). Here you can use a robust fit such as RANSAC (you have to do some math to work out the probability of finding the axis with respect to the number of matches found). Note that this is different from the methods suggested by others: you fit just one line rather than a family of planes, and you fit it to the descriptor matches, not to the raw points (RANSAC would most likely fail to fit a plane given 4-5 correct points and 100K outliers).
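To illustrate the matching step only, here is a minimal Python sketch under the assumption that you already have a `descriptors` array with one fixed-length descriptor per point (the descriptor computation itself, e.g. spin images, is not shown; the function name, `k`, and `max_dist` are my own placeholder choices):

```python
from scipy.spatial import cKDTree

def match_descriptors(descriptors, k=4, max_dist=0.1):
    """Find candidate correspondences: pairs of points whose local
    descriptors are nearly identical (i.e. likely rotational copies).

    descriptors : (N, D) array, one descriptor per point of the large cloud.
    Returns a sorted list of index pairs (i, j) with i < j.
    """
    tree = cKDTree(descriptors)
    # Query k + 1 neighbours because the nearest neighbour of each
    # descriptor is the descriptor itself.
    dists, idxs = tree.query(descriptors, k=k + 1)
    pairs = set()
    for i, (drow, irow) in enumerate(zip(dists, idxs)):
        for d, j in zip(drow[1:], irow[1:]):  # skip the self-match
            if d <= max_dist:
                pairs.add((min(i, j), max(i, j)))
    return sorted(pairs)
```

The resulting pairs are the candidate correspondences that the robust axis fit would then be run on.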

Also note:

If your small cloud does not approximate a surface, you will have to come up with a different rotation-invariant descriptor instead of spin images.

To calculate the normals and perform a search, you can check this library: http://graphics.cs.msu.ru/en/science/research/3dpoint/lidark (a major release is coming soon).

+2

The following assumes that there are 3 or more instances. Consider the convex hull of the large point cloud. Find two faces that are parallel to each other. The rotation axis will be perpendicular to them. If you find more than one such pair, just test each orientation.

Obviously this will not work if the extreme points along the axis lie exactly on the axis, but in the general case that is very rare.
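A minimal SciPy sketch of this idea (the function name and tolerance are my own choices, not from the answer): it builds the convex hull, reads the outward facet normals from `hull.equations`, and reports directions for which two facets are parallel.

```python
import numpy as np
from scipy.spatial import ConvexHull

def candidate_axis_directions(points, tol=1e-5):
    """Unit vectors perpendicular to pairs of parallel convex-hull facets."""
    hull = ConvexHull(points)
    normals = hull.equations[:, :3]   # outward unit normals, one per facet
    directions = []
    for i in range(len(normals)):
        for j in range(i + 1, len(normals)):
            # Parallel facets have (almost exactly) opposite outward normals.
            if np.dot(normals[i], normals[j]) < tol - 1.0:
                directions.append(normals[i])
    return directions
```

Each returned direction is a candidate orientation for the rotation axis; its position still has to be determined separately.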

+2

Select any point and find two other points that are equidistant from it. This takes O(n^2) in the worst case, but heuristics can reduce it considerably. These three points uniquely define a circle. If there is a fourth point at the same distance from the first or the third point, that greatly increases your confidence in the circle.

The center of the circle is a point on the axis, and the normal vector of the circle's plane is the direction vector of the axis.

Given the axis, you can determine the angle between rotations and test your assumptions against some other points. If they do not hold, pick another point and start over.

Edit: by the way, this is obviously not deterministic, but if your point data is as clean as you say and you use good heuristics, it should be quite accurate.
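For reference, a small NumPy sketch of the circle step described above (my own helper, not part of the answer): given three points assumed to be rotational copies of one original point, it returns the circumcenter (a point on the axis) and the plane normal (the axis direction).

```python
import numpy as np

def axis_from_three_points(p1, p2, p3):
    """Circumcenter and plane normal of the circle through three 3D points.

    Returns (center, direction): a point on the candidate rotation axis and
    a unit vector along it.
    """
    a, b = p1 - p3, p2 - p3
    cross = np.cross(a, b)
    denom = 2.0 * np.dot(cross, cross)
    if denom < 1e-12:
        raise ValueError("the three points are (nearly) collinear")
    # Standard circumcenter formula for a triangle embedded in 3D.
    center = p3 + np.cross(np.dot(a, a) * b - np.dot(b, b) * a, cross) / denom
    return center, cross / np.linalg.norm(cross)
```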

+1

If you find the centroid C of the larger point cloud, it seems to me the original axis of rotation should pass through that point.

Correction: I missed the requirement that the rotation need not cover the full circle. For your 0, 20, 40, 60 example the centroid will not lie on the rotation axis.

Maybe the following ref will help?

"Reconstruction of surfaces of rotation with partial sampling" http://portal.acm.org/citation.cfm?id=980064

0

Some points :)

  • If we have 1 original point and never rotate it, there is an infinite number of possible axes.
  • If we have 1 original point rotated 1 time, there is still an infinite number of possible axes.
  • If we have 1 original point rotated 2 times, we can find the axis, because 3 points uniquely define a plane. The axis is the perpendicular to that plane passing through the point equidistant from all three points (1 original + 2 rotated).
  • Note that a rotation by 360 degrees makes no sense.

  • Select any point (P) from the cloud.
  • Find the locus (L) of point P, i.e. its rotated copies (since P rotates around some axis, L lies on a circle).
  • The perpendicular to the plane of the circle L, passing through its center, is the rotation axis of the cloud.

The point is that the rotation axis of a rigid body is the same as the rotation axis of any individual particle in that body, so we do not need to care about all the other points.
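If more than three copies of P have been identified, the circle can be fitted to all of them. Below is a minimal least-squares sketch (the grouping of copies into `locus` is assumed to have been done already; the helper name is mine): it fits the plane of the circle with an SVD and then solves for the circle center within that plane.

```python
import numpy as np

def axis_from_locus(locus):
    """Rotation axis from several rotated copies of one point.

    locus : (M, 3) array of points assumed to lie on a single circle, M >= 3.
    Returns (point_on_axis, unit_direction).
    """
    m = locus.mean(axis=0)
    # Plane of the circle by least squares: the normal is the right singular
    # vector with the smallest singular value.
    _, _, vt = np.linalg.svd(locus - m)
    u, v, normal = vt
    # 2D coordinates of the copies within the plane.
    xy = (locus - m) @ np.stack([u, v], axis=1)
    # Circle fit: x^2 + y^2 = 2*cx*x + 2*cy*y + c is linear in (cx, cy, c).
    A = np.column_stack([2.0 * xy, np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    (cx, cy, _), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    center = m + cx * u + cy * v          # circle center = point on the axis
    return center, normal
```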

0

Take a look at the methods used in stereo imaging to compute the homography between two images - your problem of matching point cloud sets seems comparable to matching corresponding points across several images of the same object/scene. It sounds like you could apply a RANSAC algorithm to compute the transformation between your point cloud sets.

0

Crazy idea ...

If the same point is rotated around the same axis several times, all of its copies lie in the same plane. Depending on your dataset, you might be able to find this plane with RANSAC.

The rotation axis will be perpendicular to this plane, and it is relatively easy to pin down the position of the axis once its orientation is known.
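A bare-bones RANSAC plane fit along these lines might look like the sketch below (the iteration count and threshold are arbitrary placeholders). Note the caveat from the spin-image answer: with ~100K points and only a handful lying on any one such plane, a naive plane RANSAC over the raw points may struggle.

```python
import numpy as np

def ransac_plane(points, n_iter=2000, eps=1e-4, rng=None):
    """Plane with the most inliers: returns (point_on_plane, unit_normal)."""
    rng = np.random.default_rng(rng)
    best_count, best_point, best_normal = 0, None, None
    for _ in range(n_iter):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue                        # degenerate (collinear) sample
        normal /= norm
        dist = np.abs((points - p1) @ normal)   # point-to-plane distances
        count = int((dist < eps).sum())
        if count > best_count:
            best_count, best_point, best_normal = count, p1, normal
    return best_point, best_normal
```

The axis direction is then the returned normal; its position within the plane could be recovered afterwards, for example by fitting a circle to the inliers.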

0

There are two things you should consider:

  • The angular span of the point cloud.
  • The angle of rotation.

Now, if (rotation > span), the solution is simpler, because you can look for a repeating template and, based on its occurrences, try to match a larger template.

In the case of (rotation < span), you will have to consider edge effects, because in the interior region (after the first rotation and before the last one) you will still have a symmetric point cloud with symmetry angle = rotation angle, as in the case above.

If you do not know which category you fall into, it is safer to assume the second one.

As mentioned earlier, RANSAC is the best way to do the template matching, since it takes less time and produces decent results. The only problem is estimating the angular span of the mini point cloud during initialization; this estimate is difficult to make. Therefore, if you have enough processing power/time, I would suggest iterating in 1 degree increments, starting at a modest 5 degrees and continuing up to, say, 45 degrees. As the results begin to converge, increase the angular accuracy.

0

Since the source point cloud is small, RANSAC might be the easiest solution:

  • Pick three points at random.
  • Compute the rotation axis for these points (the line perpendicular to the circle through them, passing through its center).
  • Do the other points agree?
  • If not, try again until the correct axis is found.

The probability of a correct guess is 1/((n-1)(n-2)), where n is the number of points in the original cloud. Since each test can be carried out very quickly, this can be a useful approach.
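A minimal sketch of this loop, reusing the hypothetical `axis_from_three_points` helper from the circle-based answer above (the thresholds `eps` and `min_frac` and the iteration count are my own placeholders): it rotates the whole cloud about the candidate axis by the angle implied by the sampled points and counts how many rotated points land on existing points.

```python
import numpy as np
from scipy.spatial import cKDTree

def rotate_about_axis(points, center, direction, angle):
    """Rodrigues rotation of an (N, 3) array about the line (center, direction)."""
    k = direction / np.linalg.norm(direction)
    p = points - center
    c, s = np.cos(angle), np.sin(angle)
    return p * c + np.cross(k, p) * s + np.outer(p @ k, k) * (1.0 - c) + center

def ransac_axis(points, n_iter=500, eps=1e-4, min_frac=0.5, rng=None):
    """Sample 3 points, build a candidate axis, and check global agreement."""
    rng = np.random.default_rng(rng)
    tree = cKDTree(points)
    for _ in range(n_iter):
        i, j, k = rng.choice(len(points), 3, replace=False)
        try:
            center, direction = axis_from_three_points(points[i], points[j], points[k])
        except ValueError:
            continue                       # collinear sample, resample
        # Angle of the rotation that would take point i onto point j.
        a = points[i] - center
        b = points[j] - center
        a -= np.dot(a, direction) * direction
        b -= np.dot(b, direction) * direction
        angle = np.arctan2(np.dot(np.cross(a, b), direction), np.dot(a, b))
        # Rotate the whole cloud and count points landing on existing points.
        # min_frac is a heuristic: with partial angular coverage, not every
        # point maps onto an existing one.
        moved = rotate_about_axis(points, center, direction, angle)
        dists, _ = tree.query(moved)
        if (dists < eps).mean() >= min_frac:
            return center, direction
    return None
```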

0
