WPF: element search by path

Note: I have not yet accepted an answer to this question. The currently accepted answer was accepted automatically when the bounty expired.


This is about a programming game I am currently creating.

As you can see from the link above, I am making a game in which custom-programmed robots fight autonomously in an arena.


Now I need a way to determine whether a robot has detected another robot within a certain angle (depending on which way its turret is facing):

alt text http://img21.imageshack.us/img21/7839/robotdetectionrg5.jpg

As you can see from the image above, I have drawn the tanks' field of view, which I now need to imitate in my game: checking every point inside it to see whether another robot is in view.

Bots are simply canvases that are constantly translated around the battle arena (another canvas).

I know the turret's heading (the direction it is currently facing), and from that I need to find whether any bots are in its path (the path being the "field of view" shown as the red triangle in the image above; I hope the image makes clear what I am trying to convey).

I hope someone can point me to the math involved in solving this problem.


[UPDATE]

I tried the calculations suggested, but they do not work properly because, as you can see from the image, Bot1 should not be able to see Bot2. Here is an example:

alt text http://img12.imageshack.us/img12/7416/examplebattle2.png

In the above scenario, Bot1 checks whether it can see Bot2. The following is the data (following Waylon Flinn's answer):

 angleOfSight = 0.69813170079773179  // in radians (40 degrees)
 orientation = 3.3                   // Bot1's current heading (191 degrees)
 x1 = 518                            // Bot1 center X
 y1 = 277                            // Bot1 center Y
 x2 = 276                            // Bot2 center X
 y2 = 308                            // Bot2 center Y

 cx = x2 - x1 = 276 - 518 = -242
 cy = y2 - y1 = 308 - 277 = 31

 azimuth = Math.Atan2(cy, cx) = 3.0141873380511295

 canHit = (azimuth < orientation + angleOfSight/2) && (azimuth > orientation - angleOfSight/2)
        = (3.0141873380511295 < 3.3 + 0.349065850398865895) && (3.0141873380511295 > 3.3 - 0.349065850398865895)
        = true

According to the above calculation, Bot1 can see Bot2, but, as you can see from the image, that should not be possible, since they are facing in different directions.

What am I doing wrong in the above calculations?

+6
math wpf 2d
9 answers

The angle between the robots is atan2(Δy, Δx) (most platforms provide this two-argument arctangent, which adjusts the quadrant for you). Then you just need to check whether that angle is within some threshold of the current heading.

EDIT: sorry for the vague answer, but here are some notes on your update:

  • The coordinates should be in a Cartesian system, not the window system. This leads to some different numbers:

    Orientation: 3.3 is not right; Bot1 looks like it is oriented at about 4 radians or so. 191 degrees would be around the 8:30 position on a clock face, which points almost directly at Bot2. No surprise the check returns "visible"!

    Azimuth: cy should be -31 (31 units down), not 31, because window y coordinates increase downward.

With these changes you should get the right results.
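To make the fix concrete, here is a small sketch (not from the original answer; `CanHit` is an illustrative helper) that redoes the update's numbers with the window y-axis flipped and this answer's eyeballed orientation of roughly 4 radians instead of 3.3:

```csharp
using System;

static class FovCheck
{
    // Returns true when the target at (x2, y2) falls inside the shooter's
    // field of view. Window y grows downward, so the y difference is negated.
    public static bool CanHit(double x1, double y1, double x2, double y2,
                              double orientation, double angleOfSight)
    {
        double cx = x2 - x1;
        double cy = -(y2 - y1);               // flip window y to Cartesian
        double azimuth = Math.Atan2(cy, cx);  // result is in (-pi, pi]
        if (azimuth < 0)
            azimuth += 2 * Math.PI;           // normalize to [0, 2*pi)
        return azimuth < orientation + angleOfSight / 2
            && azimuth > orientation - angleOfSight / 2;
    }

    static void Main()
    {
        // Numbers from the update; 4.0 is this answer's eyeballed guess
        // at Bot1's true on-screen orientation (not 3.3).
        Console.WriteLine(CanHit(518, 277, 276, 308, 4.0, 0.69813170079773179));
        // prints False: Bot1 cannot see Bot2
    }
}
```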

+3

Calculate the relative angle and distance of each robot with respect to the current one. If the angle is within a certain threshold of the current heading, and the distance is within the maximum viewing range, then it can see it.

The only tricky part is handling the boundary case where the angle wraps from 2π radians back to 0.
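One way to handle that boundary case (an illustrative sketch, not from the answer itself) is to compare the smallest signed difference between the two angles instead of the raw values, which sidesteps the 2π/0 wraparound entirely:

```csharp
using System;

static class AngleTest
{
    // Smallest signed difference between two angles, in (-pi, pi].
    public static double AngleDiff(double a, double b)
    {
        double d = (a - b) % (2 * Math.PI);
        if (d > Math.PI) d -= 2 * Math.PI;
        if (d <= -Math.PI) d += 2 * Math.PI;
        return d;
    }

    // True when the azimuth to the target is within half the FOV of the heading.
    public static bool InFov(double heading, double azimuth, double fov)
        => Math.Abs(AngleDiff(azimuth, heading)) <= fov / 2;

    static void Main()
    {
        // A target at azimuth 0.1 rad is visible to a bot heading 6.2 rad
        // (just below 2*pi) with a 0.7 rad field of view, despite the wrap.
        Console.WriteLine(InFov(6.2, 0.1, 0.7)); // prints True
    }
}
```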

+1

Something like this in your bot class (C # code):

 /// <summary>
 /// Check to see if another bot is visible from this bot's point of view.
 /// </summary>
 /// <param name="other">The other bot to look for.</param>
 /// <returns>True iff <paramref name="other"/> is visible to this bot with the current turret angle.</returns>
 private bool Sees(Bot other)
 {
     // Get the angle of the vector from this bot to the other bot, in degrees.
     // Note that Math.Atan2 takes (y, x), not (x, y).
     var actualAngle = Math.Atan2(other.Y - this.Y, other.X - this.X) * 180 / Math.PI;

     // Smallest absolute difference between the turret angle and that angle.
     // The +540 (180 + 360) and the modulo force negative angles to their
     // positive equivalents and fold the 0/360 boundary case into a simple
     // range test.
     var diff = Math.Abs((this.TurretAngle - actualAngle + 540) % 360 - 180);

     // Visible when the difference is within half the field of vision.
     return diff <= FOV_ANGLE / 2;
 }

Notes:

  • The +360 is there to force any negative angles to their corresponding positive values and to shift the boundary case at angle 0 into a slightly easier range test.
  • This could be done using only radians, but I find those messy and hard to read :/
  • See the Math.Atan2 documentation for more details.
  • I highly recommend looking into the XNA Framework, as it is tailored to game development. It does not use WPF, however.

This assumes that:

  • there are no obstacles blocking the view
  • the Bot class has X and Y properties
  • the X and Y properties point to the center of the bot
  • the Bot class has a TurretAngle property giving the turret's positive angle relative to the x-axis, measured counterclockwise
  • the Bot class has a static constant FOV_ANGLE giving the turret's field of view

Disclaimer: This is untested and has not even been checked for compilation; adapt it as necessary.

+1

A few suggestions from having implemented something similar (a long time ago!):

The following assumes you iterate over all the bots on the battlefield (not particularly good practice, but a quick and easy way to get something working!).

1) It is much cheaper to check whether a bot is in range before checking whether it is in the FOV, for example:

 double dx = my.Location.X - bot.Location.X;
 double dy = my.Location.Y - bot.Location.Y;
 double range = Math.Sqrt(dx * dx + dy * dy);

 if (range < maxRange)
 {
     // check for FOV
 }

This cheap check can greatly reduce the number of FOV checks and speed up the simulation. As a bonus, you could add some randomness here to make things more interesting: beyond a certain distance, make the probability of spotting a bot depend on its range.

2) This article apparently has some material on calculating the FOV.

3) As an AI graduate... have you tried neural networks? You could train one to recognize whether a robot is in range and a real target. That would sidestep any horribly complicated and confusing math! You could have a multi-layer perceptron [1], [2] take the coordinates of the bots and targets as input and produce a fire/no-fire decision at the output. WARNING: I feel obliged to tell you that this approach is not the easiest to get right and can be terribly frustrating when it goes wrong. Because of the (simple) non-deterministic nature of this family of algorithms, debugging can be a pain. You will also need some form of training: either back propagation (with worked examples) or a genetic algorithm (another tricky process to get right)! Given the choice, I would use option 3, but it is not for everyone!

+1

This will tell you whether the center of canvas2 falls within canvas1's firing arc. If you want to account for the width of canvas2, it gets a little trickier: in a nutshell, you would need two checks, one for each of the relevant corners of canvas2, instead of a single check on its center.

 // assuming canvas1 is firing on canvas2

 // positions of canvas1 and canvas2, respectively
 // (you're probably tracking these in your Tank objects)
 int x1, y1, x2, y2;

 // orientation of canvas1 (angle)
 // (you're probably tracking this in your Tank objects, too)
 double orientation;

 // angle available for firing
 // (ditto, Tank object)
 double angleOfSight;

 // vector from canvas1 to canvas2
 int cx, cy;

 // angle of the vector between canvas1 and canvas2
 double azimuth;

 // can canvas1 hit the center of canvas2?
 bool canHit;

 // find the vector from canvas1 to canvas2
 cx = x2 - x1;
 cy = y2 - y1;

 // calculate the angle of the vector
 azimuth = Math.Atan2(cy, cx);

 // correct for Atan2's range of (-pi, pi]
 if (azimuth < 0)
     azimuth += 2 * Math.PI;

 // determine whether canvas1 can hit canvas2
 // (the && could be eliminated with Math.Abs, but this seems more instructive)
 canHit = (azimuth < orientation + angleOfSight)
       && (azimuth > orientation - angleOfSight);
+1

This can be achieved easily using a concept from vector math called the dot product.

http://en.wikipedia.org/wiki/Dot_product

It may look intimidating, but it is not that bad. It is the most correct way to solve the FOV problem, and the beauty is that the same math works whether you are dealing with 2D or 3D (which is how you know it is right).

(NOTE: If anything is unclear, just ask in the comments section and I will fill in the missing pieces.)

Steps:

1) You need two vectors: one is the heading vector of the main tank; the other is derived from the positions of the tank in question and the main tank.

For our discussion, call the main tank's heading vector (ax, ay) and the vector between the main tank's position and the target tank's position (bx, by). For example, if the main tank is at (20, 30) and the target tank is at (45, 62), then b = (45-20, 62-30) = (25, 32).

Again for discussion, suppose the main tank's heading vector is (3, 4).

The main goal here is to find the angle between these two vectors, and the dot product can help you with that.

2) The dot product is defined as:

a · b = |a| |b| cos(angle)

This is read as "a dot b", since a and b are not numbers, they are vectors.

3) Or, expressed another way (after some algebraic manipulation):

angle = acos((a · b) / (|a| |b|))

Here angle is the angle between the two vectors a and b, and this information alone can tell you whether one tank can see the other.

|a| is the magnitude of the vector a; by the Pythagorean theorem it is simply sqrt(ax*ax + ay*ay), and the same applies to |b|.

Now the question is how to compute a · b (the dot product) so you can find the angle.

4) Here comes the rescue. It turns out the dot product can also be expressed as:

a · b = ax*bx + ay*by

So angle = acos((ax*bx + ay*by) / (|a| |b|))

If that angle is less than half of your FOV, the tank in question is in view. Otherwise, it is not.

So, using our sample numbers from above:

a = (3, 4), b = (25, 32)

|a| = sqrt(3*3 + 4*4)

|b| = sqrt(25*25 + 32*32)

angle = acos((3*25 + 4*32) / (|a| |b|))

(Remember to convert the resulting angle to degrees or radians as needed before comparing it to your FOV.)
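The steps above can be sketched in code (`AngleBetween` is an illustrative helper name, and the 40-degree FOV is just an assumed example value):

```csharp
using System;

static class DotProductFov
{
    // Angle in radians between vectors (ax, ay) and (bx, by),
    // via angle = acos((a . b) / (|a| |b|)).
    public static double AngleBetween(double ax, double ay, double bx, double by)
    {
        double dot = ax * bx + ay * by;                // ax*bx + ay*by
        double magA = Math.Sqrt(ax * ax + ay * ay);    // |a|
        double magB = Math.Sqrt(bx * bx + by * by);    // |b|
        return Math.Acos(dot / (magA * magB));
    }

    static void Main()
    {
        // Heading vector a = (3, 4); vector to the target b = (25, 32).
        double angle = AngleBetween(3, 4, 25, 32);
        Console.WriteLine(angle); // roughly 0.02 radians (about 1.1 degrees)

        // With, say, a 40-degree FOV, the target is visible when the
        // angle is less than half the FOV.
        double fov = 40 * Math.PI / 180;
        Console.WriteLine(angle < fov / 2); // prints True
    }
}
```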

+1

Considering both of your questions, I think that even once you have solved this with the math provided, you will still face many other problems: collision detection, bullets hitting, and so on. These are not trivial to solve, especially if your bots are not square. I would recommend looking at physics engines (Farseer on CodePlex is a good example for WPF), but that makes this a bigger project than a high school assignment.

The best advice I ever received for getting high marks: do something simple done well, rather than something brilliant done poorly.

0

Does your turret really have such a wide firing pattern? The path a bullet travels is a straight line; it does not get wider as it goes. You should have a simple vector in the direction the turret is pointing, representing the turret's kill zone. Each tank would have a bounding circle representing its vulnerable area. Then you can proceed as in ray tracing: a simple ray/circle intersection test. See section 3 of Intersection of Linear and Circular Components in 2D.
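A minimal sketch of such a ray/circle test (illustrative helper names; it treats the bullet path as a ray from the turret and the target tank as a bounding circle):

```csharp
using System;

static class RayCircle
{
    // Does a ray from (ox, oy) with direction (dx, dy) hit a circle
    // of radius r centered at (cx, cy)?
    public static bool RayHitsCircle(double ox, double oy, double dx, double dy,
                                     double cx, double cy, double r)
    {
        // Vector from the ray origin to the circle center.
        double fx = cx - ox, fy = cy - oy;

        // Project it onto the normalized ray direction.
        double len = Math.Sqrt(dx * dx + dy * dy);
        double ux = dx / len, uy = dy / len;
        double t = fx * ux + fy * uy;
        if (t < 0) return false;  // circle is behind the ray origin

        // Distance from the circle center to the closest point on the ray.
        double px = ox + t * ux, py = oy + t * uy;
        double distSq = (cx - px) * (cx - px) + (cy - py) * (cy - py);
        return distSq <= r * r;
    }

    static void Main()
    {
        // Turret at the origin firing along +x; target circle at (10, 1), radius 2.
        Console.WriteLine(RayHitsCircle(0, 0, 1, 0, 10, 1, 2));   // prints True
        // Same target, but firing along -x: the target is behind the turret.
        Console.WriteLine(RayHitsCircle(0, 0, -1, 0, 10, 1, 2));  // prints False
    }
}
```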

0

Your updated problem seems to come from different "zero" directions for orientation and azimuth: an orientation of 0 appears to mean "straight up", while an azimuth of 0 means "straight right".
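If that is the case, a small conversion helper can translate between the two conventions before comparing angles. This is a hypothetical sketch: it assumes the game's orientation is in radians with 0 = up and clockwise positive, which may not match your actual setup:

```csharp
using System;

static class Headings
{
    // Convert a screen-style heading (0 = up, clockwise positive, radians)
    // to a math-style azimuth (0 = right, counterclockwise positive, [0, 2*pi)).
    public static double HeadingToAzimuth(double heading)
    {
        double az = Math.PI / 2 - heading;
        az %= 2 * Math.PI;
        if (az < 0) az += 2 * Math.PI;
        return az;
    }

    static void Main()
    {
        Console.WriteLine(HeadingToAzimuth(0));            // pi/2: "up"
        Console.WriteLine(HeadingToAzimuth(Math.PI / 2));  // 0: "right"
    }
}
```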

0
