How to get the coordinates of a point in a coordinate system based on angle and distance

How do I get the coordinates of a point when all I have is the coordinates of the origin (x, y), the angle from the origin to the point, and the distance from the origin to the point?

+7
4 answers

You can use Math.cos and Math.sin, for example:

 pointX = x + distance * Math.cos(angle)
 pointY = y + distance * Math.sin(angle)

Note that Math.cos and Math.sin expect the argument in radians. If you have an angle in degrees, use Math.cos( Math.toRadians(angle) ).
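A minimal runnable Java sketch of this answer (the class and method names here are my own, not from the original):

```java
public class PolarPoint {
    /**
     * Returns the coordinates of the point that lies `distance` away
     * from the origin (x, y) at the given angle (in radians).
     */
    static double[] pointFromAngle(double x, double y,
                                   double distance, double angleRadians) {
        double pointX = x + distance * Math.cos(angleRadians);
        double pointY = y + distance * Math.sin(angleRadians);
        return new double[] { pointX, pointY };
    }

    public static void main(String[] args) {
        // 5 units from (1, 2) at 90 degrees: straight up along the y axis.
        double[] p = pointFromAngle(1, 2, 5, Math.toRadians(90));
        System.out.printf("(%.3f, %.3f)%n", p[0], p[1]); // (1.000, 7.000)
    }
}
```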

+22

If d is the distance and A is the angle, then the coordinates of the point will be

(x + d * cos(A), y + d * sin(A))

+3

If r is the distance from the origin, and a is the angle (in radians) between the x axis and the point, you can easily calculate the coordinates with a transformation from polar coordinates:

 x = r*cos(a)
 y = r*sin(a)

(this assumes that the origin is placed at (0,0) , otherwise you must add the offset to the final result).

The inverse conversion is obtained by computing the modulus (length) of the vector (since distance + angle define a vector) and the arctangent, which can be computed with the atan2 function:

 r = sqrt(x*x + y*y)
 a = atan2(y, x)
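The round trip from this answer can be sketched in Java like so (helper names are mine, assuming the origin at (0,0)):

```java
public class PolarRoundTrip {
    // Cartesian -> polar: r is the vector's length, a its angle in radians.
    static double[] toPolar(double x, double y) {
        double r = Math.sqrt(x * x + y * y); // or Math.hypot(x, y)
        double a = Math.atan2(y, x);         // atan2 handles all four quadrants
        return new double[] { r, a };
    }

    // Polar -> Cartesian, origin assumed at (0, 0).
    static double[] toCartesian(double r, double a) {
        return new double[] { r * Math.cos(a), r * Math.sin(a) };
    }

    public static void main(String[] args) {
        double[] polar = toPolar(3, 4);      // 3-4-5 triangle: r = 5
        double[] back  = toCartesian(polar[0], polar[1]);
        System.out.printf("r=%.2f x=%.2f y=%.2f%n", polar[0], back[0], back[1]);
    }
}
```

Using Math.atan2 instead of Math.atan(y / x) avoids a division by zero at x = 0 and gives the correct angle in every quadrant.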
+3
 px = x + r * cos(phi)
 py = y + r * sin(phi)

where [px py] is the point you are looking for, [x y] is the origin, r is the distance, and phi is the angle from the origin to the target.

EDIT: This link, posted by Bart Kearse, may provide some background: http://en.wikipedia.org/wiki/Polar_coordinate_system

+2
