If you could calculate the angle subtended by the image of the target, then the distance to the target should be proportional to cot (i.e. 1/tan) of that angle. You should find that the number of pixels in the image corresponds approximately to that angle, but I doubt the relationship is completely linear, especially up close.
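To make the geometry concrete (W and θ here are just illustrative symbols, not measurements from your setup): if the target has a known physical width W and its image spans an angle θ at the camera, the triangle from the camera to the two edges of the target gives

    distance = (W / 2) / tan(θ / 2) = (W / 2) * cot(θ / 2)

and if the pixel width of the target were exactly proportional to θ, this would reduce to the dist = a * cot(b * width) form suggested below.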
The behaviour of your camera's lens is likely to affect this measurement, so it will depend on your exact setup.
Why not measure the size of the target at several distances and build a scatter plot? Then you can fit a curve to the data to get a size → distance function for your specific system. If your camera is close to an "ideal" camera, you should find that this graph looks like cot, and you should be able to find values a and b such that dist = a * cot(b * width).
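As a rough sketch of how that fit could be done once you have a handful of measurements (assuming Python with NumPy and SciPy available; the widths and distances below are made-up placeholders, not real data):

    import numpy as np
    from scipy.optimize import curve_fit

    # Replace with your own measurements: pixel width of the target
    # at several known distances (units are up to you, e.g. cm).
    widths_px = np.array([320.0, 160.0, 105.0, 80.0, 64.0])
    dists_cm  = np.array([ 50.0, 100.0, 150.0, 200.0, 250.0])

    def model(width, a, b):
        # dist = a * cot(b * width), with cot(x) = 1 / tan(x)
        return a / np.tan(b * width)

    # Rough initial guesses for a and b help the fit converge.
    (a, b), _ = curve_fit(model, widths_px, dists_cm, p0=(10.0, 1e-3))
    print("a =", a, " b =", b)

    # Estimate the distance for a new pixel width:
    print("estimated distance:", model(120.0, a, b))

If the fitted curve does not match your data well, especially at close range, that is a sign your lens deviates from the ideal model, and a different functional form (or a simple lookup table with interpolation) may work better.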
If you try this experiment, why not post the results here so that others can benefit?
[Edit: note on “ideal” cameras]
For a camera image to look "right" to us, the image must approximate a projection onto a plane held in front of the eye (since we view camera images by holding a flat print in front of our eyes). Imagine holding a sheet of tracing paper in front of your eye and tracing the silhouettes of objects onto the paper. The second diagram on this page shows what I mean. A camera that produces exactly this can be described as an ideal camera.
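In pinhole-model terms (the standard textbook description of such an ideal camera; the symbols are my own, not from your setup), a point at coordinates (X, Y, Z) in front of the eye lands on the tracing paper at

    x_image = f * X / Z
    y_image = f * Y / Z

where f is the distance from the eye to the paper, i.e. the focal length.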
Of course, real-life cameras do not work with tracing paper but with lenses. Very complicated lenses. Look at the lens diagram on this page. For reasons you could spend a lifetime studying, it is very hard to make a lens that behaves exactly like the tracing-paper example under all conditions. Start with this wiki page if you want to know more.
So you are unlikely to be able to calculate the exact relationship between pixel width and distance: you will have to measure it and fit a curve.