I'm designing a device that measures distance using a laser and a camera. The camera and laser are separated by 100 mm and point in exactly the same direction. This is done so the camera can determine how far away an object is from where the laser spot appears in the image relative to the image center. If the device points at a distant object, the laser spot appears close to the center of the image; conversely, if the object is very close, the spot is offset by nearly the full 100 mm baseline. In my head it seems possible, but how would I do this?
I was thinking of posting this to StackOverflow but I decided this was more of a math question rather than a code question.
Edit: I found a video of this method being used here.
See this Wikipedia page about the pinhole camera model.
Assuming that the central target point of the camera and the point $P$ projected by the laser beam lie on a plane parallel to the image plane, you have a situation like in the picture:
You want to compute $x_3$ from $y_1$, the offset of the laser spot from the center of the image; $f$, the focal length; and $x_1$, the distance between the laser and the camera aperture $O$:
$$x_3=\frac{f}{y_1}x_1$$
In practice, you don't measure $y_1$ directly on the image sensor but as a distance in pixels, so you need a value of $f$ expressed in the same pixel units. You can obtain it by calibrating at a known distance $x_{3c}$, where the measured offset is $y_{1c}$:
$$f=x_{3c}\frac{y_{1c}}{x_1}$$
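The two formulas above can be sketched in code as a calibrate-then-measure procedure. This is a minimal illustration, not a full implementation: the function names are my own, offsets are assumed to be measured in pixels from the image center, and the example numbers (a 100 mm baseline, calibration at 1000 mm) are made up.

```python
def calibrate_f(x3_cal_mm: float, y1_cal_px: float, x1_mm: float) -> float:
    """Effective focal length f in pixel units, from one measurement
    at a known distance: f = x3c * y1c / x1."""
    return x3_cal_mm * y1_cal_px / x1_mm

def distance_mm(f_px: float, y1_px: float, x1_mm: float) -> float:
    """Object distance from the measured spot offset: x3 = (f / y1) * x1."""
    return f_px / y1_px * x1_mm

# Hypothetical numbers: baseline x1 = 100 mm; at a known distance of
# 1000 mm the laser spot appears 80 px from the image center.
f = calibrate_f(1000.0, 80.0, 100.0)   # f = 800 px
print(distance_mm(f, 40.0, 100.0))     # spot at 40 px -> 2000.0 mm
```

Note the reciprocal relationship: halving the pixel offset doubles the computed distance, which also means the resolution of the measurement degrades quickly at long range.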
You will have errors due to the target not lying on a plane perfectly parallel to the image plane, the $x_3$ axis not being perfectly parallel to the laser beam, and radial lens distortion.