I am trying to work out how to calculate the distance between two points in an image when the camera views the plane at a known angle, i.e. the camera's view is projected as a trapezium onto the plane.
If the camera is parallel to the plane, it's simple enough to calculate a scale factor for each pixel on the plane from its distance to the camera and the camera parameters, but how do I account for an arbitrary tilt of the camera around any axis?
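For concreteness, here is a minimal sketch of the parallel case described above. The focal length, pixel size, and distance values are just illustrative assumptions; the key point is that every pixel covers the same patch of the plane, so a single scale factor works:

```python
import math

# Assumed example values; focal length and pixel size in the same units (mm).
focal_length = 4.0           # mm
pixel_size = 0.0014          # mm per pixel (1.4 um)
distance_to_plane = 1000.0   # mm, perpendicular distance from camera to plane

# In the fronto-parallel case, similar triangles give a uniform scale:
# one pixel on the sensor covers (distance * pixel_size / focal_length) mm
# on the plane.
mm_per_pixel = distance_to_plane * pixel_size / focal_length

def plane_distance(p1, p2):
    """Distance on the plane between two image points given in pixels."""
    dx = (p2[0] - p1[0]) * mm_per_pixel
    dy = (p2[1] - p1[1]) * mm_per_pixel
    return math.hypot(dx, dy)

print(plane_distance((100, 100), (400, 500)))  # approx. 175 mm
```

This is exactly the single scale factor I can already compute; the question is what replaces `mm_per_pixel` once the camera is tilted and the scale varies across the image.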
I have come across homographic transformations in my research, but I'm not sure whether they are suitable, as I do not have any reference points in the image to match against. The known factors are: the camera's orientation as x, y, z angles; the perpendicular distance from the sensor to the plane; the camera sensor dimensions/pixel size; the focal length (focused at infinity); and the pixel coordinates of any given point in the image.