How to convert from 2D points to 3D points on a plane


I have some 3D coplanar points. The plane is defined by a normal vector and a constant. I need to work with the points in 2D and then convert them back to 3D. To convert the points to 2D, I built a quaternion that rotates the plane onto the xy-plane, and I can transform the points with that (simply discarding the z-coordinate). After modifying the points in 2D I want to convert them back to 3D, and that's where I got stuck. I noticed that the inverse of the same quaternion rotates a point back, but I also need to translate it somehow, involving the plane's constant, and I don't know how to do that. Thanks!

Best answer:

Attach a reference frame to the plane. You know the plane's normal $n$ and you know the constant $d$. The plane equation is $n \cdot r = d $ , where $r =(x,y,z)$.

Such a frame is not unique. The frame is defined by a point $P_1 = (x_1, y_1, z_1)$ on the plane (any point will do), and a rotation matrix $R = [u_1, u_2, n]$ whose columns are the unit vectors $u_1$, $u_2$, and $n$.

To specify the rotation matrix $R$, you need to choose a unit vector $u_1$ such that

$ u_1 \cdot n = 0 $ and $u_1 \cdot u_1 = 1 $

Further set $u_2 = n \times u_1 $

Vector $n$ is also assumed to be a unit vector. If it is not, normalize it and divide the constant $d$ by the same norm, so the plane equation is unchanged.

For example, if $n$ is given as $[1, 2, 3]$ then normalizing it we get

$ n = [1, 2 , 3] / \sqrt{14} $
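As a quick numpy sketch of this normalization step (the constant $d = 5$ here is a made-up example value, not from the question):

```python
import numpy as np

# Unnormalized plane: n_raw . r = d_raw  (d_raw = 5 is a hypothetical example)
n_raw = np.array([1.0, 2.0, 3.0])
d_raw = 5.0

# Normalize n and divide d by the same norm; the plane itself is unchanged
norm = np.linalg.norm(n_raw)  # sqrt(14)
n = n_raw / norm
d = d_raw / norm
```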

$u_1$ can be chosen as follows

$ u_1 = [2, -1, 0 ] / \sqrt{ 5 } $

or

$u_1 = [3, 0, -1] / \sqrt{10} $

or

$u_1 = [0, 3, -2] / \sqrt{13} $

etc.

Once you've selected $u_1$, you can compute $u_2$ in a unique way as $u_2 = n \times u_1$.
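A minimal numpy sketch of the frame construction, using the example normal and the first choice of $u_1$ above:

```python
import numpy as np

n = np.array([1.0, 2.0, 3.0]) / np.sqrt(14)   # unit normal from the example
u1 = np.array([2.0, -1.0, 0.0]) / np.sqrt(5)  # unit vector with u1 . n = 0
u2 = np.cross(n, u1)                          # completes the frame

# Rotation matrix with columns u1, u2, n
R = np.column_stack([u1, u2, n])
```

Since $u_1$ and $n$ are orthonormal, $u_2 = n \times u_1$ is automatically a unit vector, and $R$ is orthogonal with determinant $+1$.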

Now the $2D$ points are generated from the $3D$ points as follows

$ (u_i, v_i, 0) = R^T \bigg( (x_i, y_i, z_i) - (x_1,y_1,z_1) \bigg) $

The third coordinate of the right-hand side is always zero, because each point lies on the plane.
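Continuing the sketch, one convenient choice of base point is $P_1 = d\,n$, which lies on the plane because $n \cdot (d\,n) = d$ when $n$ is a unit vector (the constant and the test point below are made-up examples):

```python
import numpy as np

n = np.array([1.0, 2.0, 3.0]) / np.sqrt(14)
d = 5.0 / np.sqrt(14)
u1 = np.array([2.0, -1.0, 0.0]) / np.sqrt(5)
R = np.column_stack([u1, np.cross(n, u1), n])

P1 = d * n                      # base point on the plane: n . P1 = d
p = np.array([5.0, 0.0, 0.0])   # an example coplanar point: n . p = d

local = R.T @ (p - P1)          # (u, v, 0) in plane coordinates
uv = local[:2]                  # the 2D point; local[2] is ~0
```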

Now you can modify the set of $2D$ points any way you like; to convert them back into $3D$, use the following formula

$ (x_i, y_i, z_i) = (x_1, y_1, z_1) + R (u_i, v_i , 0 ) $
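Putting both directions together in numpy (same example frame as above; the helper names `to_2d` and `to_3d` are just illustrative):

```python
import numpy as np

n = np.array([1.0, 2.0, 3.0]) / np.sqrt(14)
d = 5.0 / np.sqrt(14)
u1 = np.array([2.0, -1.0, 0.0]) / np.sqrt(5)
R = np.column_stack([u1, np.cross(n, u1), n])
P1 = d * n

def to_2d(p):
    """Project a coplanar 3D point to plane coordinates (u, v)."""
    return (R.T @ (p - P1))[:2]

def to_3d(uv):
    """Lift a 2D point (u, v) back onto the plane in 3D."""
    return P1 + R @ np.array([uv[0], uv[1], 0.0])

p = np.array([5.0, 0.0, 0.0])   # example coplanar point
q = to_3d(to_2d(p))             # round trip recovers p, since R R^T = I
```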