I have an unknown vector $\vec{x}$ in 3D space. I do know $\vec{p_1}$, $\vec{p_2}$, $\vec{p_3}$, the orthogonal projections of $\vec{x}$ onto lines $P_1$, $P_2$, $P_3$, all passing through the origin and pairwise non-parallel. From $\vec{p_1}$, $\vec{p_2}$ and $\vec{p_3}$ I can compute the unit vectors $\vec{e_{p1}}$, $\vec{e_{p2}}$, $\vec{e_{p3}}$ in the direction of the lines, and from those the linear transformation matrices $A_1$, $A_2$ and $A_3$ of the projections, since $\vec{p_1} = (\vec{x} \cdot \vec{e_{p1}})\, \vec{e_{p1}}$.
So I have:
$\vec{p_1} = A_1 \vec{x}$
$\vec{p_2} = A_2 \vec{x}$
$\vec{p_3} = A_3 \vec{x}$
Since no single projection matrix $A_i$ is invertible, how can I calculate $\vec{x}$? It is certainly uniquely determined. In 2D I can show that $\vec{p_1} + \vec{p_2} = (A_1+A_2) \vec{x}$ is solvable for $\vec{x}$ whenever $\vec{p_1}$ and $\vec{p_2}$ are not parallel, but the corresponding proof in 3D is much harder, so maybe there is an easier way.
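For concreteness, here is a minimal numpy sketch of my setup; the particular $\vec{x}$ and line directions are arbitrary examples I made up, not part of the problem:

```python
import numpy as np

# Example data: an "unknown" x and three non-coplanar line directions
# (arbitrary choices, just for illustration).
x = np.array([1.0, 2.0, 3.0])
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
e3 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)

def proj(e):
    """Projection matrix onto the line spanned by unit vector e: A = e e^T."""
    return np.outer(e, e)

A1, A2, A3 = proj(e1), proj(e2), proj(e3)

# The projections p_i = A_i x are what I actually know.
p1, p2, p3 = A1 @ x, A2 @ x, A3 @ x
```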
An easy way is to assemble the matrices as $$\begin{bmatrix}A_1\\A_2\\A_3\end{bmatrix}x=\begin{bmatrix}p_1\\p_2\\p_3\end{bmatrix}$$ then use the least squares method to find $x$. This gives the solution $$x=(A_1^TA_1+A_2^TA_2+A_3^TA_3)^{-1}(A_1^Tp_1+A_2^Tp_2+A_3^Tp_3)$$ Since each $A_i$ is symmetric and idempotent, $A_i^TA_i=A_i$ and $A_i^Tp_i=A_ip_i=p_i$, so this simplifies to $$x=(A_1+A_2+A_3)^{-1}(p_1+p_2+p_3)$$ where the sum is invertible provided the three lines span $\mathbb{R}^3$, i.e. are not coplanar.
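A short numpy sketch of this (with made-up example data; `np.linalg.lstsq` on the stacked system and the normal-equations formula give the same result):

```python
import numpy as np

# Hypothetical example data: non-coplanar unit directions and an x to recover.
x_true = np.array([1.0, 2.0, 3.0])
es = [np.array(v, float) / np.linalg.norm(v)
      for v in ([1, 0, 0], [1, 1, 0], [1, 1, 1])]
As = [np.outer(e, e) for e in es]   # projection matrices A_i = e_i e_i^T
ps = [A @ x_true for A in As]       # the known projections p_i

# Stack the three 3x3 systems into one overdetermined 9x3 least-squares problem.
A_stacked = np.vstack(As)
p_stacked = np.concatenate(ps)
x, *_ = np.linalg.lstsq(A_stacked, p_stacked, rcond=None)

# Equivalently, solve the normal equations directly.
x_ne = np.linalg.solve(sum(A.T @ A for A in As),
                       sum(A.T @ p for A, p in zip(As, ps)))
```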
If you're not familiar with that method, then suppose $$x=\alpha_1p_1+\alpha_2p_2+\alpha_3p_3$$ $$\therefore p_1=A_1 x=\alpha_1p_1+\alpha_2A_1p_2+\alpha_3 A_1p_3$$ where $A_1p_j$ can be calculated. Repeat for the other projections $p_i=A_ix=\sum_j\alpha_jA_ip_j$ to get three equations in $\alpha_i$.
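Sketching this second approach in numpy (same made-up example data as above): dotting the $i$-th vector equation with $\vec{e_i}$ reduces it to a single scalar equation, since both sides are multiples of $\vec{e_i}$, leaving a $3\times 3$ system for the $\alpha_i$.

```python
import numpy as np

# Hypothetical example data: non-coplanar unit directions and an x to recover.
x_true = np.array([1.0, 2.0, 3.0])
es = [np.array(v, float) / np.linalg.norm(v)
      for v in ([1, 0, 0], [1, 1, 0], [1, 1, 1])]
ps = [e * (e @ x_true) for e in es]   # known projections p_i = (e_i . x) e_i

# p_i = sum_j alpha_j A_i p_j; dotting with e_i gives the scalar system
#   e_i . p_i = sum_j alpha_j (e_i . p_j)
G = np.array([[e @ p for p in ps] for e in es])   # G[i, j] = e_i . p_j
rhs = np.array([e @ p for e, p in zip(es, ps)])   # e_i . p_i
alpha = np.linalg.solve(G, rhs)

# Reassemble x from the ansatz x = alpha_1 p_1 + alpha_2 p_2 + alpha_3 p_3.
x = sum(a * p for a, p in zip(alpha, ps))
```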