Ok, so I have a system of $6$ inequalities in $3$ variables, and a point that may or may not satisfy it. Checking whether the point satisfies the inequalities is straightforward; my problem is, when it does not, finding the closest point that does.
I will present an example of such a system, $Ax \le b$:
A=
[ C11, C12, C13]
[ -C21, -C22, -C23]
[ C31, C32, C33]
[ -C41, -C42, -C43]
[ C51, C52, C53]
[ -C61, -C62, -C63]
b=
[ Cb1]
[ Cb2]
[ Cb3]
[ Cb4]
[ Cb5]
[ Cb6]
Pxyz=
[ pX, pY, pZ]
Does Pxyz solve $Ax \le b$?
if all(A*Pxyz<=b)
accept point
else
get the closest point to the given Pxyz (by the Euclidean distance)
that solves the system. How?
end
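The feasibility test in the pseudocode above is a one-liner in practice. Here is a sketch in Python/NumPy (the matrix entries below are made-up placeholders standing in for the `C` constants, chosen so the system describes the unit box):

```python
import numpy as np

# Hypothetical 6x3 system A x <= b; these placeholder entries
# encode the unit box 0 <= x, y, z <= 1.
A = np.array([[ 1.,  0.,  0.], [-1.,  0.,  0.],
              [ 0.,  1.,  0.], [ 0., -1.,  0.],
              [ 0.,  0.,  1.], [ 0.,  0., -1.]])
b = np.array([1., 0., 1., 0., 1., 0.])

def satisfies(A, b, p):
    """True iff every inequality of A p <= b holds."""
    return bool(np.all(A @ p <= b))

print(satisfies(A, b, np.array([0.5, 0.5, 0.5])))  # inside the box -> True
print(satisfies(A, b, np.array([1.5, 0.5, 0.5])))  # outside the box -> False
```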
EDIT: For completeness, to find a point that does satisfy these inequalities, one could run Fourier-Motzkin elimination to get a range for the first variable, choose a value within that range, repeat Fourier-Motzkin for the second variable, choose a value within range, then repeat for the third variable and pick the last value. Yet this approach does not necessarily provide the closest point to Pxyz that satisfies the inequalities.
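A single Fourier-Motzkin step as described above (eliminating one variable so the remaining system bounds the others) can be sketched as follows. This is a minimal illustration, not a tuned implementation, and the example system is again the hypothetical unit box:

```python
import numpy as np

def fm_eliminate(A, b, k):
    """One Fourier-Motzkin step: eliminate variable k from A x <= b.
    Returns (A2, b2) over the remaining variables (column k dropped)."""
    pos  = [i for i in range(len(b)) if A[i, k] > 0]
    neg  = [i for i in range(len(b)) if A[i, k] < 0]
    zero = [i for i in range(len(b)) if A[i, k] == 0]
    rows, rhs = [], []
    for i in zero:                      # rows not involving x_k carry over
        rows.append(np.delete(A[i], k))
        rhs.append(b[i])
    for i in pos:                       # row i gives an upper bound on x_k
        for j in neg:                   # row j gives a lower bound on x_k
            # normalize the k-coefficients to +1 and -1, then add the rows,
            # which cancels x_k
            rows.append(np.delete(A[i] / A[i, k] - A[j] / A[j, k], k))
            rhs.append(b[i] / A[i, k] - b[j] / A[j, k])
    return np.array(rows), np.array(rhs)

# Example: the unit box 0 <= x, y, z <= 1 written as A x <= b
A = np.array([[ 1.,  0.,  0.], [-1.,  0.,  0.],
              [ 0.,  1.,  0.], [ 0., -1.,  0.],
              [ 0.,  0.,  1.], [ 0.,  0., -1.]])
b = np.array([1., 0., 1., 0., 1., 0.])

A2, b2 = fm_eliminate(A, b, 0)  # a reduced system in y and z only
```

Repeating the step on the reduced system eliminates the next variable, exactly as the EDIT outlines.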
The solution can be found via the method known as least squares. The general form of the equation is: $$A^T A \vec x = A^T \vec b$$ In your case: $$\left[ \begin{array}{cccccc} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} & a_{16} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{25} & a_{26} \\ a_{31} & a_{32} & a_{33} & a_{34} & a_{35} & a_{36} \\ \end{array} \right] \times \left[ \begin{array}{ccc} a_{11} & a_{21} & a_{31} \\ a_{12} & a_{22} & a_{32} \\ a_{13} & a_{23} & a_{33} \\ a_{14} & a_{24} & a_{34} \\ a_{15} & a_{25} & a_{35} \\ a_{16} & a_{26} & a_{36} \\ \end{array} \right] \times \left[ \begin{array}{c} x_1 \\ x_2 \\ x_3 \end{array} \right]= \left[ \begin{array}{cccccc} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} & a_{16} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{25} & a_{26} \\ a_{31} & a_{32} & a_{33} & a_{34} & a_{35} & a_{36} \\ \end{array} \right] \times \left[ \begin{array}{c} b_1 \\ b_2 \\ b_3 \\ b_4 \\ b_5 \\ b_6 \end{array} \right]$$
By request, I will expand further upon the solution. Sorry for being overly brief; I was running short on time when I originally wrote the answer, so here is a fuller explanation.
Based upon my understanding of the OP's question, he wants to know how to determine whether the point $\vec x$ he has is a solution. That is trivial: substitute the values into the inequalities and check that each left-hand side does not exceed the corresponding right-hand side.
As for finding the closest point, that is exactly what least squares does, since it minimizes the squared error. If the direct test above fails, the conclusion is that the system is over-determined: six equations in only three unknowns, so the unknowns span just three basis vectors. Thus the least-squares solution applies.
The product of the two matrices (which I did not have time to write out originally) yields a system of three independent equations with one unique solution, provided the columns of $A$ are linearly independent. That solution is the closest point. Because of the size of the expressions when fully expanded, I have switched to matrix notation. $$A^T A \vec x = A^T \vec b \leftrightarrow \left[ A^T A \right] \vec x = A^T \vec b$$ $$\left[ A^T A \right]^{-1} \left[ A^T A \right] \vec x =\left[ A^T A \right]^{-1} A^T \vec b$$ $$\vec x = \left[ A^T A \right]^{-1} A^T \vec b$$
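The closed form above is easy to evaluate numerically. A sketch in Python/NumPy, assuming the columns of $A$ are independent (the placeholder data is random; in practice the library routine `np.linalg.lstsq` is the numerically safer route than forming $A^T A$):

```python
import numpy as np

# Over-determined 6x3 system with placeholder random entries
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# x = (A^T A)^{-1} A^T b, computed via a linear solve
# rather than an explicit matrix inverse
x = np.linalg.solve(A.T @ A, A.T @ b)

# The same result from the library's least-squares routine
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))  # True
```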