I have several 3D vectors $X_{i}$ which lie approximately in a plane. Now I need to find a single vector which is normal (as much as possible) to all of them.
For two vectors, I can use the cross product to obtain such a vector.
For multiple vectors, I found this formula:
$$\left( \sum_{i=1}^{n} X_{i}X_{i}^{\textbf{T}} \right)u=0$$
Here $u$ is the desired vector (a null vector of the covariance matrix).
Question: What is the reason behind this formula?
The formula you found will have no nonzero solution unless all the $X_i$ actually lie in the same plane! Instead, you should find the unit vector $u$ that minimizes $u^T\big(\sum\limits_{i=1}^n X_i X_i^T\big)u$. This is the least-squares solution to requiring $X_i^Tu = 0$ for all $X_i$ (I can elaborate on this if you want). The desired $u$ is the eigenvector corresponding to the smallest eigenvalue of $\sum\limits_{i=1}^n X_i X_i^T$.
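Here's a minimal NumPy sketch of the eigenvector approach, on made-up data: vectors near the plane $z=0$ (the data and the variable names are my own, just for illustration):

```python
import numpy as np

# Hypothetical nearly coplanar data: x, y ~ N(0,1), z ~ N(0, 1e-3),
# so the true normal is approximately (0, 0, ±1).
rng = np.random.default_rng(0)
Xs = np.column_stack(
    [rng.normal(size=2).tolist() + [rng.normal(scale=1e-3)] for _ in range(50)]
)  # shape (3, n); column i is the vector X_i

M = Xs @ Xs.T             # sum_i X_i X_i^T, a symmetric 3x3 matrix
w, V = np.linalg.eigh(M)  # eigh returns eigenvalues in ascending order
u = V[:, 0]               # unit eigenvector for the smallest eigenvalue
```

With this setup `u` comes out close to $(0,0,\pm 1)$, as expected.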
Define the $3\times n$ matrix $X$ whose columns are the vectors $X_i$. Then $\sum\limits_{i=1}^n X_iX_i^T=XX^T$, so we're trying to minimize $u^TXX^Tu=\lVert X^Tu\rVert^2$ for a unit vector $u\in\mathbb R^3$. The solution is the eigenvector of $XX^T$ corresponding to the smallest eigenvalue; equivalently, the right singular vector of $X^T$ ($=$ the left singular vector of $X$) corresponding to the smallest singular value. Finding singular vectors of $X$ is more numerically stable than computing $XX^T$ and finding its eigenvectors, so using the SVD is recommended.
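The SVD route sketched above looks like this in NumPy (again on made-up nearly coplanar data, with names of my choosing):

```python
import numpy as np

# Hypothetical nearly coplanar data: the true normal is approximately (0, 0, 1).
rng = np.random.default_rng(1)
Xs = np.column_stack(
    [rng.normal(size=2).tolist() + [rng.normal(scale=1e-3)] for _ in range(50)]
)  # 3 x n matrix X whose columns are the X_i

# Work with X directly instead of forming X X^T.
# np.linalg.svd sorts singular values in descending order, so the last
# column of U is the left singular vector for the smallest singular value.
U, s, Vt = np.linalg.svd(Xs)
u = U[:, -1]
```

Note that the singular values of $X$ are the square roots of the eigenvalues of $XX^T$, which is exactly why squaring them up in $XX^T$ loses precision for small ones.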