Decomposition of matrix occurring in the problem of finding $n+1$ vectors in $\mathbb{R}^n$ with pairwise equal inner products


I was toying with my intuition that there are always $n+1$ unit vectors in $\mathbb{R}^n$ such that every pair $v_i$, $v_j$ ($i\neq j$) has the same angle between them. Since the vectors are normalized, this is equivalent to $$v_i\cdot v_j=\begin{cases} 1,& i=j \\ m, & i\neq j \end{cases}$$ with $|m|<1$. So if $V$ is the $n\times (n+1)$ matrix whose columns are the $v_i$, we have $$M:=V^TV=\begin{pmatrix}1 & m & \cdots &m \\ m&1&\cdots&m\\ \vdots &&\ddots&\vdots \\ m&m&\cdots&1 \end{pmatrix}$$ $M$ is an $(n+1)\times (n+1)$ matrix and must be singular, being the product of two matrices of rank at most $n$. The sum of its rows is $(nm+1,nm+1,\ldots,nm+1)$; for $m=-\frac{1}{n}$ this is the null vector, which gives the required dot product $m$ as a function of $n$. E.g., for $n=2$ we get $m=-\frac{1}{2}$, corresponding to the expected angle of $120°=\arccos\left(-\frac{1}{2}\right)$.

Now I'm asking: is there a matrix decomposition that lets me recover $V$ from $M$? (I know that $V$ is not unique, since I can rotate the vectors $v_i$ about an arbitrarily chosen fixed axis without changing their pairwise dot products.) I tried the spectral decomposition of $M$, but that gives $(n+1)$-dimensional vectors, and the $n$ nonzero eigenvalues of $M$ are all equal to $\frac{n+1}{n}$, so any linear combination of eigenvectors corresponding to that eigenvalue is again an eigenvector, making the dot product between those eigenvectors rather arbitrary.
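To make the setup concrete, here is a quick NumPy check (my own illustrative script, assuming $m=-1/n$) that confirms $M$ is singular with its $n$ nonzero eigenvalues equal to $\frac{n+1}{n}$:

```python
import numpy as np

n = 4
m = -1.0 / n

# Build the (n+1) x (n+1) Gram matrix: 1 on the diagonal, m elsewhere.
M = np.full((n + 1, n + 1), m)
np.fill_diagonal(M, 1.0)

print(np.linalg.matrix_rank(M))  # n, so M is singular
print(np.linalg.eigvalsh(M))     # one 0 and n copies of (n+1)/n
```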



Accepted answer:

In $\mathbb{R}^n$, having $n+1$ unit vectors such that the angle between any two of them is the same
means that the vectors are the vertices of a regular $n$-simplex inscribed in the unit sphere.
It follows that the cosine of the angle (your $m$) must be $-1/n$.

$n$ of the vectors are linearly independent, while the $(n+1)$-th is necessarily a linear combination of them.
Therefore the $(n+1) \times (n+1)$ matrix $$ M = V^{\,T} V $$ has null determinant.
But an $n \times n$ principal submatrix $M'$ of it is full-rank, symmetric (Hermitian), and positive definite, being a matrix of inner products of linearly independent vectors.
So it admits a Cholesky decomposition $$ M' = L\,L^{\,T}, $$ which is unique, and therefore $$ M' = L\,L^{\,T} = V'^{\,T} V' \quad \Rightarrow \quad V' = L^{\,T}, $$ where $V'$ is the $n \times n$ matrix whose columns are $n$ of the vectors.

The $(n+1)$-th vector $v$ is then obtained by solving $$ V'^{\,T} v = L v = -\frac{1}{n}\begin{pmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{pmatrix}. $$

For example, for $\mathbb{R}^3$ we get $$ M' = \begin{pmatrix} 1 & -1/3 & -1/3 \\ -1/3 & 1 & -1/3 \\ -1/3 & -1/3 & 1 \end{pmatrix} = L\,L^{\,T} = \begin{pmatrix} 1 & 0 & 0 \\ -\frac{1}{3} & \frac{2\sqrt{2}}{3} & 0 \\ -\frac{1}{3} & -\frac{\sqrt{2}}{3} & \frac{\sqrt{6}}{3} \end{pmatrix} \begin{pmatrix} 1 & -\frac{1}{3} & -\frac{1}{3} \\ 0 & \frac{2\sqrt{2}}{3} & -\frac{\sqrt{2}}{3} \\ 0 & 0 & \frac{\sqrt{6}}{3} \end{pmatrix} $$ and $$ V = \begin{pmatrix} 1 & -\frac{1}{3} & -\frac{1}{3} & -\frac{1}{3} \\ 0 & \frac{2\sqrt{2}}{3} & -\frac{\sqrt{2}}{3} & -\frac{\sqrt{2}}{3} \\ 0 & 0 & \frac{\sqrt{6}}{3} & -\frac{\sqrt{6}}{3} \end{pmatrix} $$
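For anyone who wants to reproduce this numerically, here is a minimal NumPy sketch of the recipe above (illustrative only; it assumes $m=-1/n$ and uses only standard `numpy.linalg` calls):

```python
import numpy as np

n = 3
m = -1.0 / n

# n x n principal submatrix M': 1 on the diagonal, m elsewhere.
Mp = np.full((n, n), m)
np.fill_diagonal(Mp, 1.0)

L = np.linalg.cholesky(Mp)             # Mp = L @ L.T, L lower triangular
Vp = L.T                               # first n vectors as columns
v = np.linalg.solve(L, np.full(n, m))  # solve L v = -(1/n) (1, ..., 1)^T
V = np.column_stack([Vp, v])           # the full n x (n+1) matrix V

# Check: V^T V reproduces the full (n+1) x (n+1) Gram matrix M.
M = np.full((n + 1, n + 1), m)
np.fill_diagonal(M, 1.0)
print(np.allclose(V.T @ V, M))         # True
```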

Second answer:

Adding to the existing answer, you can also extract your vectors by orthogonally diagonalizing $M$. The matrix $M$ has rank $n$ and two distinct eigenvalues: the eigenvalue $\lambda := \frac{n+1}{n}$ with multiplicity $n$, and the eigenvalue $0$ with multiplicity one. Let $U$ be an orthogonal matrix such that

$$ U \begin{pmatrix} \lambda I_{n \times n} & 0_{n \times 1} \\ 0_{1 \times n} & 0 \end{pmatrix} U^T = M. $$

The first $n$ columns of $U$ form an orthonormal basis for the eigenspace of $M$ associated to $\lambda$ which is $$\{ (x_1, \dots, x_{n+1})^T \, | \, x_1 + \dots + x_{n+1} = 0 \}. $$

Write $$ U = \begin{pmatrix} \hat{U}_{n \times n} & x_{n \times 1} \\ y_{1 \times n} & z \end{pmatrix}, V = \begin{pmatrix} \hat{V}_{n \times n} & v_{n+1} \end{pmatrix}. $$

The equation $V^T V = M$ then becomes $$ \begin{pmatrix} \hat{V}^T \hat{V} & \hat{V}^T v_{n+1} \\ v_{n+1}^T \hat{V} & v_{n+1}^T v_{n+1} \end{pmatrix} = \lambda \begin{pmatrix} \hat{U} \hat{U}^T & \hat{U} y^T \\ y \hat{U}^T & y y^T \end{pmatrix}. $$

Hence if you take $\hat{V} = \sqrt{\lambda}\, \hat{U}^T$ and $v_{n+1} = \sqrt{\lambda}\, y^T$, you get a solution. In other words: take the $n + 1$ rows of the eigenvector matrix $U$ (whose columns are the eigenvectors), drop the last coordinate of each, transpose, and scale by $\sqrt{\lambda}$. Since the eigenspace associated to the eigenvalue $\lambda$ is $n$-dimensional, the matrix $$ \begin{pmatrix} \hat{U} \\ y \end{pmatrix} $$ is determined only up to right multiplication by an $n \times n$ orthogonal matrix, which corresponds exactly to the freedom to rotate the columns of $V$ in $n$-dimensional space.


Example: For $n = 2$, one possible decomposition is $$ M = \begin{pmatrix} 1 & -1/2 & -1/2 \\ -1/2 & 1 & -1/2 \\ -1/2 & -1/2 & 1 \end{pmatrix} = \underbrace{\begin{pmatrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{3}} \\ 0 & -\frac{2}{\sqrt{6}} & \frac{1}{\sqrt{3}} \\ -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{3}} \end{pmatrix}}_{U} \begin{pmatrix} \frac{3}{2} & 0 & 0 \\ 0 & \frac{3}{2} & 0 \\ 0 & 0 & 0 \end{pmatrix} \underbrace{\begin{pmatrix} \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{6}} & -\frac{2}{\sqrt{6}} & \frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \end{pmatrix}}_{U^T} $$

and the corresponding vectors are: $$ v_1 = \sqrt{\frac{3}{2}} \begin{pmatrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} \end{pmatrix}^T = \begin{pmatrix} \frac{\sqrt{3}}{2} \\ \frac{1}{2} \end{pmatrix}, \\ v_2 = \sqrt{\frac{3}{2}} \begin{pmatrix} 0 & -\frac{2}{\sqrt{6}} \end{pmatrix}^T = \begin{pmatrix} 0 \\ -1 \end{pmatrix}, \\ v_3 = \sqrt{\frac{3}{2}} \begin{pmatrix} -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} \end{pmatrix}^T = \begin{pmatrix} -\frac{\sqrt{3}}{2} \\ \frac{1}{2} \end{pmatrix} $$
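As a numerical illustration of this route, here is a minimal NumPy sketch (illustrative only; it relies on `numpy.linalg.eigh` returning eigenvalues in ascending order, so the zero eigenvalue comes first and the $\lambda$-eigenvectors occupy the last $n$ columns of $U$):

```python
import numpy as np

n = 2
m = -1.0 / n
lam = (n + 1) / n

# The (n+1) x (n+1) Gram matrix M.
M = np.full((n + 1, n + 1), m)
np.fill_diagonal(M, 1.0)

w, U = np.linalg.eigh(M)        # w ascending: (0, lam, ..., lam)

# Take the lambda-eigenvectors (last n columns), transpose, scale by sqrt(lam).
V = np.sqrt(lam) * U[:, 1:].T   # n x (n+1)

print(np.allclose(V.T @ V, M))  # True
```

The recovered $V$ will in general differ from the example above by an orthogonal transformation of $\mathbb{R}^n$, which is exactly the rotational freedom noted in the question.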