Conditions for existence of $k$ vectors in $\mathbb R^k$ with given pairwise angles $\theta_{ij}$


Given angles $0<\theta_{ij}<\pi$ for $1\leq i<j\leq k$, what conditions on the angles ensure that there exist $k$ unit vectors $v_i\in \mathbb R^k$ such that the angle between $v_i$ and $v_j$ is $\theta_{ij}$?

There are clearly problems when $\theta_{12},\theta_{13},\theta_{23}$ are all close to $\pi$. What if I can ensure, for a given $\epsilon>0$, that $|\theta_{ij}-\frac{\pi}2|<\epsilon$?

Intuitively, this is true for $k=3$ and angles close to $\frac{\pi}2$. We can easily pick $v_1,v_2$, and the locus of points for $v_3$ meeting the angle requirement with $v_1$ is a near-great circle on the unit sphere; the same holds with respect to $v_2$. These two near-great circles are near-perpendicular, and we need them to intersect for $v_3$ to exist.

But I can’t prove it, and my intuition for spheres in dimension $k>3$ is negligible.

This is a possible solution for this question. (In fact, I only need $k>3$ to complete that question - I’ve got an entirely different solution for $k=3$ in that question.)


I think a minimal necessary condition is, for all distinct $i,j,k$: $$\theta_{ij}+\theta_{jk}\geq \theta_{ik}.\tag 1$$ (Here we set $\theta_{ji}=\theta_{ij}$ for $i<j$ so that $(1)$ covers all index orders.)

Also, probably: $$\theta_{ij}+\theta_{jk}+\theta_{ik}\leq 2\pi\tag 2$$

It seems like, when $k=3$, (1) and (2) should be enough.
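As an illustration (not a proof), the $k=3$ case can be probed numerically: sample angle triples satisfying $(1)$ and $(2)$ and check that the $3\times 3$ matrix of cosines is positive semidefinite, which (as the accepted answer below shows) is exactly the feasibility condition. This is a quick NumPy sketch; the helper name is mine.

```python
import numpy as np

def cosine_matrix(a, b, c):
    """Gram matrix of three unit vectors with angles a = t12, b = t13, c = t23."""
    return np.array([[1.0,       np.cos(a), np.cos(b)],
                     [np.cos(a), 1.0,       np.cos(c)],
                     [np.cos(b), np.cos(c), 1.0]])

rng = np.random.default_rng(0)
for _ in range(2000):
    a, b, c = rng.uniform(0.0, np.pi, 3)
    triangle = a <= b + c and b <= a + c and c <= a + b   # condition (1)
    if triangle and a + b + c <= 2 * np.pi:               # condition (2)
        # feasible iff the cosine matrix is positive semidefinite
        assert np.linalg.eigvalsh(cosine_matrix(a, b, c)).min() > -1e-9
```

No sampled triple satisfying $(1)$ and $(2)$ produces an indefinite cosine matrix, consistent with $(1)$ and $(2)$ being sufficient for $k=3$.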


BEST ANSWER

$\def\vec{\boldsymbol}\def\v{\vec{v}}\def\x{\vec{x}}\def\R{\mathbb{R}}\def\Ω{{\mit Ω}}\def\<{\langle}\def\>{\rangle}\DeclareMathOperator{\diag}{diag}$On the one hand, if such unit vectors $\v_1, \cdots, \v_n$ exist, define $V = [\v_1, \cdots, \v_n] \in \R^{n × n}$; then$$ V^T V = \begin{bmatrix} \<\v_1, \v_1\> & \cdots & \<\v_1, \v_n\>\\ \vdots & \ddots & \vdots\\ \<\v_n, \v_1\> & \cdots & \<\v_n, \v_n\> \end{bmatrix} = \begin{bmatrix} 1 & \cos θ_{1, 2} & \cdots & \cos θ_{1, n}\\ \cos θ_{2, 1} & 1 & \ddots & \cos θ_{2, n}\\ \vdots & \ddots & \ddots & \vdots\\ \cos θ_{n, 1} & \cos θ_{n, 2} & \cdots & 1 \end{bmatrix} =: A $$ is a positive semidefinite matrix.

On the other hand, if $A \geqslant 0$, then there exist an orthogonal matrix $\Ω$ and $D = \diag(λ_1, \cdots, λ_n)$ with $λ_1, \cdots, λ_n \geqslant 0$ such that $A = \Ω^T D\Ω$. Take $V = \sqrt{D}\Ω$ and let $\v_k$ be the $k$-th column of $V$; then the $\v_k$'s satisfy $\dfrac{\v_i · \v_j}{\|\v_i\| \|\v_j\|} = \cos θ_{i, j}$ for any $i ≠ j$. (When $A > 0$, $V$ can also be computed by Cholesky decomposition.) Therefore the given condition is equivalent to $A \geqslant 0$.
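The construction above translates directly into a few lines of NumPy; this is a minimal sketch (the function name is mine), returning `None` when $A$ fails to be positive semidefinite.

```python
import numpy as np

def vectors_from_angles(theta):
    """Given a symmetric matrix `theta` of pairwise angles (diagonal ignored),
    return unit vectors v_1..v_n as columns of V with V^T V = A = cos(theta),
    or None if A is not positive semidefinite."""
    A = np.cos(theta)
    np.fill_diagonal(A, 1.0)
    lam, Omega = np.linalg.eigh(A)       # A = Omega @ diag(lam) @ Omega.T
    if lam.min() < -1e-10:               # not positive semidefinite
        return None
    lam = np.clip(lam, 0.0, None)        # clear tiny negative round-off
    return np.diag(np.sqrt(lam)) @ Omega.T   # V = sqrt(D) * Omega^T

# Example: three unit vectors, all pairwise angles 2*pi/3 (feasible in R^3)
V = vectors_from_angles(np.full((3, 3), 2 * np.pi / 3))
G = V.T @ V   # reproduces the cosine matrix A: unit diagonal, -1/2 off-diagonal
```

Verifying `G` against the target cosine matrix confirms the equivalence claimed in the answer.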

Now, an easy-to-check condition to ensure $A \geqslant 0$ is\begin{gather*} \sum_{\substack{1 \leqslant j \leqslant n\\j ≠ i}} |\cos θ_{i, j}| \leqslant 1 \quad \forall\, 1 \leqslant i \leqslant n. \tag{1} \end{gather*} In fact, when (1) is true, for any $\vec{x} \in \R^n$,\begin{gather*} \x^T A \x = \sum_{k = 1}^n x_k^2 + 2 \sum_{i < j} \cos θ_{i, j} x_i x_j\\ \geqslant \sum_{k = 1}^n x_k^2 - \sum_{i < j} |\cos θ_{i, j}| (x_i^2 + x_j^2) = \sum_{i = 1}^n \biggl( 1 - \sum_{\substack{1 \leqslant j \leqslant n\\j ≠ i}} |\cos θ_{i, j}| \biggr) x_i^2 \geqslant 0. \end{gather*} An even simpler condition that implies (1) is\begin{gather*} \max_{1 \leqslant i < j \leqslant n} |\cos θ_{i, j}| \leqslant \frac{1}{n - 1}, \tag{2} \end{gather*} or equivalently\begin{gather*} \max_{1 \leqslant i < j \leqslant n} \left| θ_{i, j} -\frac{π}{2} \right| \leqslant \arcsin\frac{1}{n - 1}. \tag{2$'$} \end{gather*}
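Condition (1) of this answer is just weak diagonal dominance of $A$, which is cheap to test; a small sketch (the function name is mine):

```python
import numpy as np

def diagonally_dominant_condition(theta):
    """Check condition (1): for each i, sum over j != i of |cos theta_ij| <= 1.
    This makes the cosine matrix A weakly diagonally dominant with unit
    diagonal, hence positive semidefinite."""
    C = np.abs(np.cos(theta))
    np.fill_diagonal(C, 0.0)             # keep only off-diagonal terms
    return bool(np.all(C.sum(axis=1) <= 1.0))

# Angles within arcsin(1/(n-1)) of pi/2 satisfy (2'), hence (1); we stay
# slightly inside the bound to avoid floating-point edge effects.
n = 5
eps = 0.9 * np.arcsin(1.0 / (n - 1))
theta = np.full((n, n), np.pi / 2 - eps)
```

Here `diagonally_dominant_condition(theta)` returns `True`, while e.g. all angles equal to $2\pi/3$ with $n=5$ fail the row-sum test.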

ANSWER

Note: this is not a complete answer; it provides necessary conditions, but not sufficient ones.

Let us represent the unit vectors $\{\boldsymbol{v}_i \}_{1 \le i \le k} \subset \Bbb R^k$, beginning from the origin $O$, by $\boldsymbol{v}_i = (x_{i1},\ldots,x_{ik})'$ for $1\le i \le k$. We then have $$||\boldsymbol{v}_i||^2=\sum_{p=1}^k x_{ip}^2=1 \qquad \text{for } 1 \le i \le k \tag{1}$$

The angle $\theta_{ij}$ between two vectors $\boldsymbol{v}_i$ and $\boldsymbol{v}_j$ is determined by the dot product as follows: $$\cos(\theta_{ij})=\frac{\boldsymbol{v}_i\cdot\boldsymbol{v}_j}{||\boldsymbol{v}_i||\,||\boldsymbol{v}_j||}=\boldsymbol{v}_i\cdot\boldsymbol{v}_j=\sum_{p=1}^kx_{ip}x_{jp} \qquad \text{for } 1 \le i< j \le k \tag{2}$$

In total, we have $\frac{k(k-1)}{2}$ equations $(2)$, one for each pair $(\boldsymbol{v}_i,\boldsymbol{v}_j)_{1 \le i< j \le k}$.

First necessary condition: summing the $k$ equations $(1)$ and twice the $\frac{k(k-1)}{2}$ equations $(2)$ (in total, $k + 2\times \frac{k(k-1)}{2} = k^2$ terms), we have $$k+2 \times\sum_{1 \le i < j\le k} \cos(\theta_{ij})=\sum_{h=1}^k\left(\sum_{i=1}^k x_{ih} \right)^2 \ge 0$$ or $$\sum_{1 \le i < j\le k} \cos(\theta_{ij}) \ge -\frac{k}{2} \tag{3}$$
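Condition $(3)$ can be sanity-checked numerically against any concrete set of unit vectors; a minimal NumPy sketch (variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 6

# k random unit vectors in R^k, stored as the columns of V
V = rng.standard_normal((k, k))
V /= np.linalg.norm(V, axis=0)

G = V.T @ V                 # G[i, j] = cos(theta_ij), with G[i, i] = 1
# Off-diagonal sum over i < j; note G.sum() = || sum_i v_i ||^2 >= 0,
# which is exactly the argument behind (3)
cos_sum = (G.sum() - k) / 2
assert cos_sum >= -k / 2    # necessary condition (3)
```

The assertion holds for every choice of unit vectors, since `G.sum()` is a squared norm.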

Remark: if $\theta_{ij} = \theta$ for all $1 \le i < j\le k$, then we can deduce from $(3)$ that

$$0 \le \theta \le \arccos\left( -\frac{1}{k-1} \right)$$

Equality in $(3)$ occurs when the $k$ equations $(1)$, the $\frac{k(k-1)}{2}$ equations $(2)$, and the $k$ equations $(4)$ below are simultaneously satisfied: $$\sum_{i=1}^k x_{ih} = 0 \qquad \text{for } 1 \le h \le k \tag{4}$$

(In this case, we have $\frac{k(k+3)}{2}$ equations/constraints for $k^2$ variables $\{x_{ij}\}_{1\le i,j\le k}$).

Second necessary condition: Applying the triangle inequality for $ i \ne j \ne h \ne i $

$$\sqrt{\sum_{p=1}^k (x_{ip}-x_{hp})^2}+\sqrt{\sum_{p=1}^k (x_{jp}-x_{hp})^2} \ge \sqrt{\sum_{p=1}^k (x_{ip}-x_{jp})^2}$$

then $$\sin\left(\frac{\theta_{ih}}{2}\right) +\sin\left(\frac{\theta_{jh}}{2}\right) \ge \sin\left(\frac{\theta_{ij}}{2}\right) \quad \text{for } i \ne j \ne h \ne i\tag{5}$$ (we have $\frac{k(k-1)(k-2)}{6}$ constraints of type $(5)$)
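Condition $(5)$ can likewise be verified on any realized configuration: the half-angle sines are half the chord lengths between the endpoints on the unit sphere, so they must obey the triangle inequality. A brief sketch (NumPy, names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 5

# Angles realized by k random unit vectors in R^k (columns of V)
V = rng.standard_normal((k, k))
V /= np.linalg.norm(V, axis=0)
theta = np.arccos(np.clip(V.T @ V, -1.0, 1.0))   # clip for numerical safety

# Condition (5): sin(theta/2) values, i.e. half the chord lengths,
# satisfy the triangle inequality for every triple of distinct indices
s = np.sin(theta / 2)
for i in range(k):
    for j in range(k):
        for h in range(k):
            if len({i, j, h}) == 3:
                assert s[i, h] + s[j, h] >= s[i, j] - 1e-12
```

Every triple passes, as it must for angles coming from an actual configuration.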

Conclusion: $(3)$ and $(5)$ are two necessary conditions.

ANSWER

Let us represent the vectors in question as directed line segments $\{\overrightarrow{OA}_i\}_{1 \le i \le k} $ from the origin $O$ to the points $A_i$ on the unit sphere $\mathcal{B}(O,1)$.

Remark: it may be necessary to add a point $A_0$ in order to apply the Cayley-Menger determinant, since we need $k+1$ points in the space $\Bbb R^k$. If so, the point $A_0$ can be taken as the antipode of $A_1$. The angle between $\overrightarrow{OA}_0$ and $\overrightarrow{OA}_i$ (for $1 \le i \le k$) is then $\theta_{0i} = \pi-\theta_{1i}$; in particular $\theta_{01}=\pi$.

The distance between two points $A_i$ and $A_j$ is then determined by the angle $\theta_{ij}$ via the law of cosines. I call it $a_{ij}$: $$a_{ij} =|A_iA_j| = \sqrt{2-2\cos(\theta_{ij})} = 2\sin\left(\frac{\theta_{ij}}{2} \right) \qquad \text{for } 0 \le i< j \le k\tag{1}$$

Of course, the distance from $O$ to each $A_i$ is also fixed at $1$. Your question now stands thus: can we construct points in $\mathbb{R}^k$ with specified distances (to each other, and to $O$)?

The answer is entirely determined by the Cayley-Menger determinant $\Gamma$, which computes (up to a constant factor) the squared volume of the simplex spanned by those points. This nice exposition mentions on page $2$ that it can be used for exactly this purpose, citing Theorem 9.7.3.4 of Berger's Geometry I:

[Screenshot of Theorem 9.7.3.4 from Berger's Geometry I]

To summarize the argument:

  • Work by induction on $k$ with two base cases.
  • Construct $k-2$ points in a $(k-2)$-dimensional subspace by hypothesis. Henceforth, I'm going to pretend the $k-2$ points are just the single point $O$, and quotient out by their $(k-2)$-dimensional subspace. I'm also going to choose an arbitrary splitting of the complementary subspace $\mathbb{R}^2=Y\oplus W$.
  • Adding in one more point is easy: the inductive hypothesis tells us we can do it in any $(k-1)$-dimensional subspace. So, put $A_{k-1}\in Y$.
  • Now, for any codimension-$1$ subspace ("line") $L$, there are two possible places for $A_k$ (ignoring our requirement on $|A_{k-1}A_k|$). In fact, as we rotate $L$, we get a circle.
  • The distance $x=|A_{k-1}A_k|$ still isn't right, but now we can fix it. The two examples when $L=Y$ are extremal: any distance between these two values of $x$ is feasible by finding the correct point on the circle. Those examples are also zeros of $\Gamma$.
  • $\Gamma$ is a quadratic in $x$. So $\Gamma$ has a specific sign at values of $x$ between the extreme examples. The sign condition given in the theorem statement ensures we are in that region.

Q.E.D.

From this theorem, you have necessary and sufficient conditions for the existence of the $k$ vectors in terms of distances, and from (1) you have necessary and sufficient conditions in terms of the angles.
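A numerical version of this distance-based test can be sketched using the classical multidimensional-scaling (Schoenberg) criterion, which is equivalent to the sign conditions on the Cayley-Menger determinants: the squared distances among $O, A_1,\ldots,A_k$ embed in Euclidean space iff the doubly centered matrix $-\frac12 J D^{(2)} J$ is positive semidefinite. The function name is mine.

```python
import numpy as np

def embeddable(theta, tol=1e-9):
    """Check whether points O, A_1..A_k with |OA_i| = 1 and
    |A_i A_j| = 2 sin(theta_ij / 2) embed in Euclidean space.
    Uses the classical MDS (Schoenberg) criterion, equivalent to the
    Cayley-Menger sign conditions: -J @ D2 @ J / 2 must be PSD."""
    k = theta.shape[0]
    # Squared-distance matrix over the k + 1 points O, A_1, ..., A_k
    D2 = np.zeros((k + 1, k + 1))
    D2[0, 1:] = D2[1:, 0] = 1.0                  # |OA_i|^2 = 1
    chord2 = (2 * np.sin(theta / 2)) ** 2        # |A_i A_j|^2, from (1)
    np.fill_diagonal(chord2, 0.0)
    D2[1:, 1:] = chord2
    # Double centering
    J = np.eye(k + 1) - np.ones((k + 1, k + 1)) / (k + 1)
    B = -J @ D2 @ J / 2
    return bool(np.linalg.eigvalsh(B).min() > -tol)
```

For instance, all pairwise angles equal to $2\pi/3$ with $k=3$ are feasible, while all angles close to $\pi$ are not, matching the earlier Gram-matrix criterion.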