A basis with $e_i\cdot e_j<0$ implies a dual basis with $f_i\cdot f_j>0$?


I have a basis $\{e_1,e_2,e_3\}\subset\Bbb R^3$ of the 3-dimensional Euclidean space with $e_i\cdot e_j <0$ for all $i\not= j$ (where $\cdot$ denotes the standard inner product).

Question: If $\{f_1,f_2,f_3\}\subset\Bbb R^3$ is the dual basis, what is a quick and clean way to see that $f_i\cdot f_j>0$ for all $i\not=j$?

One approach is the following: Let $E=(e_1,e_2,e_3)\in\Bbb R^{3\times 3}$ be the matrix with the $e_i$ as columns, and $F=(f_1,f_2,f_3)$ the analogous matrix for the dual basis. By assumption, the off-diagonal entries of $E^\top\! E$ are negative, and I want to show that the off-diagonal entries of $F^\top\!F$ are positive. Since $f_i\cdot e_j=\delta_{ij}$ we have $F^\top E=I$, i.e. $E=(F^\top)^{-1}$, and hence $E^\top\! E=(F^\top\! F)^{-1}$. The latter matrices are positive definite (Gram matrices), so the question can be restated as follows:

Question: If I have a positive definite $3\times 3$ matrix with negative off-diagonal entries, show that the off-diagonal entries of its inverse are all positive.
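As a quick numerical sanity check of this reformulation, here is a minimal sketch in NumPy; the matrix $A = 4I - J$ (with $J$ the all-ones matrix) is a hypothetical example of a positive definite matrix with negative off-diagonal entries:

```python
import numpy as np

# Hypothetical example: A = 4*I - J is positive definite
# with negative off-diagonal entries.
A = np.array([[ 3., -1., -1.],
              [-1.,  3., -1.],
              [-1., -1.,  3.]])
assert np.all(np.linalg.eigvalsh(A) > 0)   # positive definite

B = np.linalg.inv(A)                       # here B = (1/4)(I + J)
off = B[~np.eye(3, dtype=bool)]            # the six off-diagonal entries
print(off)                                 # all equal to 0.25 > 0
```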


Edit

I found a way and posted it as an answer, but it makes use of the cross product, the cyclic-rotation rule for the scalar triple product, and the BAC-CAB rule, which is not particularly pleasing to me: I suspect that the same statement also holds in higher dimensions, but a cross-product approach does not generalize to these.

There are 3 answers below.

Best answer

Let $A=E^T E$ and $B=A^{-1}$. We induct on the dimension. In the $2\times 2$ case, we have $$ A=\begin{bmatrix} a_{11} & a_{12} \\ a_{12} & a_{22}\end{bmatrix}, \quad B=\begin{bmatrix} b_{11} & b_{12} \\ b_{12} & b_{22}\end{bmatrix},$$ and we want to show that $b_{12}\ge 0$. This follows from $BA$ being the identity matrix; indeed, the off-diagonal element of $BA$ is $$ b_{11}a_{12}+b_{12}a_{22}=0,$$ and since $b_{11}>0$ and $a_{22}>0$, while $a_{12}\le 0$, it must be that $b_{12}\ge 0$.

In the general case we partition $$ A=\begin{bmatrix} A_0 & v \\ v^T & a_{nn} \end{bmatrix}, \quad B=\begin{bmatrix} B_0 & w \\ w^T & b_{nn} \end{bmatrix},$$ where $v$ and $w$ are $(n-1)$-vectors. We know that each entry of $v$ is nonpositive and we want to show that each entry of $w$ is nonnegative. Again, from $BA=I$ it follows that $$ B_0 v + wa_{nn} = 0. $$ Note that $B_0$ is the inverse of the Schur complement $A_0 - a_{nn}^{-1}vv^T$, which is again positive definite and, since the entries of $a_{nn}^{-1}vv^T$ are nonnegative, still has nonpositive off-diagonal entries; so by induction the entries of $B_0$ are nonnegative. Hence $B_0 v$ is a vector of nonpositive numbers. On the other hand, just like before, $a_{nn}>0$. Therefore the vector $w=-a_{nn}^{-1}B_0 v$ has nonnegative entries, and we are done.
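The two facts the induction rests on, the block identity $B_0v + wa_{nn}=0$ and the Schur-complement description of $B_0$, can be checked numerically; a minimal sketch with a hypothetical example matrix:

```python
import numpy as np

# Hypothetical example of a Gram matrix with negative off-diagonals.
A = np.array([[ 3., -1., -1.],
              [-1.,  3., -1.],
              [-1., -1.,  3.]])
B = np.linalg.inv(A)

# Partition as in the proof: A0/B0 are the leading 2x2 blocks,
# v/w the last columns without the corner entry.
A0, v, a_nn = A[:2, :2], A[:2, 2], A[2, 2]
B0, w       = B[:2, :2], B[:2, 2]

# Top-right block of B @ A = I gives B0 v + w a_nn = 0.
assert np.allclose(B0 @ v + w * a_nn, 0)

# B0 is the inverse of the Schur complement A0 - v v^T / a_nn,
# which is why the induction hypothesis applies to it.
assert np.allclose(B0, np.linalg.inv(A0 - np.outer(v, v) / a_nn))

print(w)   # nonnegative entries
```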

Answer

Let's denote the dual basis by $\{f_{12},f_{23},f_{31}\}\subset\Bbb R^3$ so that $f_{ij}\cdot e_i=f_{ij}\cdot e_j=0$. Up to some factor $\alpha_{ij}$, the vector $f_{ij}$ has the same direction as $e_i\times e_j$, and all $\alpha_{ij}$ have the same sign (depending on the orientation of the basis $e_i$).

This means that, in order to find the sign of $f_{ij}\cdot f_{jk}$, we can just compute the sign of

\begin{align} (e_i\times e_j)\cdot(e_j\times e_k) &\overset{(*)}= (e_j\times(e_j\times e_k))\cdot e_i \\ &\overset{\smash{(\times)}}= ((e_j\cdot e_k)e_j-(e_j\cdot e_j)e_k)\cdot e_i\\ &= \underbrace{(e_j\cdot e_k)}_{<\,0}\underbrace{(e_i\cdot e_j)}_{<\,0}-\underbrace{(e_j\cdot e_j)}_{>\,0}\underbrace{(e_k\cdot e_i)}_{<\,0} > 0, \end{align}

where in $(*)$ I used a cyclic rotation rule for the scalar triple product and in $(\times)$ I used the BAC-CAB rule.
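This identity chain can be checked numerically for a concrete basis with pairwise negative inner products (the vectors below are a hypothetical example):

```python
import numpy as np

# Hypothetical basis of R^3 with pairwise negative inner products.
e1 = np.array([ 1. ,  0. , 0.])
e2 = np.array([-0.5,  1. , 0.])
e3 = np.array([-0.5, -0.5, 1.])
for a, b in [(e1, e2), (e1, e3), (e2, e3)]:
    assert a @ b < 0

# Left side: (e1 x e2) . (e2 x e3).
lhs = np.cross(e1, e2) @ np.cross(e2, e3)

# Right side after the triple-product and BAC-CAB rules.
rhs = (e2 @ e3) * (e1 @ e2) - (e2 @ e2) * (e3 @ e1)

assert np.isclose(lhs, rhs)
print(lhs)   # positive
```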

Answer

Let $A$ be positive definite with off-diagonal elements $\le 0$, and let $D$ be the diagonal matrix formed from the diagonal entries of $A$. Then $\bar A := D^{-1/2} A D^{-1/2}$ is positive definite, has off-diagonal elements $\le 0$, and has $1$'s on the diagonal. So we may reduce to the case where $A$ has $1$'s on the diagonal, that is, $A = I - \Delta$ with $A$ positive definite and $\Delta$ symmetric with all entries $\ge 0$.

Now, in general, for a symmetric matrix $S$ the spectral radius of $S$ equals the maximum of $\left|\sum_{i,j} s_{ij}x_i x_j\right|$ over all $x$ with $\sum_i x_i^2=1$. If $S$ has all entries $\ge 0$, this maximum is attained at a unit vector with all entries $\ge 0$ (replacing each $x_i$ by $|x_i|$ can only increase the sum). Therefore the spectral radius of a symmetric matrix with nonnegative entries is its largest eigenvalue, which is $\ge 0$.

Back to our problem: since $I - \Delta$ is positive definite, the largest eigenvalue of $\Delta$ is $<1$. By the previous paragraph, this largest eigenvalue is the spectral radius of $\Delta$, so all eigenvalues of $\Delta$ are $<1$ in absolute value.

Now, since $\rho(\Delta)< 1$, the Neumann series $\sum_{n\ge 0} \Delta^n$ converges and equals $(I-\Delta)^{-1}$. Each power $\Delta^n$ has nonnegative entries, so the matrix $(I-\Delta)^{-1}$ has all entries $\ge 0$.
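The Neumann-series step can be illustrated numerically; the matrix $\Delta$ below (zero diagonal, $1/3$ off the diagonal, spectral radius $2/3$) is a hypothetical example:

```python
import numpy as np

# Hypothetical example: Delta symmetric, entries >= 0, rho(Delta) = 2/3.
Delta = (np.ones((3, 3)) - np.eye(3)) / 3
assert np.max(np.abs(np.linalg.eigvalsh(Delta))) < 1

# Partial sums of the Neumann series sum_n Delta^n.
S, term = np.zeros((3, 3)), np.eye(3)
for _ in range(200):
    S += term
    term = term @ Delta

# The series converges to (I - Delta)^{-1}, and every partial sum
# has nonnegative entries since each Delta^n does.
assert np.allclose(S, np.linalg.inv(np.eye(3) - Delta))
print(S.min())   # nonnegative
```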

Note: For $n\ge 3$, the inverse of a positive definite matrix with positive off-diagonal elements may fail to have all its off-diagonal elements negative.
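A concrete instance of this note (the matrix below was constructed for illustration): it is positive definite with positive off-diagonal entries, yet its inverse has a positive off-diagonal entry.

```python
import numpy as np

# Constructed for illustration: positive definite, all off-diagonal
# entries positive.
A = np.array([[1.  , 0.5 , 0.05],
              [0.5 , 1.  , 0.5 ],
              [0.05, 0.5 , 1.  ]])
assert np.all(np.linalg.eigvalsh(A) > 0)   # positive definite

B = np.linalg.inv(A)
print(B[0, 2])   # positive: the (1,3) entry of the inverse is NOT negative
```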