Prove that $B$ is non-singular and that $AB^{-1}A=A$


$$A_{n\times n}=\begin{bmatrix}a & b & b & \cdots & b\\ b & a & b & \cdots & b\\ b & b & a & \cdots & b\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ b & b & b & \cdots & a\end{bmatrix}\text{ where } a+(n-1)b =0$$

Define $l^t=\begin{bmatrix}1&1&\cdots&1\end{bmatrix}$, where $l$ is an $n\times1$ vector, and:

$$B= A+ \frac{l\cdot l^t}{n}$$

Prove that $B$ is non-singular and that $AB^{-1}A=A$.

What I did:

$A$ has a $0$ eigenvalue, so $A$ is a singular matrix. $B$ has an eigenvalue of $1$ with eigenvector $v^{t}= \begin{bmatrix}1&1&\cdots&1\end{bmatrix}$.

Any idea about how to proceed?

Thanks.

There are 3 answers below.

Best answer:

Let $\mathbf{1}_n$ denote the $n\times n$ matrix with all entries equal to one and $E_n$ denote the $n\times n$ identity matrix. Then $$ A_n = b\mathbf 1_n + (a-b)E_n = b\mathbf 1_n -nbE_n $$ and $$ B_n = A_n + \frac{1}{n}\mathbf 1_n = \left(b+\frac{1}{n}\right)\mathbf 1_n - nbE_n. $$
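As a quick numerical sanity check of these two rewrites (a NumPy sketch; the test values `n = 5`, `b = 2.0` and the variable names are assumptions for illustration):

```python
import numpy as np

# Illustrative test values (assumed); any n >= 2 and b != 0 work the same way.
n, b = 5, 2.0
a = -(n - 1) * b              # enforces a + (n-1)b = 0

J = np.ones((n, n))           # the all-ones matrix, written 1_n above
I = np.eye(n)                 # the identity matrix E_n

A = b * J + (a - b) * I       # A_n = b*1_n + (a-b)*E_n, from the definition
B = A + J / n                 # B = A + l l^t / n, since l l^t = 1_n

# a - b = -nb, so both matrices collapse to the forms used below:
# A = b*1_n - nb*E_n  and  B = (b + 1/n)*1_n - nb*E_n
```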

In general, for a matrix $X_n=\alpha \mathbf 1_n + \beta E_n$, that is $$ X = \begin{pmatrix} \alpha+\beta & \alpha & \alpha & \cdots & \alpha \\ \alpha & \alpha+\beta & \alpha & \cdots & \alpha \\ \alpha & \alpha & \alpha+\beta & \cdots & \alpha \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \alpha & \alpha & \alpha & \cdots & \alpha+\beta \end{pmatrix}, $$ we can calculate the determinant by first subtracting the second row from the first to get

$$ \det X_n = \det \begin{pmatrix} \beta & -\beta & 0 & \cdots & 0 \\ \alpha & \alpha+\beta & \alpha & \cdots & \alpha \\ \alpha & \alpha & \alpha+\beta & \cdots & \alpha \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \alpha & \alpha & \alpha & \cdots & \alpha+\beta \end{pmatrix}, $$ and then Laplace expand with respect to the first row to obtain $$ \det X_n = \beta \det X_{n-1} + \beta \det \begin{pmatrix} \alpha & \alpha & \cdots & \alpha \\ \alpha & \alpha+\beta & \cdots & \alpha \\ \vdots & \vdots & \ddots & \vdots \\ \alpha & \alpha & \cdots & \alpha+\beta \end{pmatrix}. $$ For the second matrix, subtract the first row from all others to get $$ \det X_n = \beta \det X_{n-1} + \beta \det \begin{pmatrix} \alpha & \alpha & \cdots & \alpha \\ 0 & \beta & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \beta \end{pmatrix} = \beta \det X_{n-1} + \alpha \beta^{n-1}. $$ Since $\det X_1=\alpha+\beta$, this recursion yields $$\det X_n=n\alpha\beta^{n-1} +\beta^n=\beta^{n-1}(n\alpha+\beta).$$ Thus, $X_n$ is invertible if and only if $\beta\neq 0$ and $n\alpha+\beta\neq 0$. In this case, we may guess that (or consider the adjugate to see that) $X^{-1}$ is of the form $X^{-1} = \gamma \mathbf 1_n + \delta E_n$ as well and work out that $$ X^{-1} = -\frac{\alpha}{\beta(n\alpha+\beta)} \mathbf 1_n + \frac{1}{\beta} E_n. $$
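The closed forms for $\det X_n$ and $X^{-1}$ can be spot-checked numerically (a sketch; the values `n = 6`, `alpha = 1.3`, `beta = -0.7` are arbitrary assumptions satisfying $\beta\neq 0$ and $n\alpha+\beta\neq 0$):

```python
import numpy as np

# Arbitrary illustrative values (assumed), with beta != 0 and n*alpha + beta != 0.
n, alpha, beta = 6, 1.3, -0.7

J, I = np.ones((n, n)), np.eye(n)
X = alpha * J + beta * I

# Closed forms derived above:
det_formula = beta ** (n - 1) * (n * alpha + beta)
X_inv_formula = -alpha / (beta * (n * alpha + beta)) * J + (1 / beta) * I
```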

In the example at hand, $$ \det B_n = (-nb)^{n-1}(n\left(b+\frac{1}{n}\right)-nb)= (-nb)^{n-1} $$ which is non-zero if and only if $b$ is non-zero. And $$ B_n^{-1} = \frac{nb+1}{n^2b} \mathbf 1_n - \frac{1}{nb} E_n. $$
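Specializing the numerical check to $B_n$ itself (a sketch; `n = 5`, `b = 0.8` are illustrative values with $b\neq 0$):

```python
import numpy as np

n, b = 5, 0.8                 # illustrative values (assumed), b != 0
J, I = np.ones((n, n)), np.eye(n)
B = (b + 1 / n) * J - n * b * I

# The claimed closed form of the inverse:
B_inv = (n * b + 1) / (n ** 2 * b) * J - 1 / (n * b) * I
```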

However, to check that $AB^{-1}A=A$ holds, it is enough to show $A^2=AB$, since all matrices of the given form commute.
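Both the reduction $A^2=AB$ and the original identity $AB^{-1}A=A$ can be verified numerically (a sketch; `n = 4`, `b = 1.5` are illustrative values with $b\neq 0$):

```python
import numpy as np

n, b = 4, 1.5                 # illustrative values (assumed), b != 0
J, I = np.ones((n, n)), np.eye(n)
A = b * J - n * b * I         # A with a + (n-1)b = 0 built in
B = A + J / n
```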


This answer, concerning the non-singularity of $B$, uses the notation of Christoph's answer and is inspired by it.
The determinant can also be calculated in another way, using eigenvalues.

We have $$ B_n = \left(b+\dfrac{1}{n}\right)\mathbf 1_n - nbE_n. $$

The eigenvalues $\lambda_i$ of $\mathbf 1_n$ (a rank-$1$ symmetric matrix) are:
a single eigenvalue equal to $n$ (with eigenvector $[ 1 \ \ 1 \ \ \dots \ \ 1]^T$) and $n-1$ eigenvalues equal to $0$.

$B_n$ is a polynomial $p(\mathbf 1_n)$ in $\mathbf 1_n$, so its eigenvalues are $p(\lambda_i)$.

In this case $n$ is transformed into $\left(b+\dfrac{1}{n}\right)n-nb=1$ and the zeros are transformed into $-nb$. Therefore the determinant (the product of the eigenvalues) equals $(-nb)^{n-1}$.
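The eigenvalue picture can be confirmed numerically (a sketch; `n = 5`, `b = 0.3` are illustrative values with $b\neq 0$):

```python
import numpy as np

n, b = 5, 0.3                 # illustrative values (assumed), b != 0
J, I = np.ones((n, n)), np.eye(n)
B = (b + 1 / n) * J - n * b * I

eigs = np.sort(np.linalg.eigvalsh(B))          # B is symmetric
# Expected spectrum: a single eigenvalue 1 and (n-1) copies of -nb.
expected = np.sort(np.array([1.0] + [-n * b] * (n - 1)))
```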

Finally, one can also explain in a different way why the inverse of the matrix $B$ has the form $c\,\mathbf 1_n+d\,E_n$, which reduces the problem to calculating the coefficients $c,d$.

This follows directly from the fact that the inverse of a matrix can be expressed as a polynomial in that matrix (which can be obtained from the characteristic equation and the Cayley–Hamilton theorem).

Now, because the powers of $\mathbf 1_n$ are matrices of the form $t\,\mathbf 1_n$ for some scalar $t$ (which is easy to check), any polynomial in $B$ must also have the form $c\,\mathbf 1_n+d\,E_n$.


In this new answer I would like to propose another approach to the second part of the question, without directly calculating the inverse of $B$.

We know the forms of $A$ and $B$: $$ A = b\mathbf 1_n -nbE_n $$

$$ B = \left(b+\dfrac{1}{n}\right)\mathbf 1_n - nbE_n. $$

They are both polynomials in the same matrix, so their multiplication is commutative. Also, $B^{-1}$ can be expressed as such a polynomial.

The solution below simply exploits the multiplicative commutativity of matrices of the form $a_1\mathbf 1_n+a_0E_n$. Taking this into account, the equation $AB^{-1}A=A$ can be transformed into a much friendlier form:

$$AAB^{-1}=A \;\Longleftrightarrow\; AA=AB \;\Longleftrightarrow\; AA-AB=0 \;\Longleftrightarrow\; A(A-B)=0$$

Since $A-B=-\dfrac{1}{n}\mathbf 1_n$, the last equation can be written as $-\dfrac{1}{n}(b\mathbf 1_n -nbE_n)\mathbf 1_n =0$, and taking into account that $\mathbf 1_n^2 =n\mathbf 1_n$, the left-hand side is indeed $-(b\mathbf 1_n-b\mathbf 1_n)=0$.
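The two facts used in this final step, $\mathbf 1_n^2=n\mathbf 1_n$ and $A(A-B)=0$, can be checked numerically as well (a sketch; `n = 6`, `b = 2.5` are illustrative values):

```python
import numpy as np

n, b = 6, 2.5                 # illustrative values (assumed)
J, I = np.ones((n, n)), np.eye(n)
A = b * J - n * b * I
B = A + J / n                 # so A - B = -J/n
```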