Convergence of $A_n^{'}A_n$ to $I_k$ in probability implies $A^{-1}_n$ bounded in probability.


Let $(A_n)$ be a sequence of random $k\times k$ matrices and suppose $A_n^{'}A_n \overset{p}{\to} I_k$. Then

$(i)$ $A_n$ is invertible with probability approaching $1$

$(ii)$ $A_n=O_p(1)$

$(iii)$ $A^{-1}_n=O_p(1)$

I was able to prove $(i)$ using continuity of the determinant, but am having trouble with $(ii)$ and $(iii)$. I know that for an orthogonal $k\times k$ matrix $A$ we have $\left\lVert A \right\rVert = \sqrt{\operatorname{tr}(A^{'}A)} = \sqrt{\operatorname{tr}(I_k)} = \sqrt{k} = \left\lVert I_k \right\rVert = \left\lVert A^{'}A\right\rVert$ (Frobenius norm), but am not sure how to use it. Any help is greatly appreciated.
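For intuition (not part of any proof), here is a quick `numpy` experiment; the orthogonal matrix `Q` and the noise scales are arbitrary choices of mine. It illustrates that when $A_n^{'}A_n$ is close to $I_k$, $\left\lVert A_n\right\rVert$ is close to $\sqrt{k}$ and $\left\lVert A_n^{-1}\right\rVert$ stays bounded:

```python
import numpy as np

# Simulate A_n = Q + small noise, with Q orthogonal, so that A_n' A_n -> I_k
# as the noise shrinks. Watch ||A'A - I_k||_F, ||A||_F, and ||inv(A)||_F.
rng = np.random.default_rng(0)
k = 4
Q, _ = np.linalg.qr(rng.standard_normal((k, k)))  # random orthogonal matrix

for eps in [1e-1, 1e-3, 1e-6]:
    A = Q + eps * rng.standard_normal((k, k))
    gram_err = np.linalg.norm(A.T @ A - np.eye(k))  # ||A'A - I_k||_F
    frob = np.linalg.norm(A)                        # ||A||_F, should approach sqrt(k) = 2
    inv_norm = np.linalg.norm(np.linalg.inv(A))     # ||inv(A)||_F, stays bounded
    print(f"eps={eps:.0e}  ||A'A-I||_F={gram_err:.2e}  "
          f"||A||_F={frob:.4f}  ||inv(A)||_F={inv_norm:.4f}")
```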

Edit $1$: I think I might also need the Neumann series representation $\big(A_n^{'}A_n\big)^{-1}=\sum_{s=0}^{\infty}\big(I_k-A_n^{'}A_n\big)^{s}$, provided one shows convergence of the series (which holds whenever $\left\lVert I_k-A_n^{'}A_n\right\rVert<1$).
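The Neumann series can be verified numerically on an example; the matrix below and the 200-term truncation are illustrative choices, assuming $\left\lVert I_k-A^{'}A\right\rVert<1$ so the series converges:

```python
import numpy as np

# Verify (A'A)^{-1} = sum_{s>=0} (I_k - A'A)^s when ||I_k - A'A|| < 1.
rng = np.random.default_rng(4)
k = 3
Q, _ = np.linalg.qr(rng.standard_normal((k, k)))
A = Q + 0.05 * rng.standard_normal((k, k))  # A'A close to I_k
B = np.eye(k) - A.T @ A
assert np.linalg.norm(B, ord=2) < 1         # convergence condition

partial_sum = np.zeros((k, k))
term = np.eye(k)                            # B^0
for _ in range(200):                        # truncate the series at 200 terms
    partial_sum += term
    term = term @ B
assert np.allclose(partial_sum, np.linalg.inv(A.T @ A))
```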

Edit 2: Here is an attempt. I will use the following inequalities for the Frobenius norm:

$$\sigma_{min}(A_n)\left\lVert A_n\right\rVert \leq \left\lVert A_n^{'}A_n\right\rVert \leq \sigma_{max}(A_n)\left\lVert A_n\right\rVert $$ or

$$\sqrt{\lambda_{min}(A_n^{'}A_n)}\left\lVert A_n\right\rVert \leq \left\lVert A_n^{'}A_n\right\rVert \leq \sqrt{\lambda_{max}(A_n^{'}A_n)}\left\lVert A_n\right\rVert $$ where $\sigma$ and $\lambda$ denote singular values and eigenvalues, respectively (the two displays coincide since $\sigma_i(A_n)=\sqrt{\lambda_i(A_n^{'}A_n)}$). Now, since eigenvalues are continuous functions on the space of all square matrices and the square root function is continuous, I can invoke the continuous mapping theorem and deduce

$$\sqrt{\lambda_{min}(A_n^{'}A_n)}\overset{p}{\to}\sqrt{\lambda_{min}(I_k)}=1$$ $$\sqrt{\lambda_{max}(A_n^{'}A_n)}\overset{p}{\to}\sqrt{\lambda_{max}(I_k)}=1$$ Similarly, the Frobenius norm is a continuous function on the space of all matrices so we have

$$\left\lVert A_n^{'}A_n\right\rVert \overset{p}{\to} \left\lVert I_k\right\rVert=\sqrt{k}$$

Rearranging the previous inequality we get

$$\frac{\left\lVert A_n^{'}A_n\right\rVert }{\sqrt{\lambda_{max}(A_n^{'}A_n)}} \leq \left\lVert A_n\right\rVert \leq \frac{\left\lVert A_n^{'}A_n\right\rVert }{\sqrt{\lambda_{min}(A_n^{'}A_n)}} $$

with division justified by the fact that $\sqrt{\lambda_{min}(A_n^{'}A_n)}>0$ and $\sqrt{\lambda_{max}(A_n^{'}A_n)}>0$ with probability approaching one. By the previous arguments, both the left and right side converge in probability to $\sqrt{k}$, which implies that $\left\lVert A_n\right\rVert\overset{p}{\to}\sqrt{k}$. Since convergence in probability implies boundedness in probability, we have shown $(ii)$.
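The sandwich inequality driving this squeeze argument can be spot-checked numerically; the random $5\times 5$ matrices below are illustrative:

```python
import numpy as np

# Check sqrt(l_min(A'A)) * ||A||_F <= ||A'A||_F <= sqrt(l_max(A'A)) * ||A||_F.
# The singular values s_i of A satisfy s_i^2 = eigenvalues of A'A, so
# sqrt(l_min) = s_min and sqrt(l_max) = s_max.
rng = np.random.default_rng(1)
for _ in range(100):
    A = rng.standard_normal((5, 5))
    s = np.linalg.svd(A, compute_uv=False)  # singular values, descending
    frob_A = np.linalg.norm(A)              # Frobenius norm of A
    frob_gram = np.linalg.norm(A.T @ A)     # Frobenius norm of A'A
    assert s[-1] * frob_A <= frob_gram + 1e-9
    assert frob_gram <= s[0] * frob_A + 1e-9
print("sandwich inequality verified on 100 random 5x5 matrices")
```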

Finally, for $(iii)$ I can use $(i)$ and the fact that the inverse is a continuous function on the space of invertible matrices to deduce $(A_n^{-1})(A_n^{-1})^{'}\overset{p}{\to}I_k$. I can then apply the same argument as in $(ii)$ to conclude that $A_n^{-1}$ is bounded in probability.
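The identity behind this step, $(A_n^{-1})(A_n^{-1})^{'}=(A_n^{'}A_n)^{-1}$, can also be sanity-checked on an example; `Q` and the `1e-5` noise scale below are arbitrary:

```python
import numpy as np

# Check (A^{-1})(A^{-1})' = (A'A)^{-1}, and that it is close to I_k when A'A is.
rng = np.random.default_rng(2)
k = 3
Q, _ = np.linalg.qr(rng.standard_normal((k, k)))
A = Q + 1e-5 * rng.standard_normal((k, k))  # A'A close to I_k
Ainv = np.linalg.inv(A)
lhs = Ainv @ Ainv.T
assert np.allclose(lhs, np.linalg.inv(A.T @ A))  # the algebraic identity
assert np.linalg.norm(lhs - np.eye(k)) < 1e-3    # near I_k, as used in (iii)
```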

Is this proof correct?


1 Answer


What is your definition of convergence in probability for matrices? One possible definition is that $\|A_n' A_n - I_k\|_{op}\to 0$ in probability, i.e., for every $\varepsilon>0$, $P(\|A_n' A_n - I_k\|_{op}>\varepsilon)\to 0$. Here $\|\cdot\|_{op}$ denotes the operator norm.

Now $\|A_n' A_n - I_k\|_{op}= \max_{i=1,\dots,k} |s_i^2-1|$, where $s_1\ge s_2\ge\dots\ge s_k$ are the singular values of $A_n$; indeed, $A_n' A_n - I_k$ is symmetric with eigenvalues $s_i^2-1$. In particular, you have $\|A_n\|_{op} = s_1$, and $A_n$ is invertible with $\|A_n^{-1}\|_{op} = s_k^{-1}$ whenever $s_k>0$. Intuitively, this will work out since all singular values converge to $1$ in probability.

Rigorously, apply the definition of convergence in probability with $\varepsilon = 1/2$. Then $\max_{i=1,\dots,k} |s_i^2-1| \le 1/2$ with probability approaching one, which implies that $P\big(\sqrt{1/2} \le s_k \le s_1 \le \sqrt{3/2}\big)\to 1$. This gives $\|A_n\|_{op} = s_1 = O_P(1)$ as well as $\|A_n^{-1}\|_{op} = O_P(1)$ by writing down the definition of $O_P(1)$.
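The operator-norm identity and the singular-value bounds above can be checked numerically on examples; the perturbation scale `0.1` below is an arbitrary choice:

```python
import numpy as np

# Check ||A'A - I_k||_op = max_i |s_i^2 - 1| (A'A - I_k is symmetric with
# eigenvalues s_i^2 - 1), and that on the event ||A'A - I_k||_op <= 1/2 all
# singular values lie in [sqrt(1/2), sqrt(3/2)].
rng = np.random.default_rng(3)
k = 4
for _ in range(50):
    A = np.linalg.qr(rng.standard_normal((k, k)))[0] + 0.1 * rng.standard_normal((k, k))
    s = np.linalg.svd(A, compute_uv=False)                # s[0] >= ... >= s[-1]
    op_norm = np.linalg.norm(A.T @ A - np.eye(k), ord=2)  # spectral/operator norm
    assert np.isclose(op_norm, np.max(np.abs(s**2 - 1)))
    if op_norm <= 0.5:                                    # the epsilon = 1/2 event
        assert np.sqrt(0.5) <= s[-1] <= s[0] <= np.sqrt(1.5)
print("operator-norm identity verified on 50 random matrices")
```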