Let $A\in\mathcal{M}_n(\mathbb{R}),\ n\in\mathbb{N}$ and $X\in\mathbb{R}^n$. For any $\lambda\in\mathbb{R}$, define $M(\lambda)=\left( \begin{array}{c|c} A & X\\ \hline {}^tX & \lambda \end{array} \right)$.
How can I show that the following two conditions are equivalent?
(a) There exists $\lambda\in\mathbb{R}$ such that $M(\lambda)={^t}M(\lambda)$ and $M(\lambda) ^tM(\lambda)=I_{n+1}$.
(b) $A={^t}A$ and $A{^t}A+X{^t}X=I_n$.
Thank you.
We can easily prove that (a) implies (b) by using
$M(\lambda)={^t}M(\lambda)=\left( \begin{array} {c|c}^tA & X\\ \hline\,^tX & \lambda \end{array} \right)$ and $I_{n+1}=M(\lambda){^t}M(\lambda)=\left( \begin{array} {c|c}A{^t}A+X{^t}X & AX+\lambda X\\ \hline\,^tX{^t}A+\lambda{^t}X & ^tXX+\lambda^2 \end{array} \right).$
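As a numerical sanity check of (a) $\Rightarrow$ (b) (not part of the proof): a Householder reflection $I-2v\,{}^tv$ is one concrete symmetric orthogonal matrix, so splitting it into blocks as above should produce a pair $(A, X)$ satisfying (b).

```python
import numpy as np

# Sanity check of (a) => (b): take a symmetric orthogonal M of size n+1
# (a Householder reflection I - 2 v v^T is one such matrix), split off
# its blocks, and verify the identities in (b).
n = 4
rng = np.random.default_rng(0)
v = rng.standard_normal(n + 1)
v /= np.linalg.norm(v)
M = np.eye(n + 1) - 2 * np.outer(v, v)   # symmetric and orthogonal

A = M[:n, :n]    # top-left n x n block
X = M[:n, n]     # last column without the corner entry
lam = M[n, n]    # the corner entry lambda

# M satisfies (a): symmetric and orthogonal
assert np.allclose(M, M.T)
assert np.allclose(M @ M.T, np.eye(n + 1))
# ... and its blocks satisfy (b): A = ^tA and A ^tA + X ^tX = I_n
assert np.allclose(A, A.T)
assert np.allclose(A @ A.T + np.outer(X, X), np.eye(n))
```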
What about (b) $\Rightarrow$ (a) ?
Note first that if $X=0$, then (b) says $A={}^tA$ and $A^2=I_n$, so taking $\lambda=\pm 1$ makes $M(\lambda)$ symmetric with $M(\lambda)^2=I_{n+1}$, and we're done.
Suppose now that $X\ne 0$. Comparing with the block computation above, it suffices to find $\lambda$ such that $AX=-\lambda X$ and $\|X\|^2+\lambda^2=1$ (the off-diagonal block $AX+\lambda X$ and the corner entry ${}^tXX+\lambda^2$ are the only conditions not already granted by (b)). From (b), $A^2X=(I_n-X{}^tX)X=(1-\|X\|^2)X$, so $X$ is an eigenvector of $A^2$ with eigenvalue $1-\|X\|^2$; since ${}^tXA^2X=\|AX\|^2\ge 0$, this eigenvalue is nonnegative, i.e. $1-\|X\|^2\ge 0$. Next, $X$ is in fact an eigenvector of $A$ itself: $A$ commutes with $A^2=I_n-X{}^tX$, hence with $X{}^tX$, so $(AX){}^tX=X{}^tXA=X\,{}^t(AX)$ (using $A={}^tA$). Applying both sides to $X$ gives $\|X\|^2\,AX=({}^tXAX)\,X$, and since $X\ne 0$ we get $AX=cX$ with $c={}^tXAX/\|X\|^2$. Then $c^2X=A^2X=(1-\|X\|^2)X$, so $c^2=1-\|X\|^2$. Taking $\lambda=-c$ gives $AX=-\lambda X$ and $\lambda^2+\|X\|^2=1$. We're done.
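The construction in the $X\ne 0$ case can also be checked numerically. Here is a sketch with an illustrative choice of my own: for $s\in(0,1)$, take $A=\operatorname{diag}(\sqrt{1-s^2},\,1)$ and $X=(s,0)^t$, which satisfy (b) and have $X$ as an eigenvector of $A$.

```python
import numpy as np

# A concrete instance of (b) => (a) (an illustrative choice, not from the
# proof itself): with s in (0,1), A = diag(sqrt(1-s^2), 1) and X = (s, 0)^t
# satisfy (b), and X is an eigenvector of A with eigenvalue c = sqrt(1-s^2).
s = 0.8
c = np.sqrt(1 - s**2)
A = np.diag([c, 1.0])
X = np.array([s, 0.0])
assert np.allclose(A @ A.T + np.outer(X, X), np.eye(2))   # (b) holds

# Take lambda = -c, the negative of the eigenvalue of A on X;
# equivalently lambda = -(^tX A X) / ||X||^2.
lam = -(X @ (A @ X)) / (X @ X)
M = np.block([[A, X[:, None]], [X[None, :], lam]])

assert np.allclose(M, M.T)              # M(lambda) is symmetric
assert np.allclose(M @ M.T, np.eye(3))  # M(lambda) is orthogonal
```

Varying `s` (or replacing `A` by any symmetric matrix acting as $\pm\sqrt{1-s^2}$ on $X$ and as $\pm 1$ on its orthogonal complement) gives further instances.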