Proving: $\operatorname{Proj}_{U^\perp}(x)=-\frac1{\det(A^TA)} X(u_1,\ldots, u_{n-2}, X(u_1,\ldots, u_{n-2}, x))$


The problem I'm trying to solve is as follows, which was posed to me by my professor as an exercise:

Let $x, u_i \in \Bbb R^n$ for $1 \le i \le n-2$, let $A = (u_1, u_2, \ldots, u_{n-2})$, and suppose $\{u_1, u_2, \ldots, u_{n-2}\}$ is linearly independent. Let $U = \text{Col}(A)$. Show that $\operatorname{Proj}_{U^\perp}(x) =- \frac1{\det(A^{T}A)} X(u_1,\ldots, u_{n-2}, X(u_1, \ldots, u_{n-2}, x))$.


Here is my proof so far:

We want to show that $$x - \operatorname{Proj}_U(x) = -\frac1{\det(A^TA)} X(u_1,\ldots, u_{n-2}, X(u_1,\ldots, u_{n-2}, x)).$$ Equivalently, $\begin{aligned}x - A(A^TA)^{-1}A^Tx &= -\frac1{\det(A^{T}A)} X(u_1, \ldots, u_{n-2}, X(u_1,\ldots, u_{n-2}, x)) \\\iff A\text{ adj}(A^TA)A^Tx - x\det(A^TA) &= X(u_1,\ldots, u_{n-2}, X(u_1, \ldots, u_{n-2}, x)).\end{aligned}$
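As a sanity check (not part of the proof), the identity can be verified numerically. Below is a quick NumPy sketch in $\Bbb R^4$; the helper `cross` is my own implementation of the generalized cross product $X$, computed component-wise from its determinant characterization:

```python
import numpy as np

def cross(*vs):
    """Generalized cross product X(z_1, ..., z_{n-1}) in R^n, computed
    component-wise via <X, e_i> = det([z_1 | ... | z_{n-1} | e_i])."""
    Z = np.column_stack(vs)
    n = Z.shape[0]
    return np.array([np.linalg.det(np.column_stack([Z, np.eye(n)[:, i]]))
                     for i in range(n)])

rng = np.random.default_rng(0)
n = 4
us = [rng.standard_normal(n) for _ in range(n - 2)]  # u_1, ..., u_{n-2}
x = rng.standard_normal(n)

A = np.column_stack(us)
# Left-hand side: x - Proj_U(x), via the standard normal-equations formula
proj = x - A @ np.linalg.solve(A.T @ A, A.T @ x)
# Right-hand side: the claimed cross-product formula
claimed = -cross(*us, cross(*us, x)) / np.linalg.det(A.T @ A)
print(np.allclose(proj, claimed))  # True
```

For $n=3$ this reduces to the familiar identity $u_1\times(u_1\times x) = (u_1\cdot x)u_1 - \|u_1\|^2 x$ divided by $-\|u_1\|^2$.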

Now, I've used a fact proven in class that $$\begin{aligned}&\ X(u_1, \ldots,u_{n-2}, X(u_1,\ldots, u_{n-2}, x))\\&=\left(\sum\limits_{i=1}^{n-2} (-1)^{n+i} \det((B^TA)^{(i)})u_i\right) - \det((B^TA)^{(n-1)})x\end{aligned}$$

where $B = (u_1, u_2, \ldots, u_{n-2}, x)$ and $(B^TA)^{(i)}$ is obtained by removing the $i$-th row of $B^TA$.

Observing that $(B^TA)^{(n-1)} = A^TA$, we can rewrite the goal, so now we need to show that $$A\text{ adj}(A^TA)A^Tx = \left(\sum\limits_{i=1}^{n-2} (-1)^{n+i}\det((B^TA)^{(i)})u_i\right).$$

This is about where I am out of ideas on how to proceed. I think I am onto something, but I am not sure how to prove this last goal.


Any observations, hints, or solutions would be very much appreciated!

Best answer:

The (orthogonal) projection $\operatorname{proj}_V$ onto a linear subspace $V\subset \mathbb R^n$ is uniquely determined by the following properties:

$$ 0. \operatorname{proj}_V \text{ is a linear map} \qquad 1.\; \operatorname{proj}_V\Big\vert_{V} = \operatorname{id}_{V} \qquad 2.\; \operatorname{proj}_V\Big\vert_{V^\perp} = 0_{V^\perp}$$
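For intuition, these properties are easy to check numerically for the standard projection matrix $A(A^TA)^{-1}A^T$ onto $V = \operatorname{Col}(A)$; a small NumPy sketch with random data (names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3
A = rng.standard_normal((n, k))          # columns span V
P = A @ np.linalg.solve(A.T @ A, A.T)    # orthogonal projection onto V

v = A @ rng.standard_normal(k)           # arbitrary vector in V
Q, _ = np.linalg.qr(A, mode='complete')  # last n-k columns span V^perp
w = Q[:, k:] @ rng.standard_normal(n - k)  # arbitrary vector in V^perp

print(np.allclose(P @ v, v))   # property 1: identity on V -> True
print(np.allclose(P @ w, 0))   # property 2: zero on V^perp -> True
```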

We can easily verify that your function satisfies these properties. I assume that by $X$ you mean the generalized cross product: the alternating, multilinear map that sends $n-1$ vectors in $\mathbb R^n$ to a vector orthogonal to all of them, and to $0$ if they happen to be linearly dependent. To distinguish it I will write $\times$ instead. This map is uniquely determined by the property:

$$ \forall y: \langle \underbrace{\times(z_1, \ldots, z_{n-1})}_{=:z_n}\mid y \rangle = \det([z_1|\ldots|z_{n-1}|y]) $$

In particular, it follows that $\|z_n\|^2 =\langle z_n\mid z_n\rangle =\det([z_1|\ldots|z_{n-1}|z_n])$, i.e. $\|z_n\| = \det([z_1|\ldots|z_{n-1}|\hat e_{z_n}])$ with $\hat e_{z_n} = z_n\big/\|z_n\|$.
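Both the defining property and the norm identity are easy to confirm numerically; the helper `cross` below is my own implementation of $\times$ built directly from the determinant characterization:

```python
import numpy as np

def cross(*vs):
    """X(z_1, ..., z_{n-1}) in R^n, via <X, e_i> = det([z_1|...|z_{n-1}|e_i])."""
    Z = np.column_stack(vs)
    n = Z.shape[0]
    return np.array([np.linalg.det(np.column_stack([Z, np.eye(n)[:, i]]))
                     for i in range(n)])

rng = np.random.default_rng(1)
n = 5
zs = [rng.standard_normal(n) for _ in range(n - 1)]
zn = cross(*zs)
y = rng.standard_normal(n)

# Defining property: <X(z_1,...,z_{n-1}) | y> = det([z_1|...|z_{n-1}|y])
print(np.isclose(zn @ y, np.linalg.det(np.column_stack(zs + [y]))))    # True
# Hence ||z_n||^2 = det([z_1|...|z_{n-1}|z_n])
print(np.isclose(zn @ zn, np.linalg.det(np.column_stack(zs + [zn]))))  # True
```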

Part 0. Linearity in $x$ follows directly from the multilinearity of $\times$ in its last argument.

Part 1. Assume $x\in U$. Then the $n-1$ arguments $u_1, \ldots, u_{n-2}, x$ are linearly dependent, so $\times(u_1, u_2, \ldots, u_{n-2}, x)=0$ and consequently $\operatorname{proj}_{U^\perp}(x)=0$.
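Concretely, for $x \in U$ the inner cross product already vanishes; a quick NumPy check (using my own `cross` implementation of $\times$ via the determinant property):

```python
import numpy as np

def cross(*vs):
    """X(z_1, ..., z_{n-1}) in R^n, via <X, e_i> = det([z_1|...|z_{n-1}|e_i])."""
    Z = np.column_stack(vs)
    n = Z.shape[0]
    return np.array([np.linalg.det(np.column_stack([Z, np.eye(n)[:, i]]))
                     for i in range(n)])

rng = np.random.default_rng(3)
n = 4
us = [rng.standard_normal(n) for _ in range(n - 2)]
x = 2.0 * us[0] - 3.0 * us[1]             # x in U = span(u_1, u_2)
print(np.allclose(cross(*us, x), 0))      # dependent arguments -> True
```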

Part 2. Assume $x\in U^{\perp}$ with $x \neq 0$ (the case $x = 0$ is trivial by linearity). Since $\dim U^\perp = 2$, we can choose $z \in U^\perp$ with $z\perp x$ such that $\{\hat e_x,\hat e_z\}$ is an orthonormal basis of $U^\perp$. Then:

  1. $\times(u_1,\ldots, u_{n-2}, \hat e_x) = \lambda \hat e_z$ with $\lambda=\det([u_1|\ldots| u_{n-2}| \hat e_x| \hat e_z])$
  2. $\times(u_1,\ldots, u_{n-2}, \hat e_z) = \mu \hat e_x$ with $\mu =\det([u_1|\ldots| u_{n-2}| \hat e_z | \hat e_x]) = -\lambda$

$$\begin{aligned} \operatorname{proj}_{U^\perp}(x) &= -\frac{1}{\det(A^T\!A)}\times(u_1,\ldots, u_{n-2}, \times(u_1,\ldots, u_{n-2}, x)) \\&\overset{(1)}{=} -\frac{\|x\|\lambda}{\det(A^T\!A)}\times(u_1,\ldots, u_{n-2}, \hat e_z) \\&\overset{(2)}{=} +\frac{\|x\|\lambda^2}{\det(A^T\!A)}\hat e_x = x \end{aligned}$$

where the last step uses $\lambda^2 = \det(A^TA)$, which follows, with $C = [u_1|\ldots| u_{n-2}| \hat e_x| \hat e_z]$ and noting that $\hat e_x, \hat e_z$ are orthonormal and orthogonal to every $u_i$, from

$$ \lambda^2 = \det(C)^2 = \det(C^T)\det(C) = \det(C^TC) = \det\bigg(\begin{array}{c|c}A^T A & 0 \\\hline 0 & I_2\end{array}\bigg) = \det(A^T A) $$
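This last computation can also be checked numerically; the sketch below builds an orthonormal basis $\hat e_x, \hat e_z$ of $U^\perp$ from a complete QR factorization (a construction of my own, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
A = rng.standard_normal((n, n - 2))      # columns u_1, ..., u_{n-2}
Q, _ = np.linalg.qr(A, mode='complete')  # last 2 columns span U^perp
e_x, e_z = Q[:, n - 2], Q[:, n - 1]      # orthonormal basis of U^perp

C = np.column_stack([A, e_x, e_z])       # C = [u_1|...|u_{n-2}|e_x|e_z]
lam = np.linalg.det(C)
print(np.isclose(lam**2, np.linalg.det(A.T @ A)))  # True
```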