Unable to understand this multiple select question from linear algebra


We have a nonzero vector $v \in \mathbb R^k$ (a $k$-tuple). We define the following matrix:

$$P=I-2\frac{vv^T}{v^Tv}$$

Here, $I$ is the $k \times k$ identity matrix. Which of the following options are correct?

  1. Inverse of $P$ is equal to $I-P$.
  2. $-1$ and $1$ will be two eigenvalues of the matrix $P$.
  3. Inverse of $P$ is equal to $P$.
  4. $(1+P)v=v$.

Any initial hint about what the matrix $P$ represents will be appreciated. I can choose the correct options once I know what $P$ represents, but I am not able to work it out. Please help. Thanks in advance.

There are 3 answers below.

---

$$P(I-P)=\left(I-2\frac{v v^T}{v^T v}\right)\left(2\frac{v v^T}{v^T v}\right)=2\frac{v v^T}{v^T v}-4\frac{v v^T v v^T}{(v^T v)^2}=2\frac{v v^T}{v^T v}-4\frac{v v^T}{v^T v}=-2\frac{v v^T}{v^T v}\ne I,$$
so option 1 is false. (Here we used that $v^T v$ is a scalar, so $v v^T v v^T = (v^T v)\, v v^T$.)
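This computation is easy to sanity-check numerically. The sketch below (not part of the original answer; it just builds $P$ for an arbitrary random $v$) confirms that $P(I-P) = -2\,vv^T/(v^Tv) \ne I$, while $P^2 = I$, so $P$ is its own inverse:

```python
import numpy as np

# Pick an arbitrary nonzero v and build P = I - 2 vv^T / (v^T v).
rng = np.random.default_rng(0)
k = 5
v = rng.standard_normal(k)
P = np.eye(k) - 2.0 * np.outer(v, v) / (v @ v)

# P(I - P) equals -2 vv^T / (v^T v), not I, so option 1 fails ...
lhs = P @ (np.eye(k) - P)
rhs = -2.0 * np.outer(v, v) / (v @ v)
print(np.allclose(lhs, rhs))                          # True
print(np.allclose(lhs, np.eye(k)))                    # False

# ... while P @ P = I, so P is its own inverse (option 3 holds).
print(np.allclose(P @ P, np.eye(k)))                  # True
```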

---

I believe your problem can be solved without knowing what the matrix $P$ looks like. For example, you can check directly whether $(I-P)P = I$ or not.
For the eigenvalues, ask yourself whether it is possible to find a vector $v \ne 0$ such that $$Pv = \lambda v$$ with $\lambda = \pm 1$.
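Following this hint, a short numerical experiment (an illustrative sketch with a random $v$) shows that $v$ itself is an eigenvector with eigenvalue $-1$, any vector orthogonal to $v$ is an eigenvector with eigenvalue $+1$, and the full spectrum of $P$ is one $-1$ and $k-1$ copies of $+1$:

```python
import numpy as np

rng = np.random.default_rng(1)
k = 4
v = rng.standard_normal(k)
P = np.eye(k) - 2.0 * np.outer(v, v) / (v @ v)

# Pv = -v: v is an eigenvector with eigenvalue -1.
print(np.allclose(P @ v, -v))                         # True

# Any w orthogonal to v satisfies Pw = w (eigenvalue +1).
w = rng.standard_normal(k)
w -= (v @ w) / (v @ v) * v    # project out the v-component, so w is orthogonal to v
print(np.allclose(P @ w, w))                          # True

# Full spectrum: one -1 and (k-1) copies of +1. P is symmetric, so eigvalsh applies.
eigvals = np.sort(np.linalg.eigvalsh(P))
print(np.allclose(eigvals, [-1.0] + [1.0] * (k - 1))) # True
```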

---

There is a way to think about this problem geometrically which makes the solution immediate.

Let $v \cdot w$ be the usual dot product on $\mathbb R^n$ (the same argument works with $n = k$). Notice for column vectors $$v = \begin{pmatrix} v_1 \\ \vdots \\ v_n \end{pmatrix}, w = \begin{pmatrix} w_1 \\ \vdots \\ w_n\end{pmatrix}$$ we have

$$v^Tw = \begin{pmatrix} v_1 & \cdots & v_n \end{pmatrix} \begin{pmatrix} w_1 \\ \vdots \\ w_n \end{pmatrix}= v \cdot w$$

Then, since matrix multiplication is associative, we have $(vv^T)w = v(v^Tw) = (v \cdot w)v$. And therefore, $P$ can be interpreted as the linear transformation $T$ given by

$$T(w) = w - 2\frac{v \cdot w}{v \cdot v}v$$
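The identification of $P$ with $T$ can be verified numerically. The sketch below (an illustration with random vectors, not part of the original argument) checks both the associativity step $(vv^T)w = (v \cdot w)v$ and that $Pw$ agrees with the formula for $T(w)$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
v = rng.standard_normal(n)
w = rng.standard_normal(n)
P = np.eye(n) - 2.0 * np.outer(v, v) / (v @ v)

# Associativity step: (v v^T) w = (v . w) v
print(np.allclose(np.outer(v, v) @ w, (v @ w) * v))   # True

# Matrix-vector product Pw matches T(w) = w - 2 (v . w)/(v . v) v
Tw = w - 2.0 * (v @ w) / (v @ v) * v
print(np.allclose(P @ w, Tw))                         # True
```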

If you define $E$ to be the orthogonal complement of $v$ in $\mathbb R^n$, then $\mathbb R^n$ is the direct sum of $E$ and $\mathbb Rv$, and you can easily check that $T$ is the unique linear transformation which fixes $E$ pointwise and sends $v$ to $-v$.

Therefore, if $v_2, ... , v_n$ is a basis for $E$, and we set $v_1 = v$, then $v_1, ... , v_n$ is a basis for $\mathbb R^n$, and the matrix of $T$ with respect to this basis is

$$Q =\begin{pmatrix} -1 \\ & 1 \\ & & \ddots \\ & & & 1 \end{pmatrix}$$

Thus $Q = XPX^{-1}$ for some invertible matrix $X$, and you can deduce which properties are true by looking at $Q$ instead of $P$.
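One concrete choice of basis change can be built numerically (a sketch; the basis of $E$ here is extracted from a QR factorization, which is just one convenient way to get an orthonormal basis of $v^\perp$). Writing the basis vectors $v_1 = v, v_2, \dots, v_n$ as the columns of a matrix $B$, the matrix of $T$ in that basis is $B^{-1}PB$, which indeed equals $Q$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
v = rng.standard_normal(n)
P = np.eye(n) - 2.0 * np.outer(v, v) / (v @ v)

# Complete v to a basis: the last n-1 columns of a complete QR of v
# form an orthonormal basis of E = v-perp.
Qfull, _ = np.linalg.qr(v.reshape(-1, 1), mode="complete")
B = np.column_stack([v, Qfull[:, 1:]])   # columns v_1 = v, v_2, ..., v_n

# Matrix of T in this basis: diag(-1, 1, ..., 1).
# (In the text's notation, X = B^{-1} gives Q = X P X^{-1}.)
Q = np.linalg.inv(B) @ P @ B
expected = np.diag([-1.0] + [1.0] * (n - 1))
print(np.allclose(Q, expected))                       # True
```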

The kind of linear transformation you have encountered is called a reflection (specifically, a Householder reflection). It fixes a hyperplane and sends any vector orthogonal to that hyperplane to its negative.