Need some clarification on the definition of "generalised inverse."


Statement*: $A^+y=x$ if $Ax=y,\ x \in R(A^T)$, and $A^+y=0$ if $y \in N(A^T)$, where $A^+$ denotes the Moore-Penrose inverse (generalised inverse) and $R(A)$ is the column space of the rectangular matrix $A \in \mathbb{R}^{m \times n}$.

Throughout this question, $A \in \mathbb{R}^{m \times n}$.

My understanding:

$Ax=y$, or equivalently $A:\mathbb{R}^{n} \to \mathbb{R}^{m}$, where $y \in R(A)$ and $x \in \mathbb{R}^{n}$.

Observation 1: In general, if $Ax=y$, then $A$ maps every vector of $\mathbb{R}^n$ (row space and null space alike) into the column space, but this mapping is one-to-one only if $\ker(A)=\{0\}$, i.e. only if $A$ has full column rank. (Is this correct?)

Observation 2: But given the statement*, if $x$ is in the row space of $A$, can we say that $A^+$ maps each vector of the column space to the row space uniquely? (More precisely, $A^+$ acts as an inverse transformation of $A$, mapping each column-space vector to a unique row-space vector.) And every vector in the left null space is mapped to the zero vector (of the codomain $\mathbb{R}^n$).
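Observation 2 can be checked numerically. Below is a small sketch (the rank-1 matrix is an assumed example, not from the statement) showing that $A^+$ recovers a row-space vector from its image in the column space, and sends a left-null-space vector to zero:

```python
import numpy as np

# A hypothetical rank-1 example: A maps its row space onto its column
# space; A^+ reverses that map and sends N(A^T) to the zero vector.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])          # rank 1, so N(A^T) is nontrivial
A_pinv = np.linalg.pinv(A)

# x in the row space R(A^T): row space here is span((1, 2))
x = np.array([1.0, 2.0])
y = A @ x                           # y lies in the column space
print(np.allclose(A_pinv @ y, x))   # A^+ y recovers x -> True

# y0 in the left null space N(A^T): orthogonal to col(A) = span((1,2,3))
y0 = np.array([2.0, -1.0, 0.0])
print(np.allclose(A.T @ y0, 0))     # confirms y0 is in N(A^T) -> True
print(np.allclose(A_pinv @ y0, 0))  # A^+ sends it to zero -> True
```

All three checks print `True`, matching the two clauses of the statement*.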

I have just started learning linear algebra from the basics; please help with the above observations.

Thank you.

Are the implications below related to these basic concepts of the generalized inverse of a matrix, and are they correct? If not, please offer a suggestion/hint. Thank you.

Let $A \in \mathbb{R}^{m \times n}$.

Note: In the standard conventions of the SVD, the columns of $V$ form orthonormal bases of the row space and null space of $A$, and the columns of $U$ form orthonormal bases of the column space and left null space of $A$.

Let $A=U\Sigma V^T$, where $U,V$ are orthogonal matrices.

$Ax=b \iff (U\Sigma V^T) x=b$, where $b \in Col(A)$ and $x \in \mathbb{R}^n = Row(A) \oplus Null(A)$ (a direct sum, not a union).

Using the SVD, $A^+=V\Sigma^+ U^T$.
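The formula $A^+=V\Sigma^+ U^T$ can be verified directly. A minimal sketch (the matrix entries are arbitrary, chosen only for illustration) building $\Sigma^+$ by inverting the nonzero singular values and comparing with NumPy's built-in pseudoinverse:

```python
import numpy as np

# Sketch: construct A^+ = V Sigma^+ U^T from the full SVD and check it
# against np.linalg.pinv. The test matrix is an arbitrary 4x3 example.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))      # full column rank almost surely

U, s, Vt = np.linalg.svd(A)          # full SVD: A = U @ Sigma @ Vt

# Sigma^+ has the transposed shape (3x4) with reciprocal singular values
Sigma_pinv = np.zeros((3, 4))
Sigma_pinv[:len(s), :len(s)] = np.diag(1.0 / s)

A_pinv = Vt.T @ Sigma_pinv @ U.T     # V Sigma^+ U^T
print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True
```

For a rank-deficient matrix the same construction works, except that only the nonzero singular values are inverted (the zero ones stay zero in $\Sigma^+$).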

So $A^+y=V\Sigma^+ U^Ty=x$. Hence can we justify Statement* as follows: $A^+$ maps the basis of the column space to the basis of the row space (the analogy is with any inverse function)?

And since $A^+y=0$ for $y \in N(A^T)$, can we say that $A^+$ maps the whole left null space $N(A^T)$ of $A$ to the zero vector of $\mathbb{R}^n$?

2nd version: we know $A:\mathbb{R}^{n} \to \mathbb{R}^{m}$

Since $Col(A) \subseteq \mathbb{R}^{m}$, from $Ax=y$ with $y \in Col(A)$ and $x \in R(A^T)$ we get $A^+Ax=A^+y \implies x=A^+y$, because $A^+A$ is the orthogonal projection matrix onto the row space of $A$ (so $A^+Ax=x$ precisely when $x \in R(A^T)$). So clearly $A^+$ acts as an inverse function, mapping the column space back onto the row space.
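The key fact used above, that $P=A^+A$ is the orthogonal projector onto $R(A^T)$, can be sketched numerically (the rank-1 matrix is an assumed example): $P$ is symmetric and idempotent, fixes row-space vectors, and annihilates $N(A)$.

```python
import numpy as np

# Check that P = A^+ A is the orthogonal projector onto R(A^T):
# symmetric, idempotent, identity on the row space, zero on N(A).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])       # rank 1; N(A) is 2-dimensional
P = np.linalg.pinv(A) @ A

print(np.allclose(P, P.T))            # symmetric -> True
print(np.allclose(P @ P, P))          # idempotent -> True

x_row = np.array([1.0, 2.0, 3.0])     # lies in R(A^T)
print(np.allclose(P @ x_row, x_row))  # projection fixes it -> True

x_null = np.array([2.0, -1.0, 0.0])   # A @ x_null = 0, so x_null in N(A)
print(np.allclose(P @ x_null, 0))     # projected to zero -> True
```

This is exactly why $A^+Ax=x$ holds only for $x \in R(A^T)$: on $N(A)$ the product $A^+A$ gives zero, not the identity.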

$A^+y=x$ if $Ax=y, x \in R(A^T).$

And $A^+$ maps the left null space to the zero vector:

$A^+y=0$ if $y \in N(A^T).$

I think this is a correct explanation of the given question.

Any suggestion is highly appreciated.