I'm reading the book *Convex Analysis and Optimization* by Prof. Bertsekas. In Example 2.2.1, there is the following description:

I don't know how to derive equation (2.2). Could anyone give a hint, please? Thanks!
UPDATE:
With the references kindly provided by KittyL, I have the following understanding:
The problem is to project the vector $-c$ onto the subspace $X=\{x \mid Ax=0\}$. Let $x^*$ denote the projection of $-c$ onto this subspace. Since $x^*$ lies in the subspace, we have $Ax^*=0$. The error vector $-c-x^*$ is perpendicular to every vector in the subspace, so $(-c-x^*)^Tx=0$ for all $x\in X$.
Equivalently, the problem is to project $-c$ onto the null space of $A$. The error vector $-c-x^*$ lies in the column space of $A^T$, while $x^*$ lies in the null space of $A$. The matrix that projects onto the column space of $A^T$ is $A^T(AA^T)^{-1}A$ (from Reference 1), so the matrix that projects onto the null space of $A$ is $I-A^T(AA^T)^{-1}A$ (from Reference 2). The projection of $-c$ onto the null space of $A$ is therefore $x^* = (I-A^T(AA^T)^{-1}A)(-c)$.
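As a quick sanity check of this formula, here is a minimal numerical sketch in NumPy (the matrix $A$ and vector $c$ below are made-up random data, assuming $A$ has full row rank so $AA^T$ is invertible):

```python
import numpy as np

# Hypothetical example data: A is 2x4 with full row rank, c is a 4-vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))
c = rng.standard_normal(4)

# Projection matrix onto the null space of A: P = I - A^T (A A^T)^{-1} A
P = np.eye(4) - A.T @ np.linalg.inv(A @ A.T) @ A

# Projection of -c onto the null space of A
x_star = P @ (-c)

# x* lies in the null space of A ...
print(np.allclose(A @ x_star, 0))
# ... and the error vector -c - x* is orthogonal to the null space,
# in particular to x* itself.
print(np.allclose(x_star @ (-c - x_star), 0))
```

Both checks should print `True`, confirming the two defining properties of the projection: $Ax^*=0$ and $(-c-x^*)\perp x^*$.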
See this:
http://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/least-squares-determinants-and-eigenvalues/projections-onto-subspaces/MIT18_06SCF11_Ses2.2sum.pdf
It shows you how to project a vector onto the column space of $A$.
Then here:
http://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/least-squares-determinants-and-eigenvalues/projection-matrices-and-least-squares/MIT18_06SCF11_Ses2.3sum.pdf
This shows you how to project a vector onto the null space of $A$.
What you need is to project $-c$ onto the null space of $A$.