Let $\bf A$ be a real symmetric matrix of order $n\times n$ such that $\mathrm R(\mathbf A)=r(\le n)$. Then show that there exists a nonsingular matrix $\bf F$ such that $\bf F'AF=\begin{pmatrix}\bf I & \bf0 & \bf0 \\\bf0 & \bf-I & \bf0 \\\bf0 & \bf0 & \bf0 \\\end{pmatrix}$ where the orders of $\bf I$ and $\bf-I$ are respectively the number of positive and negative characteristic roots of $\bf A$, the sum of the orders being $r$. $[\mathrm R(\mathbf A)$ denotes the rank of $\bf A]$
(Attempt using hints):
Let $\mathbf A$ be nonsingular. Then as $\bf A$ is symmetric, by the Spectral decomposition theorem $\exists$ an orthogonal matrix $\bf P$ such that $\mathbf{P'AP}=\mathrm{diag}(\lambda_1,\lambda_2,...,\lambda_n)$ where $\lambda_1\ge...\ge\lambda_q>0>\lambda_{q+1}\ge...\ge\lambda_n$ are the characteristic roots of $\bf A$.
Define $\mathbf Q=\mathrm{diag}\left(\frac{1}{\sqrt{\lambda_1}},...,\frac{1}{\sqrt{\lambda_q}},\frac{1}{\sqrt{-\lambda_{q+1}}},...,\frac{1}{\sqrt{-\lambda_n}}\right)$.
Then $\det(\mathbf Q)=\prod_{i=1}^nq_{ii}\ne0$ as the diagonal entries $q_{ii}$ of $\bf Q$ are non-zero. So $\bf Q$ is nonsingular.
So, $\mathbf{Q'(P'AP)Q}=\mathbf{(PQ)'A(PQ)}=\begin{pmatrix}\bf I_q & \bf0 \\\bf0 & \bf-I_{n-q} \\\end{pmatrix}$.
Putting $\mathbf{PQ}=\bf F$, we can say that $\bf F$ is nonsingular being the product of two nonsingular matrices.
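As a numeric sanity check of this construction (not part of the proof), one can verify $\mathbf{F'AF}=\mathrm{diag}(\mathbf I_q,-\mathbf I_{n-q})$ with NumPy; the matrix below is a hypothetical example with two positive and one negative eigenvalue:

```python
import numpy as np

# Hypothetical nonsingular symmetric A with eigenvalues of both signs
A = np.array([[2.0, 1.0, 0.0],
              [1.0, -3.0, 1.0],
              [0.0, 1.0, 1.0]])

# Spectral decomposition: P orthogonal, P' A P = diag(lambda_i)
lam, P = np.linalg.eigh(A)
order = np.argsort(-lam)            # sort eigenvalues in decreasing order
lam, P = lam[order], P[:, order]

# Q = diag(1/sqrt(|lambda_i|)), F = P Q
Q = np.diag(1.0 / np.sqrt(np.abs(lam)))
F = P @ Q

# F' A F should equal diag(I_q, -I_{n-q}); here q = 2
D = F.T @ A @ F
```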
But I am not able to use a similar argument for the general case when $\bf A$ is singular.
In this case we have $\mathbf{P'AP}=\mathrm{diag}(\lambda_1,\lambda_2,...,\lambda_r,0,...,0)$ where $\lambda_1\ge...\ge\lambda_q>0>\lambda_{q+1}\ge...\ge\lambda_r$ are the non-zero eigenvalues of $\bf A$. But I cannot define $\bf Q$ in a similar manner as in the previous case so that $\mathbf{Q'(P'AP)Q}$ becomes $\mathrm{diag}(\mathbf I_q,\mathbf{-I_{r-q}},\mathbf0)$.
Is there a simpler approach for proving the result? A reference for the general proof would be great.
I have some further doubts regarding this result.
Let $\mathrm{R}(\mathbf A)=r(\le n)$. Then under the nonsingular transformation $\mathbf x\mapsto \mathbf y$ given by $\mathbf x=\mathbf{Py}$ $(\bf P$ is the same orthogonal matrix as before$)$, the quadratic form $\mathbf{x'Ax}$ is transformed to $\mathbf{y'(P'AP)y}=\sum_{i=1}^r \lambda_i y_i^2$, where $\mathbf y=(y_1,y_2,...,y_n)'$. This way we prove that any real quadratic form is diagonalisable.
In general we say that the quadratic form can be transformed to $\sum_{i=1}^r d_i y_i^2$ where $d_i>0$ when $\bf A$ is p.d. with full rank, and $d_i>0$ for $i=1,...,r$; $d_i=0$ for $i=r+1,...,n$ when $\bf A$ is p.s.d. with rank $r(<n)$.
But what are the $d_i$'s actually? Are they always the eigenvalues of $\bf A$? What is their connection with the matrix $\mathrm{diag}(\mathbf I,\mathbf{-I},\mathbf0)$? Does $d_i\in \{0,1,-1\}$ $\forall i=1,2,...,n$ ?
EDIT.
It appears that I am having trouble grasping the concept correctly.
Let $\mathbf D=\mathrm{diag}(d_1,...,d_r,0,...,0)$ be the diagonal form into which $\bf A$ is transformed.
Then $Q(\mathbf x)=\mathbf{x'Ax}$ is transformed to $Q(\mathbf y)=\mathbf{y'Dy}=\sum_{i=1}^rd_iy_i^2$.
So, if the $d_i$'s are changed to $d_i=\begin{cases}1, & \text{for } i=1,2,...,q \\-1, & \text{for } i=q+1,q+2...,r \\0, & \text{for } i=r+1,...,n\end{cases}$
then I indeed get the diagonal matrix $\mathbf D=\mathrm{diag}(\mathbf I,\mathbf{-I},\mathbf0)$, and the quadratic form $Q(\mathbf x)$ is transformed to $\mathbf{y'Dy}=y_1^2+...+y_q^2-y_{q+1}^2-...-y_r^2$.
My question then boils down to this: how can I change the $d_i$'s in the above manner? What is the justification?
If $A$ is a real and symmetric matrix, $A$ is Hermitian and hence diagonalisable with real eigenvalues. Congruence transformations include any basis change that effects a diagonalisation of $A$, but also those that subsequently scale its eigenvalues by positive factors. We can then set positive diagonal elements to $1$ and negative ones to $-1$ while preserving those that are zero.

(In your notation, we need to choose the diagonal entries of $Q$ as you have for non-zero eigenvalues, but our choices for zero eigenvalues are irrelevant (as long as they are non-zero), so we may as well set these entries in $Q$ to $1$.)

Finally, a rearrangement of rows and columns is also possible, thus collecting the $+1$ entries together, followed by the $-1$ entries, followed by the zero entries, giving the required result.
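To make this recipe concrete, here is a numeric sketch for the singular case (the rank-$2$ matrix below is a hypothetical example): set the entries of $Q$ to $1/\sqrt{|\lambda_i|}$ where $\lambda_i\ne0$ and to $1$ where $\lambda_i=0$, then permute rows and columns so the $+1$, $-1$ and $0$ blocks appear in that order.

```python
import numpy as np

# Hypothetical singular symmetric A: rank 2, eigenvalues 2, -2, 0
A = np.array([[1.0, 0.0, 1.0],
              [0.0, -2.0, 0.0],
              [1.0, 0.0, 1.0]])

lam, P = np.linalg.eigh(A)              # P orthogonal, P' A P = diag(lam)

# Scale non-zero eigenvalues to +/-1; the entries of Q at zero eigenvalues
# are irrelevant (any non-zero choice works), so set them to 1.
mask = np.isclose(lam, 0.0)
q_diag = np.ones_like(lam)
q_diag[~mask] = 1.0 / np.sqrt(np.abs(lam[~mask]))
Q = np.diag(q_diag)

# Permutation collecting the +1 entries first, then the -1's, then the 0's
signs = np.where(mask, 0, np.sign(lam)).astype(int)
rank = np.where(signs == 1, 0, np.where(signs == -1, 1, 2))
perm = np.argsort(rank, kind="stable")
Pi = np.eye(len(lam))[:, perm]          # permutation matrix

F = P @ Q @ Pi                          # nonsingular: product of nonsingular factors
D = F.T @ A @ F                         # should equal diag(I_q, -I_{r-q}, 0)
```

Note that $\Pi$ is orthogonal, so $\mathbf F=\mathbf{PQ\Pi}$ is still a single congruence transformation, and $\mathbf{F'AF}=\Pi'\,\mathrm{diag}(\pm1,0)\,\Pi$ merely reorders the diagonal.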