Let $A, B$ be a pair of real symmetric matrices, and let $$ \varphi_1(x) = (x, Ax)\\ \varphi_2(x) = (x, Bx). $$ It is a well-known result that if $A > 0$, then the two forms can be brought by a single change of variables to $$ \varphi_1 = \sum_{i=1}^n z_i^2\\ \varphi_2 = \sum_{i=1}^n \lambda_i z_i^2. $$ This corresponds to the following factorization of $A, B$: $$ A = SS^\top,\quad B = SDS^\top\\ D = \operatorname{diag} \lambda_i. $$ To obtain this form one can perform a Cholesky decomposition $A = LL^\top$ and an eigenvalue decomposition $L^{-1}BL^{-\top} = UDU^\top$ (the latter is legitimate because $(L^{-1}BL^{-\top})^\top = L^{-1}B^\top L^{-\top} = L^{-1}B L^{-\top}$, i.e. the matrix is symmetric). Thus $$ A = LL^\top = LUU^\top L^\top = (LU)(LU)^\top\\ B = LUDU^\top L^\top = (LU)D(LU)^\top\\ S = LU. $$
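For concreteness, here is a short numpy sketch of the construction above (the function name `simultaneous_diagonalize` is mine):

```python
import numpy as np

def simultaneous_diagonalize(A, B):
    """Given symmetric A > 0 and symmetric B, return S and lam such that
    A = S S^T and B = S diag(lam) S^T."""
    L = np.linalg.cholesky(A)          # A = L L^T
    Linv = np.linalg.inv(L)
    M = Linv @ B @ Linv.T              # symmetric, so eigh applies
    lam, U = np.linalg.eigh(M)         # M = U diag(lam) U^T
    return L @ U, lam                  # S = L U

# quick check on a random instance
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
A = X @ X.T + 4 * np.eye(4)            # symmetric positive definite
Y = rng.standard_normal((4, 4))
B = (Y + Y.T) / 2                      # symmetric
S, lam = simultaneous_diagonalize(A, B)
print(np.allclose(S @ S.T, A), np.allclose(S @ np.diag(lam) @ S.T, B))
```

The same $S$ then realizes both factorizations $A = SS^\top$ and $B = SDS^\top$ at once.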
I wonder whether a similar decomposition $$ A = SD_1S^\top\\ B = SD_2S^\top $$ can be obtained without the assumption $A > 0$. I tried an LDL decomposition of $A$ instead of Cholesky, but that does not seem to work.
Update. It seems that no such decomposition is possible for real $S$, which actually surprises me. E.g. $$ A = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}\\ B = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\\ S^{-1} = \begin{pmatrix} s_{11} & s_{12} \\ s_{21} & s_{22} \end{pmatrix}\\ S^{-1}AS^{-\top} = \begin{pmatrix} s_{11}^2 - s_{12}^2 & s_{11}s_{21}-s_{12}s_{22} \\ s_{11}s_{21}-s_{12}s_{22} & s_{21}^2 -s_{22}^2 \end{pmatrix}\\ S^{-1}BS^{-\top} = \begin{pmatrix} 2s_{11}s_{12} & s_{12}s_{21}+s_{11}s_{22} \\ s_{12}s_{21}+s_{11}s_{22} & 2s_{21}s_{22} \end{pmatrix}\\ s_{11}s_{21}-s_{12}s_{22} = 0\\ s_{12}s_{21}+s_{11}s_{22} = 0\\ s_{12}s_{21}-s_{11}s_{22} \neq 0 $$ Thus $$ s_{11}s_{21} = s_{12}s_{22} \implies s_{11}s_{12}s_{21} = s_{12}^2s_{22}\\ s_{12}s_{21} = -s_{11}s_{22} \implies s_{11}s_{12}s_{21} = -s_{11}^2s_{22}, $$ and comparing the right-hand sides gives $(s_{11}^2 + s_{12}^2)s_{22} = 0$. If $s_{11} = s_{12} = 0$, the first row of $S^{-1}$ vanishes; if $s_{22} = 0$, the two constraints force $s_{11}s_{21} = s_{12}s_{21} = 0$, so either the second row vanishes ($s_{21} = 0$) or again $s_{11} = s_{12} = 0$. Either way $S^{-1}$ is singular, so the system has no real solution. I wonder if it can be shown for general $A = A^\top, B = B^\top$.
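The same obstruction can be seen through the pencil: if $A = SD_1S^\top$ and $B = SD_2S^\top$ with $S$ (and hence $D_1$, since $A$ is invertible) invertible, then $A^{-1}B = S^{-\top}D_1^{-1}D_2S^\top$ is similar to a real diagonal matrix. A quick numpy check of the example above:

```python
import numpy as np

# If A = S D1 S^T and B = S D2 S^T with S, D1 invertible, then
# A^{-1}B = S^{-T} (D1^{-1} D2) S^T has real eigenvalues.
A = np.array([[1.0, 0.0], [0.0, -1.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
evs = np.linalg.eigvals(np.linalg.inv(A) @ B)
print(evs)  # purely imaginary (+/- i), so no real S exists
```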
Daniel gave a good answer. In particular, let $A=diag(I_p,-I_p)$ and let $B=[b_{i,j}]$ be the $2p\times 2p$ matrix whose entries are $0$ except that $b_{i,2p-i+1}=1$ for every $i$. Then $A+\lambda B$ is invertible for every real $\lambda$, and consequently $S$ does not exist: if $A=SD_1S^\top$ and $B=SD_2S^\top$, then $A+\lambda B=S(D_1+\lambda D_2)S^\top$, and since $B$ (hence $D_2$) is invertible, the real value $\lambda=-(D_1)_{11}/(D_2)_{11}$ would make $A+\lambda B$ singular.
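A symbolic sanity check of this example (with the arbitrary choice $p=2$): the determinant of $A+\lambda B$ has no real roots, so the pencil is never singular.

```python
import sympy as sp

p = 2
n = 2 * p
lam = sp.symbols('lambda', real=True)
A = sp.diag(*([1] * p + [-1] * p))
# B has ones on the antidiagonal: b_{i, 2p-i+1} = 1
B = sp.Matrix(n, n, lambda i, j: 1 if j == n - 1 - i else 0)
d = sp.factor((A + lam * B).det())
real_roots = sp.solveset(d, lam, sp.S.Reals)
print(d, real_roots)  # det is a power of (lambda^2 + 1); no real roots
```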
Yet the converse is false: singularity of $A+\lambda B$ for some real $\lambda$ does not imply that $S$ exists. Indeed, let $A=diag(1,1,-1),B=\begin{pmatrix}0&0&1\\0&1&0\\1&0&0\end{pmatrix}$; then $A-B$ is singular, and (using Maple) there is no real invertible $S$ s.t. $SAS^T,SBS^T$ are diagonal.
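This pair can also be checked numerically: $A-B$ is indeed singular, yet $A^{-1}B$ has non-real eigenvalues, which is incompatible with simultaneous congruence diagonalization (as in Proposition 2 below).

```python
import numpy as np

A = np.diag([1.0, 1.0, -1.0])
B = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])
print(np.linalg.det(A - B))                      # ~0: A - B is singular
evs = np.linalg.eigvals(np.linalg.inv(A) @ B)
print(evs)                                       # includes non-real eigenvalues
```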
EDIT. There is an interesting result due to E. Pesonen and J. Milnor, and another due to Greub and O. Taussky.
Proposition 1. If $n\geq 3$ and the only $x\in \mathbb{R}^n$ with $x^TAx=x^TBx=0$ is $x=0$, then there is a real invertible $S$ s.t. $SAS^T,SBS^T$ are diagonal.
Proof. According to Finsler's theorem, there are real $s,t$ s.t. $sA+tB>0$, which reduces the problem to the standard positive-definite case.
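The reduction can be illustrated numerically; `find_definite_combination` below is a hypothetical helper that locates $s,t$ by a crude grid search over the unit circle (Finsler's theorem guarantees success under the hypothesis of Proposition 1, but the search itself is just an illustration):

```python
import numpy as np

def find_definite_combination(A, B, num=3600):
    """Search s = cos(theta), t = sin(theta) for s*A + t*B > 0.
    Returns (s, t) or None if no grid point works."""
    for theta in np.linspace(0.0, 2 * np.pi, num, endpoint=False):
        s, t = np.cos(theta), np.sin(theta)
        if np.all(np.linalg.eigvalsh(s * A + t * B) > 0):
            return s, t
    return None

# example: neither A nor B is definite, but a combination is
A = np.diag([1.0, 1.0, -1.0])
B = np.diag([-1.0, 1.0, 2.0])
res = find_definite_combination(A, B)
print(res)
```

Once $sA+tB>0$ is found, one applies the Cholesky-based construction from the question to the pair $(sA+tB,\, B)$.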
Proposition 2. If $n\geq 3$ and $A$ is invertible, then there is a real invertible $S$ s.t. $SAS^T,SBS^T$ are diagonal IFF $A^{-1}B$ is similar to a real diagonal matrix IFF $A^{-1}B=P^{-1}T$ where $P,T$ are symmetric and $P>0$.
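The first criterion of Proposition 2 is easy to test symbolically; `prop2_criterion` is a name of my choosing:

```python
import sympy as sp

def prop2_criterion(A, B):
    """Check whether A^{-1}B is similar to a real diagonal matrix
    (assumes A symmetric invertible, B symmetric)."""
    M = A.inv() * B
    return bool(M.is_diagonalizable()) and all(ev.is_real for ev in M.eigenvals())

A1 = sp.diag(1, 1, -1)
B1 = sp.Matrix([[0, 0, 1], [0, 1, 0], [1, 0, 0]])
print(prop2_criterion(A1, B1))                     # False: matches the Maple check above
print(prop2_criterion(sp.eye(3), sp.diag(1, 2, 3)))  # True: the positive-definite case
```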