Suppose that $T$ is a symmetric and invertible matrix. Then the following equivalence holds:
\begin{equation}\label{eq:SDP}\left[\begin{array}{cc} X & A \\ A^{\top} & B \end{array}\right] \succeq 0 \Leftrightarrow\left[\begin{array}{cc} X & A T \\ T^{\top} A^{\top} & T B T^{\top} \end{array}\right] \succeq 0\end{equation}
I have an SDP constraint of the form on the left. Empirically, I have observed that some matrices $T$ lead to faster convergence of the semidefinite program than the trivial case $T = I$, where $I$ is the identity matrix.
My question is: Is it possible to systematically determine a matrix $T$ that will lead to faster convergence of the SDP?
I am guessing that pre- and post-multiplying changes the condition number of the constraint matrix. However, I can't see how that relates to faster convergence of the SDP. For reference, the right-hand matrix in \eqref{eq:SDP} arises from the congruence transformation
\begin{equation}\left[\begin{array}{cc} I & 0 \\ 0 & T \end{array}\right]\left[\begin{array}{cc} X & A \\ A^{\top} & B \end{array}\right]\left[\begin{array}{cc} I & 0 \\ 0 & T^{\top} \end{array}\right]\end{equation}
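The two observations above can be checked numerically: a congruence transformation $S M S^{\top}$ with invertible $S$ preserves the inertia of $M$ (Sylvester's law of inertia), so feasibility of the constraint is unchanged, while the condition number of the constraint matrix can change substantially. A minimal sketch in NumPy, with arbitrary illustrative data (the matrices $A$, $B$, $X$, and the diagonal scaling $T$ below are assumptions, chosen only so that the block matrix is positive definite):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
Bh = rng.standard_normal((n, n))
B = Bh @ Bh.T + n * np.eye(n)               # B positive definite
X = A @ np.linalg.inv(B) @ A.T + np.eye(n)  # Schur complement X - A B^{-1} A^T = I > 0

# The block matrix from the left-hand side of the constraint.
M = np.block([[X, A], [A.T, B]])

# A symmetric invertible scaling T (here diagonal, purely illustrative).
T = np.diag([1.0, 0.1, 10.0, 2.0])
S = np.block([[np.eye(n), np.zeros((n, n))],
              [np.zeros((n, n)), T]])
M_scaled = S @ M @ S.T                      # the right-hand side of the constraint

# Congruence preserves positive (semi)definiteness ...
print(np.linalg.eigvalsh(M).min() > 0, np.linalg.eigvalsh(M_scaled).min() > 0)
# ... but generally changes the condition number.
print(np.linalg.cond(M), np.linalg.cond(M_scaled))
```

This only demonstrates that the conditioning changes; it does not by itself answer how to pick $T$ to speed up an interior-point solver.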