Given two Hermitian, positive definite matrices $\mathbf{T_1}$ and $\mathbf{T_2}$, is it generally true that the matrix $\mathbf{G}_{ij} = \lvert\langle \mathbf{u}_i, \mathbf{v}_j \rangle\rvert^2$ of squared Euclidean inner products between the normalized eigenvectors of $\mathbf{T_1}$ and those of $\mathbf{T_2}$ is unique? I am especially worried about the case of repeated eigenvalues, because in that case the eigenbasis is not unique.
Edit: For the application I'm interested in, I only need to consider the absolute value of the inner product, $\lvert\langle \mathbf{u}_i, \mathbf{v}_j \rangle\rvert$.
Attempt at a solution: Suppose that both $\mathbf{T_1}$ and $\mathbf{T_2}$ have $n$ distinct eigenvalues $\lambda_1 > \lambda_2 > \cdots > \lambda_n$. In this case, each eigenbasis is unique up to multiplication of each eigenvector by a complex number on the unit circle. This implies that the moduli of the inner products are unique.
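To illustrate the distinct-eigenvalue case, here is a small numpy sketch (the matrix size and the random construction are my own choices, not from the question): it re-phases each eigenvector of $\mathbf{T_1}$ by an arbitrary unit-modulus scalar, which is exactly the remaining freedom when the eigenvalues are distinct, and checks that the matrix of squared moduli is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random Hermitian positive definite test matrices (almost surely with
# distinct eigenvalues).
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
T1 = A @ A.conj().T + n * np.eye(n)
T2 = B @ B.conj().T + n * np.eye(n)

_, U = np.linalg.eigh(T1)   # columns are orthonormal eigenvectors of T1
_, V = np.linalg.eigh(T2)

# Re-phase each eigenvector of T1 by an arbitrary unit-modulus scalar.
phases = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=n))
U_rephased = U * phases     # multiplies column k by phases[k]

G = np.abs(U.conj().T @ V) ** 2            # G_ij = |<u_i, v_j>|^2
G_rephased = np.abs(U_rephased.conj().T @ V) ** 2

print(np.allclose(G, G_rephased))          # prints True: the phases cancel
```

As a side remark, the rows (and columns) of $\mathbf{G}$ each sum to $1$, since $\sum_j \lvert\langle \mathbf{u}_i, \mathbf{v}_j\rangle\rvert^2 = \lVert\mathbf{u}_i\rVert^2 = 1$ for unitary eigenbases.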
The problem: What happens when one of the matrices has repeated eigenvalues? In that case, there is a whole eigenspace for that eigenvalue, and any orthonormal basis of that eigenspace consists of valid eigenvectors. What happens then to the inner products with the eigenvectors of the other matrix? Are they still unique?
Another Way of Looking at the Problem: Let $\mathbf{U}$ be the matrix whose columns are the eigenvectors of the positive definite, Hermitian matrix $\mathbf{T_1}$. Now, let us apply the matrix $\mathbf{U}$ as a unitary transformation to the matrix $\mathbf{T_2}$:
$\mathbf{\tilde{M}} = \mathbf{U}^{H}\mathbf{T_2}\mathbf{U}$
Now, for a certain application, I am interested in the diagonal elements $\tilde{\boldsymbol\mu}$ of this matrix $\mathbf{\tilde{M}}$, in relation to the diagonal elements of:
$\boldsymbol\Lambda = \mathbf{U}^{H}\mathbf{T_1}\mathbf{U}$
that is, the diagonal matrix of the eigenvalues of $\mathbf{T_1}$, which I will represent with the vector $\boldsymbol\lambda$. More precisely, I want to compute the inner product between these two vectors of diagonal elements, normalized by the norms of the true eigenvalue vectors: $\gamma = \frac{\boldsymbol\lambda\cdot \tilde{\boldsymbol\mu}}{\lVert\boldsymbol\lambda\rVert\,\lVert\boldsymbol\mu\rVert}$, where $\boldsymbol\mu$ denotes the vector of eigenvalues of $\mathbf{T_2}$.
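For concreteness, $\gamma$ can be computed as follows (a numpy sketch; the test matrices are arbitrary constructions of my own, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Arbitrary Hermitian positive definite test matrices.
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
T1 = A @ A.conj().T + n * np.eye(n)
T2 = B @ B.conj().T + n * np.eye(n)

lam, U = np.linalg.eigh(T1)        # eigenvalues and eigenvectors of T1
mu, _ = np.linalg.eigh(T2)         # eigenvalues of T2

# Conjugate T2 by the eigenbasis of T1 and take the diagonal;
# the diagonal is real because U^H T2 U is Hermitian.
mu_tilde = np.diag(U.conj().T @ T2 @ U).real

gamma = (lam @ mu_tilde) / (np.linalg.norm(lam) * np.linalg.norm(mu))
print(gamma)
```

Since both matrices are positive definite, $\tilde{\boldsymbol\mu}$ has positive entries and $\gamma$ is a positive real scalar.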
Now the problem is: is this inner product unique? I need it to be unique for the applications I am interested in to be valid.
Relation Between the Two Interpretations: After a rather tedious computation, I was able to express the inner product of the diagonal elements, $\gamma$, as a bilinear form in the eigenvalue vectors of $\mathbf{T_1}$ and $\mathbf{T_2}$, using the matrix $\mathbf{G}$ described above:
$\gamma = \frac{\boldsymbol\lambda^{H}\mathbf{G}\boldsymbol\mu}{\lVert\boldsymbol\lambda\rVert\,\lVert\boldsymbol\mu\rVert}$, where $\boldsymbol\lambda$ and $\boldsymbol\mu$ are the vectors of eigenvalues of $\mathbf{T_1}$ and $\mathbf{T_2}$, respectively. Now, if I could prove that $\mathbf{G}$ is unique, my problem would be solved, because the eigenvalues are unique.
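The identity follows from $\mathbf{T_2} = \mathbf{V}\operatorname{diag}(\boldsymbol\mu)\mathbf{V}^H$, which gives $\tilde\mu_i = \sum_j \lvert\langle \mathbf{u}_i, \mathbf{v}_j\rangle\rvert^2 \mu_j = (\mathbf{G}\boldsymbol\mu)_i$. It can be checked numerically (again a numpy sketch with random Hermitian positive definite test matrices of my own construction):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
T1 = A @ A.conj().T + n * np.eye(n)   # Hermitian positive definite
T2 = B @ B.conj().T + n * np.eye(n)

lam, U = np.linalg.eigh(T1)
mu, V = np.linalg.eigh(T2)
norms = np.linalg.norm(lam) * np.linalg.norm(mu)

# gamma computed from the diagonal of U^H T2 U ...
mu_tilde = np.diag(U.conj().T @ T2 @ U).real
gamma_diag = (lam @ mu_tilde) / norms

# ... and from the bilinear form with G_ij = |<u_i, v_j>|^2.
G = np.abs(U.conj().T @ V) ** 2
gamma_bilinear = (lam @ G @ mu) / norms

print(np.isclose(gamma_diag, gamma_bilinear))   # prints True
```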
Additional Question: If the matrix $\mathbf{G}$ is not unique for matrices with repeated eigenvalues, how large is the set of positive definite Hermitian matrices with repeated eigenvalues compared to the set of all positive definite Hermitian matrices? That way, I could at least argue that the case of repeated eigenvalues occurs infrequently enough that I do not need to worry about it.
When $T_1$ or $T_2$ has repeated eigenvalues, the eigenvectors are not unique up to scalar multiplication, so you cannot expect your $G$ to be unique, even if the eigenbases are required to be orthonormal. It is very easy to construct an example with multiple valid $G$s. For instance, consider $T_1=\operatorname{diag}(2,1)$ and $T_2=I_2$. Take $u_1=(1,0)^T$ and $u_2=(0,1)^T$. Now, \begin{cases} v_1=(1,0)^T,\ v_2=(0,1)^T\ \Rightarrow\ G=I_2,\\ v_1=(\tfrac1{\sqrt{2}},\tfrac1{\sqrt{2}})^T,\ v_2=(\tfrac1{\sqrt{2}},-\tfrac1{\sqrt{2}})^T\ \Rightarrow\ G=\frac12\pmatrix{1&1\\ 1&1}. \end{cases}
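This counterexample is easy to reproduce numerically (a small numpy sketch; any positive definite diagonal $T_1$ with distinct entries has the standard basis vectors as eigenvectors, so $T_1$ itself never appears in the computation of $G$):

```python
import numpy as np

# Eigenvectors of T1 = diag(2, 1): the standard basis (up to phase).
U = np.eye(2)

# T2 = I_2: every orthonormal basis is a valid eigenbasis.
V_a = np.eye(2)                                        # first choice
V_b = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # second choice

G_a = np.abs(U.T @ V_a) ** 2   # the identity matrix
G_b = np.abs(U.T @ V_b) ** 2   # every entry equals 1/2

print(G_a)
print(G_b)
```

Both choices of eigenbasis for $T_2$ are orthonormal, yet they yield different matrices $G$.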