Consider two real matrices $\boldsymbol{H}$ and $\boldsymbol{D}$ with the following properties:
- $\boldsymbol{H}$ is a symmetric matrix (since it is real, symmetry is equivalent to being Hermitian).
- $\boldsymbol{H}$ is strictly positive definite (all eigenvalues are strictly greater than zero).
- $\boldsymbol{D}$ is a diagonal matrix with strictly positive diagonal entries.
Question: I am trying to find out whether there exists a closed-form expression for the eigenvectors of $\boldsymbol{DHD}$ in terms of the eigenvectors and/or eigenvalues of $\boldsymbol{H}$. (To clarify: I am not interested in the similarity transform $\boldsymbol{DHD}^{-1}$.)
Comment: the matrix $\boldsymbol{H}$ is real symmetric and therefore orthogonally diagonalizable: there exists an orthogonal matrix $\boldsymbol{S}$ with $\boldsymbol{H}=\boldsymbol{S\Lambda S}^{-1}=\boldsymbol{S\Lambda S}^{T}$ for some diagonal matrix $\boldsymbol{\Lambda}$, and the eigenvectors of $\boldsymbol{H}$ are simply the columns of $\boldsymbol{S}$. Next, the matrix $\boldsymbol{DHD}$ is also real symmetric (since $\boldsymbol{D}$ is diagonal, $(\boldsymbol{DHD})^{T}=\boldsymbol{D}^{T}\boldsymbol{H}^{T}\boldsymbol{D}^{T}=\boldsymbol{DHD}$) and hence also diagonalizable, so that $\boldsymbol{DHD}=\boldsymbol{S_2\Lambda_2 S_2}^{-1}$. I was wondering if it is possible to relate $\boldsymbol{S}$ to $\boldsymbol{S}_2$ via some closed-form expression.
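As a numerical sanity check of the setup above (the matrices here are arbitrary examples I made up, not from the question), the following sketch builds a symmetric positive definite $\boldsymbol{H}$ and a positive diagonal $\boldsymbol{D}$, verifies both spectral decompositions, and shows that the naive guess $\boldsymbol{S_2}\approx\boldsymbol{DS}$ already fails: $\boldsymbol{DS}$ does not even have orthogonal columns in general, while $\boldsymbol{S_2}$ always does.

```python
import numpy as np

# Arbitrary example matrices (assumption: any SPD H and positive diagonal D will do).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H = A @ A.T + 4 * np.eye(4)          # symmetric positive definite
D = np.diag([1.0, 2.0, 3.0, 4.0])    # positive diagonal

lam, S = np.linalg.eigh(H)           # H = S diag(lam) S^T, with S orthogonal
M = D @ H @ D                        # also symmetric positive definite
lam2, S2 = np.linalg.eigh(M)         # M = S2 diag(lam2) S2^T

# Both spectral decompositions hold:
assert np.allclose(S @ np.diag(lam) @ S.T, H)
assert np.allclose(S2 @ np.diag(lam2) @ S2.T, M)

# But D S is not an eigenvector matrix of D H D: its Gram matrix
# S^T D^2 S is not diagonal, i.e. the columns of D S are not orthogonal.
DS = D @ S
print(np.allclose(DS.T @ DS, np.diag(np.diag(DS.T @ DS))))  # False for this example
```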
Update
The simplified explanation of the reason I am interested in this problem is as follows:
I have a vector $f(\boldsymbol{S})\boldsymbol{\mu}$ which is the mean of one random variable ($f(\cdot)$ is a known function), while a second random variable has mean $f_2(\boldsymbol{S_2})\boldsymbol{\mu}$ ($f_2(\cdot)$ is another known function). Ideally, I would like to express one in terms of the other, so that someone who knows $\boldsymbol{D}$, $\boldsymbol{S}$ and $\boldsymbol{\Lambda}$ can easily infer $\boldsymbol{S_2}$. I am also searching for expressions relating the individual entries of the two mean vectors; in particular, I would like to look at their entry-wise ratios. (This may be too hopeful.)
Indeed, there may be an easier way to tackle my original problem via probabilistic properties which I've omitted for simplicity. However, I think a question including the entire problem would lack specificity; this is simply one of the approaches I'm currently pursuing.
There can't be a closed-form expression here (for any meaning of "closed form" that is weaker than roots of sextic polynomials). For example, try
$$ H = \pmatrix{3 & 1 & 0 & 0 & 0 & 1\cr 1 & 3 & 1 & 0 & 0 & 0\cr 0 & 1 & 3 & 1 & 0 & 0\cr 0 & 0 & 1 & 3 & 1 & 0\cr 0 & 0 & 0 & 1 & 3 & 1\cr 1 & 0 & 0 & 0 & 1 & 3\cr},\ D = \text{diag}(1,2,3,4,5,6)$$
Then the nice circulant matrix $H$ has eigenvalues $1,2,2,4,4,5$, with nice eigenvectors whose entries are $\pm 1$ and $0$. But $DHD$ has characteristic polynomial $$\lambda^{6}-273\,\lambda^{5}+25507\,\lambda^{4}-1015971\,\lambda^{3}+17394856\,\lambda^{2}-111331584\,\lambda+165888000$$ which is irreducible over the rationals and has Galois group $S_6$, so its roots can't be expressed in radicals. Of course the eigenvalues can be obtained from the eigenvectors (e.g. via Rayleigh quotients), so somehow your "closed form" would have to take integer inputs and produce the roots of an unsolvable sextic polynomial.
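The claimed spectrum of $H$ and the characteristic polynomial of $DHD$ can be checked numerically; a sketch (rounding the floating-point coefficients back to integers):

```python
import numpy as np

# The answer's example: circulant H and D = diag(1,...,6).
H = np.array([[3, 1, 0, 0, 0, 1],
              [1, 3, 1, 0, 0, 0],
              [0, 1, 3, 1, 0, 0],
              [0, 0, 1, 3, 1, 0],
              [0, 0, 0, 1, 3, 1],
              [1, 0, 0, 0, 1, 3]], dtype=float)
D = np.diag([1.0, 2, 3, 4, 5, 6])

# H is circulant, so its eigenvalues are 3 + 2*cos(2*pi*k/6) for k = 0..5,
# i.e. {1, 2, 2, 4, 4, 5}.
assert np.allclose(np.sort(np.linalg.eigvalsh(H)), [1, 2, 2, 4, 4, 5])

# Characteristic polynomial of D H D (monic, coefficients in decreasing degree).
coeffs = np.round(np.poly(D @ H @ D)).astype(int)
print(coeffs)
# Expected: 1, -273, 25507, -1015971, 17394856, -111331584, 165888000
```

Two quick consistency checks: the $\lambda^5$ coefficient is $-\operatorname{tr}(DHD)=-3(1^2+\dots+6^2)=-273$, and the constant term is $\det(D)^2\det(H)=720^2\cdot 320=165888000$.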