I encountered the following problem in my research; could anyone please help?
Let $A(x) \in \mathbb{R}^{2 \times 2}$, $x\in \mathbb{R}$, be a symmetric matrix of the form \begin{align} A(x) = \left[ \begin{array}{cc} a_{1,1}(x) & a_{1, 2} (x)\\ a_{2,1}(x) & a_{2, 2} (x) \end{array}\right], \quad \text{where} \quad a_{1, 2}(x) = a_{2, 1}(x), \end{align} and let $D \in \mathbb{R}^{N \times N}$ be the positive semi-definite matrix \begin{align*} D = \left[ \begin{array} {ccccc} 2 & -1 & & & -1\\ -1 & 2 & -1 & & \\ & \ddots & \ddots & \ddots & \\ & & -1 & 2 & -1 \\ -1 & & & -1 & 2 \end{array}\right]. \end{align*} Here, all unspecified entries of $D$ are zero: $D$ has non-zero entries only on its three central diagonals, together with the corner entries $D(1, N) = D(N, 1) = -1$.
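As a quick numerical sanity check (not part of the question; a NumPy sketch, and the helper name `build_D` is mine), one can assemble $D$ and confirm it is positive semi-definite:

```python
import numpy as np

def build_D(N):
    """Circulant second-difference matrix: 2 on the diagonal, -1 on the
    first off-diagonals, and -1 in the (1, N) and (N, 1) corners."""
    D = 2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
    D[0, N - 1] = D[N - 1, 0] = -1.0
    return D

D = build_D(6)
mu = np.linalg.eigvalsh(D)
print(np.all(mu >= -1e-12))  # True: all eigenvalues of D are non-negative
```

Since $D$ is circulant, its eigenvalues are $2 - 2\cos(2\pi k/N) \geq 0$, so it is positive semi-definite (and singular: the constant vector lies in its kernel).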
Assume $A(x)$ is positive semi-definite for all $x \geq 0$, that is, the eigenvalue functions of $A(x)$ satisfy $\lambda_i(x) \geq 0$ for $i = 1, 2$. Is it true that the block matrix $A(D) \in \mathbb{R}^{2N \times 2N}$ of the form \begin{align} A(D) = \left[ \begin{array}{cc} a_{1,1}(D) & a_{1, 2} (D)\\ a_{2,1}(D) & a_{2, 2} (D) \end{array}\right] \end{align} also has non-negative eigenvalues? How can this be proved?
For example, \begin{align} A(x) = \left[ \begin{array}{cc} \frac{1}{2} x & x\\ x& 2x \end{array}\right],\quad A(D) = \left[ \begin{array}{cc} \frac{1}{2} D & D\\ D& 2D \end{array}\right]. \end{align} This $A(x)$ is positive semi-definite for $x\geq 0$. If the eigenvalues of $A(x)$ are $\lambda_1(x)$ and $\lambda_2(x)$, and the eigenvalues of $D$ are $\mu_k$, how does one compute the eigenvalues of $A(D)$? Is it true that the eigenvalues of $A(D)$ are $\lambda_i(\mu_k)$, $i = 1, 2$, $k = 1, \dots, N$?
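For what it's worth, the conjecture can be checked numerically for this example (a NumPy sketch; $N = 6$ is an arbitrary choice):

```python
import numpy as np

N = 6
# The circulant second-difference matrix D from the question.
D = 2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
D[0, N - 1] = D[N - 1, 0] = -1.0

# A(D) for A(x) = [[x/2, x], [x, 2x]], assembled as a 2N x 2N block matrix.
AD = np.block([[0.5 * D, D], [D, 2.0 * D]])

mu = np.linalg.eigvalsh(D)
# Collect lambda_i(mu_k): the eigenvalues of the 2 x 2 matrix A(mu_k),
# for every eigenvalue mu_k of D.
lam = np.concatenate([np.linalg.eigvalsh(np.array([[0.5 * m, m], [m, 2.0 * m]]))
                      for m in mu])

# Compare the spectrum of A(D) with the multiset {lambda_i(mu_k)}.
print(np.allclose(np.sort(np.linalg.eigvalsh(AD)), np.sort(lam)))  # True
```

Here the agreement is expected, since $A(D) = M \otimes D$ with $M = \bigl[\begin{smallmatrix} 1/2 & 1 \\ 1 & 2 \end{smallmatrix}\bigr]$, and the eigenvalues of a Kronecker product are the pairwise products of eigenvalues; the general case with non-commuting blocks is exactly what the question asks about.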
EDIT1: For a general symmetric $A(x) \in \mathbb{R}^{s\times s}$ and the matrix $D \in \mathbb{R}^{N \times N}$ above, let $\lambda_i(x)$ be the eigenvalue functions of $A(x)$, and denote $\Lambda(x) = \mathrm{diag}([\lambda_1(x), \dots, \lambda_{s}(x)])$. Then \begin{align*} A(x) & = P(x) \Lambda(x) P^{-1}(x), \\ A(D) & = P(D) \Lambda(D) P^{-1}(D). \end{align*} Let $\Gamma = \mathrm{diag}([\gamma_1, \dots, \gamma_{N}])$ contain the eigenvalues of the symmetric matrix $D \in \mathbb{R}^{N \times N}$, so that $D = F \Gamma F^{-1}$. Let $I_s$ and $I_N$ be the identity matrices in $\mathbb{R}^{s \times s}$ and $\mathbb{R}^{N \times N}$, respectively. Since each diagonal block of $\Lambda(D)$ satisfies $\lambda_i(F \Gamma F^{-1}) = F \lambda_i(\Gamma) F^{-1}$, it holds that \begin{align*} A(D) & = P(D) \Lambda(F \Gamma F^{-1}) P^{-1}(D) \\ & = P(D) (I_s \otimes F) \Lambda(\Gamma) (I_s \otimes F^{-1}) P^{-1}(D). \end{align*} Hence the eigenvalues of $A(D)$ are the diagonal entries of $\Lambda(\Gamma)$, that is, $\lambda_i(\gamma_j)$ for $i = 1, \dots, s$ and $j = 1, \dots, N$.
Are there any mistakes in the above derivation?
Thank you very much!