Two matrices with a bounded difference: convergence of their free energies.


I have two real, symmetric $2^{n} \times 2^{n}$ matrices $A$ and $B$. I know that ${\rm Max}(|A_{ij}|) = \mathcal{O}(n)$ and ${\rm Max}(|B_{ij}|) = \mathcal{O}(n)$ — i.e. the absolute value of their largest elements scales linearly in $n$ (note the matrices themselves are $2^{n} \times 2^{n}$). I also know that, in a certain basis, the difference matrix $D = A - B$ is diagonal and ${\rm Max}(|D_{ij}|) = \mathcal{O}(\sqrt{n})$.

I now wish to prove the following about the difference of their "free energies":

$\frac{1}{n}{\rm ln}\left(\frac{{\rm Tr}\left(\exp(\lambda A)\right)}{{\rm Tr}\left(\exp(\lambda B)\right)}\right) = \mathcal{O}\left(\frac{1}{\sqrt{n}}\right) \implies \lim_{n \rightarrow \infty}\frac{1}{n}{\rm ln}\left(\frac{{\rm Tr}\left(\exp(\lambda A)\right)}{{\rm Tr}\left(\exp(\lambda B)\right)}\right) = 0 \qquad \lambda \in \mathbb{R}$.

I am wondering if anyone has any insight on how to prove this? I am fairly convinced it is true but any counterexample would also be helpful.

I believe I can now answer my question.

Let us order the eigenvalues of $A$, $B$ and $D$ in non-decreasing order as $\lambda^{A/B/D}_{1}, \lambda^{A/B/D}_{2}, \ldots, \lambda^{A/B/D}_{2^{n}}$, with $\lambda^{A/B/D}_{i} \geq \lambda^{A/B/D}_{i-1}$.

By Weyl's inequality we have $\lambda^{B}_{i} + \lambda^{D}_{1} \leq \lambda^{A}_{i} \leq \lambda^{B}_{i} + \lambda^{D}_{2^{n}}$. Since $D$ is diagonal in some basis with entries of size $\mathcal{O}(\sqrt{n})$, its eigenvalues are those diagonal entries, so its minimum and maximum eigenvalues are both $\mathcal{O}(\sqrt{n})$, and hence $\vert \lambda^{A}_{i} - \lambda^{B}_{i} \vert \leq \mathcal{O}(\sqrt{n}) \ \forall i$. We can therefore write the eigenvalues of $B$ as $\lambda^{B}_{i} = \lambda^{A}_{i} + c_{i}$, where each $c_{i}$ is a number which can scale, at most, as $\mathcal{O}(\sqrt{n})$.
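As a numerical sanity check (not part of the proof), the Weyl step can be verified on small random matrices. Everything below — sizes, distributions, the seed — is purely illustrative:

```python
import numpy as np

# Illustrative check of Weyl's inequality for A = B + D with D diagonal:
# lam_B[i] + lam_D[0] <= lam_A[i] <= lam_B[i] + lam_D[-1]  (ascending order).
rng = np.random.default_rng(0)
dim = 64  # stands in for 2**n

B = rng.normal(size=(dim, dim))
B = (B + B.T) / 2                   # make B real symmetric
D = np.diag(rng.normal(size=dim))   # diagonal perturbation
A = B + D

# eigvalsh returns eigenvalues of a symmetric matrix in ascending order
lam_A = np.linalg.eigvalsh(A)
lam_B = np.linalg.eigvalsh(B)
lam_D = np.linalg.eigvalsh(D)

assert np.all(lam_A >= lam_B + lam_D[0] - 1e-9)
assert np.all(lam_A <= lam_B + lam_D[-1] + 1e-9)
# Consequence used in the proof: per-index gap bounded by max |eigenvalue of D|
print(np.max(np.abs(lam_A - lam_B)) <= np.max(np.abs(lam_D)) + 1e-9)
```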

Substituting this into the free-energy difference, we get

\begin{equation} \Delta f = \frac{1}{n}\ln \left( \frac{{\rm Tr}(\exp(\lambda A))}{{\rm Tr}(\exp(\lambda B))} \right) = \frac{1}{n}\ln \left( \frac{\sum_{i=1}^{2^{n}}\exp(\lambda \lambda^{A}_{i})}{\sum_{i=1}^{2^{n}}\exp(\lambda c_{i})\exp(\lambda \lambda^{A}_{i})} \right). \end{equation}

Since $\exp(-\vert\lambda\vert \max_{i}\vert c_{i}\vert) \leq \exp(\lambda c_{i}) \leq \exp(\vert\lambda\vert \max_{i}\vert c_{i}\vert)$ for every $i$, the ratio inside the logarithm is bounded above and below by $\exp(\pm\vert\lambda\vert\max_{i}\vert c_{i}\vert) = \exp(\pm\,\mathcal{O}(\sqrt{n}))$. We therefore get

\begin{equation} \vert \Delta f \vert \leq \frac{1}{n}\ln \left(\exp(\mathcal{O}(\sqrt{n})) \right) = \mathcal{O}\left(\frac{1}{\sqrt{n}}\right) \end{equation}

and the proof is complete.
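The full bound can also be checked numerically. The sketch below (illustrative sizes and distributions, fixed $\lambda$) compares $\vert\Delta f\vert$ against $\vert\lambda\vert\max_{i}\vert c_{i}\vert/n$, which is $\mathcal{O}(1/\sqrt{n})$ by construction:

```python
import numpy as np

def log_tr_exp(lam, M):
    """ln Tr exp(lam * M) from eigenvalues, stabilized via log-sum-exp."""
    e = lam * np.linalg.eigvalsh(M)
    m = e.max()
    return m + np.log(np.sum(np.exp(e - m)))

rng = np.random.default_rng(1)
lam = 0.7                                 # arbitrary fixed real lambda
for n in (4, 6, 8):                       # dimension 2**n, kept small for speed
    dim = 2 ** n
    B = rng.normal(size=(dim, dim))
    B = (B + B.T) / 2                     # symmetric B
    c = np.sqrt(n) * rng.uniform(-1.0, 1.0, size=dim)  # O(sqrt(n)) diagonal shift
    A = B + np.diag(c)                    # D = A - B is diagonal by construction
    delta_f = (log_tr_exp(lam, A) - log_tr_exp(lam, B)) / n
    bound = abs(lam) * np.max(np.abs(c)) / n           # = O(1/sqrt(n))
    print(n, abs(delta_f) <= bound + 1e-9)
```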

I would be grateful if anyone spots any errors.