Proving an optimization problem has a rational optimum.


Consider the function $$ J_\gamma(X) = \det\left( I - \tfrac{1}{\gamma^2} (A+BXC)^\mathsf{T}(A+BXC)\right) $$ where $A$, $B$, $C$, $X$ are matrices of real numbers. Further suppose that $B^\mathsf{T}B$ and $CC^\mathsf{T}$ are invertible; equivalently, $B$ has full column rank and $C$ has full row rank, which guarantees that the map $X \mapsto BXC$ is one-to-one. It is known that for $\gamma$ sufficiently large, $J_\gamma(X)$ is a concave function of $X$. Let's assume from now on that $\gamma$ is so chosen.
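For concreteness, here is a minimal numerical sketch of $J_\gamma$, together with a midpoint check of concavity at a large $\gamma$. All the matrices and numbers below are my own illustrative choices, not part of the problem:

```python
import numpy as np

def J(X, A, B, C, gamma):
    """J_gamma(X) = det(I - (1/gamma^2) (A + B X C)^T (A + B X C))."""
    S = A + B @ X @ C
    n = S.shape[1]
    return np.linalg.det(np.eye(n) - (S.T @ S) / gamma**2)

# Hypothetical 2x2 instance (numbers are mine, for illustration only)
A = np.array([[0.3, -0.4], [0.1, 0.2]])
B = np.array([[1.0, 0.1], [0.0, 1.0]])   # B^T B invertible
C = np.array([[1.0, 0.0], [0.2, 1.0]])   # C C^T invertible
gamma = 10.0

# Midpoint test of concavity: J((X1+X2)/2) >= (J(X1)+J(X2))/2
X1 = np.array([[0.5, -0.2], [0.1, 0.3]])
X2 = np.array([[-0.3, 0.4], [0.2, -0.1]])
mid = J((X1 + X2) / 2, A, B, C, gamma)
avg = (J(X1, A, B, C, gamma) + J(X2, A, B, C, gamma)) / 2
print(mid >= avg)   # concavity holds at this (large) gamma
```

A single midpoint check is of course not a proof of concavity, but it is a quick way to falsify a choice of $\gamma$ that is too small.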

I have been studying the solution to the following optimization problem. $$ X_\text{opt} = \arg\max_{X} J_\gamma(X) $$ Note that in general, $X_\text{opt}$ is some function of $A$, $B$, $C$, and $\gamma$. It turns out (I don't know how to prove this) that $X_\text{opt}$ is actually a rational function of $A$, $B$, $C$, and $\gamma$. By this, I mean that the optimal $X$ is a matrix whose entries are rational functions of $\gamma$ and the entries of $A$, $B$, and $C$.
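One possible starting point (my own sketch, not part of the original question): write down the stationarity condition with matrix calculus. Setting $S = A+BXC$ and $M = I - \gamma^{-2} S^\mathsf{T} S$, Jacobi's formula $d\det M = \det(M)\operatorname{tr}(M^{-1}\,dM)$ gives

```latex
% Gradient of J_gamma, with S = A + BXC and M = I - S^T S / gamma^2
\frac{\partial J_\gamma}{\partial X}
  = -\frac{2}{\gamma^2}\,\det(M)\, B^\mathsf{T} S\, M^{-1} C^\mathsf{T},
\qquad\text{so at a stationary point with } \det M \neq 0:\quad
B^\mathsf{T} (A+BXC)\, M^{-1} C^\mathsf{T} = 0.
```

Using $M^{-1} = \operatorname{adj}(M)/\det(M)$, the condition $B^\mathsf{T} S \operatorname{adj}(M)\, C^\mathsf{T} = 0$ is a system of polynomial equations in the entries of $X$, with coefficients rational in $\gamma$ and the entries of $A$, $B$, $C$. Proving rationality of $X_\text{opt}$ would then amount to showing this system (or the subsystem picked out by a sparsity constraint, which only zeroes the gradient entries corresponding to free entries of $X$) has a solution branch expressible rationally, e.g. by elimination.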

Now let's make things more interesting. Let's assume $X$ is block-triangular, so $$ X = \left[\begin{array}{cc} X_{11} & 0 \\ X_{21} & X_{22} \end{array}\right] $$ After some experimenting with Mathematica, I concluded that the optimal $X$ (subject to this sparsity constraint) is again a rational function of $A$, $B$, $C$, and $\gamma$...

Finally, I tried $$ X = \left[\begin{array}{cc} X_{11} & 0 \\ 0 & X_{22} \end{array}\right] $$ and now it's no longer true... the optimal $X$ is no longer a rational function. Something deeper must be going on here, but I'm not sure what it is. Also, any ideas on how I might go about proving any of the things I stated above would be helpful!
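For anyone wanting to reproduce this kind of experiment numerically rather than symbolically, here is a hedged Python sketch: projected gradient ascent on $J_\gamma$ under a sparsity mask, using the gradient $-\tfrac{2}{\gamma^2}\det(M)\,B^\mathsf{T} S M^{-1} C^\mathsf{T}$ (my own derivation via Jacobi's formula). The instance below is a small hypothetical one with scalar "blocks"; all numbers are illustrative:

```python
import numpy as np

def J(X, A, B, C, gamma):
    S = A + B @ X @ C
    n = S.shape[1]
    return np.linalg.det(np.eye(n) - (S.T @ S) / gamma**2)

def grad_J(X, A, B, C, gamma):
    # Jacobi's formula gives dJ = -(2/gamma^2) det(M) B^T S M^{-1} C^T
    S = A + B @ X @ C
    n = S.shape[1]
    M = np.eye(n) - (S.T @ S) / gamma**2
    return (-2.0 / gamma**2) * np.linalg.det(M) * (B.T @ S @ np.linalg.solve(M, C.T))

def maximize_masked(A, B, C, gamma, mask, steps=3000, lr=5.0):
    """Projected gradient ascent: entries of X outside the mask stay zero."""
    X = np.zeros(mask.shape)
    for _ in range(steps):
        X = X + lr * grad_J(X, A, B, C, gamma) * mask
    return X

# Hypothetical small instance (numbers are mine, for illustration only)
A = np.array([[0.3, -0.4], [0.1, 0.2]])
B = np.array([[1.0, 0.1], [0.0, 1.0]])   # B^T B invertible
C = np.array([[1.0, 0.0], [0.2, 1.0]])   # C C^T invertible
gamma = 10.0

tri_mask  = np.array([[1.0, 0.0], [1.0, 1.0]])   # lower block-triangular X
diag_mask = np.array([[1.0, 0.0], [0.0, 1.0]])   # block-diagonal X

X_tri  = maximize_masked(A, B, C, gamma, tri_mask)
X_diag = maximize_masked(A, B, C, gamma, diag_mask)
print(J(X_tri, A, B, C, gamma), J(X_diag, A, B, C, gamma))
```

This only finds the constrained maximizer numerically, of course; the rational-versus-irrational distinction between the two masks can only be seen symbolically, e.g. by running the stationarity equations through Mathematica's `Solve` or a Gröbner-basis elimination as the question describes.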