For given $A=\begin{bmatrix}a_1&&&\\&a_2&1&\\&&a_2&\\&&&a_2\end{bmatrix}$ and $B=\begin{bmatrix}b_{11}&b_{12}\\b_{21}&b_{22}\\b_{31}&b_{32}\\0&b_{42}\end{bmatrix}$, where the $a_i$ and $b_{ij}$ are real with $a_i>1$ and $a_1\neq a_2$, consider
\begin{array}{ll} \underset{X \in \mathbb{R}^{4\times 4}}{\text{minimize}} & \mathrm{tr} \left( B^T X B \right)\\ \text{subject to} & X=A^TX(I+BB^TX)^{-1}A.\end{array}
Here $X$ is the unique stabilizing solution of the DARE (discrete-time algebraic Riccati equation) appearing as the constraint. For specific $A$ and $B$, I can use MATLAB to solve the DARE and then substitute the resulting $X$ to compute $\mathrm{tr}(B^TXB)$. Is it possible to obtain a general answer in terms of $A$ and $B$?
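For concreteness, the constraint is the standard DARE with state weight $Q=0$ and input weight $R=I$, so any DARE solver applies. A Python sketch (using `scipy.linalg.solve_discrete_are` in place of MATLAB's `dare`; the particular $B$ below is an illustrative choice with $b_{41}=0$):

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# A with a1 = 5, a2 = 2 (Jordan block in rows/cols 2-3), illustrative B with b41 = 0
A = np.array([[5., 0, 0, 0],
              [0., 2, 1, 0],
              [0., 0, 2, 0],
              [0., 0, 0, 2]])
B = np.array([[ 0.3, -0.7],
              [ 0.5,  0.2],
              [-0.4,  0.9],
              [ 0.0,  0.6]])

# The constraint X = A^T X (I + B B^T X)^{-1} A is the DARE with Q = 0, R = I
X = solve_discrete_are(A, B, np.zeros((4, 4)), np.eye(2))

# Residual of the fixed-point equation, and the objective value
residual = X - A.T @ X @ np.linalg.solve(np.eye(4) + B @ B.T @ X, A)
print(np.linalg.norm(residual))
print(np.trace(B.T @ X @ B))
```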
I have asked this question for general $A$ and $B$ here, but am now trying to solve it in this simpler case.
Currently I have the following bounds:
$$a_1^2a_2^4+a_2^2-2\geq \mathrm{tr}(B^TXB) \geq 2a_1a_2^3-2.$$
MATLAB code to calculate $\mathrm{tr}(B^TXB)$ for fixed $A$ and various $B$'s:
% a1 = 5, a2 = 2; the 1 in the (2,3) entry is the Jordan block of a2
A = [5 0 0 0; 0 2 1 0; 0 0 2 0; 0 0 0 2];
Q = zeros(4,4);          % state weight is zero; dare uses R = I by default
t = zeros(1,100);        % preallocate
for i = 1:100
    B = 2*rand(4,2) - 1; % entries uniform on [-1,1]
    B(4,1) = 0;          % enforce b_41 = 0 as in the problem statement
    [X,K,L] = dare(A,B,Q);
    t(i) = trace(B'*X*B);
end
$ \def\LR#1{\left(#1\right)} \def\BR#1{\Big(#1\Big)} \def\vc#1{\operatorname{vec}\LR{#1}} \def\Mat#1{\operatorname{Mat}\LR{#1}} \def\trace#1{\operatorname{tr}\LR{#1}} \def\qiq{\quad\implies\quad} \def\p{\partial} \def\grad#1#2{\frac{\p #1}{\p #2}} $Since $X$ is symmetric and positive definite, it has an inverse. For typing convenience, define the matrix variables $$\eqalign{ V &= A^{-1} \\ Y &= X^{-1} \,= Y^T \\ C &= BB^T = C^T \\ Q &= VC \\ M &= {A\otimes I-I\otimes V} \\ P &= \LR{I-M^+M} \,= P^T \\ }$$ and use them to transform the constraint $$\eqalign{ X &= A^TX(I+CX)^{-1}A \\ YV^TXV &= (I+CX)^{-1} \\ AYA^TX &= I + CX \\ A^TX &= XV + XQX \\ YA^T &= VY + Q \\ Q &= YA^T - VY \\ }$$ into a Sylvester equation for $Y=X^{-1}$ which can be vectorized and solved as a linear equation $$\eqalign{ \vc{Q} &= M\,\vc{Y} \\ \vc{Y} &= M^+\vc{Q} + P\,\vc{U} \\ }$$ where $M^+$ denotes the pseudoinverse of $M,\,$ and $U$ is an unconstrained matrix variable.
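This reduction can be sanity-checked numerically. A Python sketch (column-stacking $\operatorname{vec}$, hence the `order='F'` flattening; the particular $B$ is an illustrative choice):

```python
import numpy as np

A = np.array([[5., 0, 0, 0],
              [0., 2, 1, 0],
              [0., 0, 2, 0],
              [0., 0, 0, 2]])
B = np.array([[ 0.3, -0.7],
              [ 0.5,  0.2],
              [-0.4,  0.9],
              [ 0.0,  0.6]])

V = np.linalg.inv(A)
C = B @ B.T
Q = V @ C

I4 = np.eye(4)
# vec(Y A^T - V Y) = (A kron I - I kron V) vec(Y), with column-stacking vec
M = np.kron(A, I4) - np.kron(I4, V)
y = np.linalg.pinv(M) @ Q.flatten(order='F')   # for this A, M is invertible
Y = y.reshape(4, 4, order='F')

# X = Y^{-1} should satisfy the original Riccati constraint
X = np.linalg.inv(Y)
residual = X - A.T @ X @ np.linalg.solve(I4 + C @ X, A)
print(np.linalg.norm(residual))
```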
I'm getting tired of writing $\vc{}$ so let's define the vectors $$\eqalign{ x &= \vc{X},\quad &c = \vc{C},\quad &q = \vc{Q},\quad &u = \vc{U} \\ X &= \Mat{x},\quad &C = \Mat{c},\quad &Q = \Mat{q},\quad &U = \Mat{u} \\ y &= M^+q + Pu,\quad &dy = P\,du \\ }$$ Write the differential of $X$ in vector form $$\eqalign{ dX &= -X\,dY\,X \\ dx &= -\LR{X\otimes X}\,dy \;=\; -\LR{X\otimes X}P\,du \\ }$$
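A quick finite-difference check of the identity $dX=-X\,dY\,X$ (a Python sketch with an arbitrary positive-definite $Y$ and a small symmetric perturbation):

```python
import numpy as np

rng = np.random.default_rng(0)

# arbitrary symmetric positive-definite Y and a small symmetric perturbation dY
S = rng.standard_normal((4, 4))
Y = S @ S.T + 4 * np.eye(4)
T = rng.standard_normal((4, 4))
dY = 1e-6 * (T + T.T)

X = np.linalg.inv(Y)
exact = np.linalg.inv(Y + dY) - X   # actual change in X = Y^{-1}
firstorder = -X @ dY @ X            # first-order prediction dX = -X dY X

# relative discrepancy should be O(|dY|)
print(np.linalg.norm(exact - firstorder) / np.linalg.norm(exact))
```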
Now revisit the objective function and calculate its gradient with respect to $u$. $$\eqalign{ \phi &= \trace{CX} \\ &= c^Tx \\ d\phi &= c^Tdx \\ &= -c^T {\LR{X\otimes X}P\,du} \\ &= -du^T\;{P\LR{X\otimes X}c} \\ \grad{\phi}{u} &= -P\LR{X\otimes X}c \;\doteq\; G(u) \\ }$$ Solve for the value of $u$ which makes $G(u)=0$ using your favorite gradient descent algorithm.
Keep in mind that $$X \,=\, Y^{-1} \,=\, \BR{\Mat{M^+q+Pu}}^{-1}$$ so ultimately, everything is a function of the $u$ vector.
The gradient descent step will look like $$u_{k+1} = u_k - \lambda\,G(u_k)$$ where $k$ is the iteration counter and $\lambda$ is the step-length which will be dictated by the algorithm that you choose.
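A minimal sketch of the whole recipe (Python; the step length $\lambda$, iteration count, and the particular $B$ are illustrative choices). Note that for this $A$ the matrix $M$ is invertible (its eigenvalues are $a_i-1/a_j>0$), so $P\approx 0$, $G$ vanishes, and the very first iterate $Y=\Mat{M^+q}$ already gives the answer; the loop is kept to show the general pattern:

```python
import numpy as np

A = np.array([[5., 0, 0, 0],
              [0., 2, 1, 0],
              [0., 0, 2, 0],
              [0., 0, 0, 2]])
B = np.array([[ 0.3, -0.7],
              [ 0.5,  0.2],
              [-0.4,  0.9],
              [ 0.0,  0.6]])

n = 4
I = np.eye(n)
V = np.linalg.inv(A)
C = B @ B.T
q = (V @ C).flatten(order='F')   # q = vec(Q), column-stacking vec
c = C.flatten(order='F')         # c = vec(C)

M = np.kron(A, I) - np.kron(I, V)
Mp = np.linalg.pinv(M)
P = np.eye(n * n) - Mp @ M       # projector onto the null space of M

def X_of(u):
    Y = (Mp @ q + P @ u).reshape(n, n, order='F')
    return np.linalg.inv(Y)

def G(u):
    # gradient of phi = tr(CX) with respect to u
    X = X_of(u)
    return -P @ np.kron(X, X) @ c

u = np.zeros(n * n)
lam = 1e-3                       # fixed step length, for illustration
for _ in range(100):
    u = u - lam * G(u)

X = X_of(u)
print(np.trace(B.T @ X @ B))
```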