Closed-form solution to maximization of rational function with linear and bilinear form


I have the function: $$f(\mathbf{x}) = \frac{\mathbf{x}^\mathrm{H}\mathbf{w} + \mathbf{w}^\mathrm{H}\mathbf{x}}{c + \mathbf{x}^\mathrm{H}\mathbf{A}\mathbf{x}},$$ where $\mathbf{x},\mathbf{w}\in\mathbb{C}^{N\times 1}$, $\mathbf{A}$ is a Hermitian matrix of corresponding size, and $c$ is a real constant.

I would like to solve the problem $$\mathbf{x}^*=\arg\max_{\mathbf{x}} f(\mathbf{x}).$$

Is there a closed-form solution to this problem?
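For concreteness, here is a minimal numeric sketch of evaluating $f$; the example $\mathbf{A}$, $\mathbf{w}$, $c$ below are hypothetical, chosen only for illustration:

```python
# Evaluate f(x) = (x^H w + w^H x) / (c + x^H A x) for small dense inputs,
# using plain Python complex numbers. The example data below is hypothetical.

def f(x, w, A, c):
    num = sum(xi.conjugate() * wi for xi, wi in zip(x, w))
    num = (num + num.conjugate()).real  # x^H w + w^H x is always real
    Ax = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]
    quad = sum(xi.conjugate() * yi for xi, yi in zip(x, Ax)).real  # real for Hermitian A
    return num / (c + quad)

A = [[2.0 + 0j, 1j], [-1j, 3.0 + 0j]]  # Hermitian (in fact positive definite)
w = [1.0 + 1j, 2.0 - 1j]
c = 4.0
print(f([1.0 + 0j, 0.5 + 0j], w, A, c))  # ~0.5926
```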


Best answer:

The general idea of this solution is to keep the denominator constant while maximizing the numerator.

First, note that a factorization $A=BB^H$ exists precisely when the Hermitian matrix $A$ is positive semidefinite (take, e.g., the Hermitian square root $B=A^{1/2}$); we assume this here, since for an indefinite $A$ the denominator can vanish and the supremum is $\infty$. Substituting into $f({\bf x})$ gives $$ f({\bf x})=\frac{{\bf x}^H{\bf w}+{\bf w}^H{\bf x}}{c+{\bf x}^HBB^H{\bf x}}. $$ Either $A$ (and hence $B$) is invertible or it is not. If $A$ and $B$ are invertible, defining ${\bf y}=B^H{\bf x}$ and ${\bf v}=(B^H)^{-1}{\bf w}$ gives $$ f({\bf x})=\frac{{\bf y}^H{\bf v}+{\bf v}^H{\bf y}}{c+{\bf y}^H{\bf y}}. $$ Since any vector can be written as a nonnegative real scalar times a unit vector, an equivalent form of the maximization problem is $$ \max_{\bf y}\frac{{\bf y}^H{\bf v}+{\bf v}^H{\bf y}}{c+{\bf y}^H{\bf y}} {= \max_{k\ge 0}\max_{{\bf u},\|{\bf u}\|_2=1}\frac{k{\bf u}^H{\bf v}+k{\bf v}^H{\bf u}}{c+(k{\bf u})^H(k{\bf u})} \\= \max_{k\ge 0}\max_{{\bf u},\|{\bf u}\|_2=1}k\,\frac{{\bf u}^H{\bf v}+{\bf v}^H{\bf u}}{c+k^2{\bf u}^H{\bf u}} \\= \max_{k\ge 0}\max_{{\bf u},\|{\bf u}\|_2=1}\frac{k}{c+k^2}({\bf u}^H{\bf v}+{\bf v}^H{\bf u}) \\= \frac{1}{2\sqrt c}\cdot \max_{{\bf u},\|{\bf u}\|_2=1}({\bf u}^H{\bf v}+{\bf v}^H{\bf u}), } $$ where we have assumed $c>0$ (for $c\le 0$ the supremum is trivially $\infty$) and used $\max_{k\ge 0}\frac{k}{c+k^2}=\frac{1}{2\sqrt c}$, attained at $k=\sqrt c$. By the Cauchy–Schwarz inequality, ${\bf u}^H{\bf v}+{\bf v}^H{\bf u}=2\operatorname{Re}({\bf v}^H{\bf u})\le 2\|{\bf v}\|_2$, with equality at ${\bf u}^*={\bf v}/\|{\bf v}\|_2$. Therefore, the maximum value is $$ \begin{cases} \infty, & c\le 0\\ \dfrac{\|{\bf v}\|_2}{\sqrt c}, & c>0, \end{cases} $$ attained for $c>0$ at ${\bf y}^*=\sqrt c\,{\bf v}/\|{\bf v}\|_2$, i.e. ${\bf x}^*=(B^H)^{-1}{\bf y}^*$. In case $A$ is not invertible, the factorization can be written $A=B\hat I_{k}B^H$, where $k$ is the rank of $A$ and $\hat I_k$ is diagonal with its first $k$ diagonal entries equal to $1$ and the rest $0$. Defining ${\bf y}=B_k^H{\bf x}$, where $B_k^H$ consists of the first $k$ rows of $B^H$, the same method applies with slight changes.
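The two inner maximizations can be checked numerically; the following pure-Python sketch uses arbitrary example values of $c$ and ${\bf v}$ (not from the question):

```python
import math
import random

# Check 1: for c > 0, max over k >= 0 of k/(c + k^2) is 1/(2*sqrt(c)), at k = sqrt(c).
c = 4.0
best = max(k * 0.001 / (c + (k * 0.001) ** 2) for k in range(1, 20000))
print(best, 1 / (2 * math.sqrt(c)))  # both ~0.25

# Check 2: max over unit u of u^H v + v^H u = 2*Re(v^H u) is 2*||v||_2,
# attained at u = v/||v||_2 (Cauchy-Schwarz).
v = [1.0 + 2.0j, -0.5 + 1.0j, 3.0 - 1.0j]
norm_v = math.sqrt(sum(abs(vi) ** 2 for vi in v))
u_star = [vi / norm_v for vi in v]
val = 2 * sum(ui.conjugate() * vi for ui, vi in zip(u_star, v)).real
print(val, 2 * norm_v)  # equal

# Random unit vectors never exceed the claimed maximum.
random.seed(0)
for _ in range(200):
    u = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in v]
    nu = math.sqrt(sum(abs(ui) ** 2 for ui in u))
    u = [ui / nu for ui in u]
    assert 2 * sum(ui.conjugate() * vi for ui, vi in zip(u, v)).real <= val + 1e-9
```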

Another answer:

$ \def\a{\alpha} \def\b{\beta} \def\s{\sigma} \def\A{A^{-1}} \def\bv#1{\b^{-#1}} \def\fv#1{f^{-#1}} \def\LR#1{\left(#1\right)} \def\op#1{\operatorname{#1}} \def\trace#1{\op{Tr}\LR{#1}} \def\frob#1{\left\| #1 \right\|_F} \def\qiq{\quad\implies\quad} \def\p{\partial} \def\grad#1#2{\frac{\p #1}{\p #2}} $The colon product $(:)$ is quite useful in Matrix Calculus $$\eqalign{ A:B &= \sum_{i=1}^m\sum_{j=1}^n A_{ij}B_{ij} \;=\; \trace{A^TB} \\ C^*:C &= \frob{C}^2 \qquad \{ {\rm Frobenius\;norm} \}\\ A:B &= B:A \;=\; B^T:A^T \\ \LR{AB}:C &= A:\LR{CB^T} \;=\; B:\LR{A^TC} \\ }$$ Consider the numerator and denominator separately $$\eqalign{ \a &= w^*:x + w:x^* &\qiq d\a = w:dx^* \\ \b &= c + {Ax}:x^* &\qiq d\b = Ax:dx^* \\ }$$ where differentiation has been carried out with respect to $x^*$ only (i.e. in the Wirtinger sense) while $x$ is treated as an independent variable and held constant.
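The colon-product identities above are easy to verify numerically; here is a small pure-Python check with arbitrary real matrices:

```python
# Verify A:B = Tr(A^T B) and (AB):C = A:(C B^T) = B:(A^T C)
# on small real matrices, using plain Python lists.

def colon(A, B):
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
C = [[2.0, 0.0], [1.0, 3.0]]

AtB = matmul(transpose(A), B)
print(colon(A, B), AtB[0][0] + AtB[1][1])  # 70.0 70.0

lhs = colon(matmul(A, B), C)
print(lhs, colon(A, matmul(C, transpose(B))), colon(B, matmul(transpose(A), C)))  # 231.0 x3
```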

Use these results to differentiate the whole function $$\eqalign{ f &= \a\bv1 \\ df &= d\a\:\bv1 - \a\bv2\:d\b \\ &= \bv1\LR{d\a - f\,d\b} \\ &= \bv1\LR{w - fAx}:dx^* \\ \grad{f}{x^*} &= \bv1\LR{w - fAx} \\ }$$ Set the gradient to zero and solve for the optimal vector $$ x = \fv1\A w $$ This gives us the direction of the vector.
Substitute this back into the function to find its magnitude $$\eqalign{ \s &\equiv \,{w^H\A w} \;=\; w^*:\A w \\ \a &= \LR{w^*:x + w:x^*} = 2\fv1\s \\ \b &= c + \fv2\s \\ f &= \frac{2\fv1\s}{c + \fv2\s} \\ f^2 &= \frac{2\s}{c + \fv2\s} \\ \s + cf^2 &= 2\s \\ f^2 &= \frac{\s}{c} \\ }$$ Since $f$ must be $\sf Real$, the right-hand side $\s/c$ must be nonnegative, i.e. $\s$ and $c$ must have the same sign (this holds, e.g., when $A$ is positive definite and $c>0$). Taking the positive square root yields $$\eqalign{ f &= \sqrt{\frac{\s}{c}} \\ }$$ Substituting this into the previous result fully specifies the optimal vector $$ x = \sqrt{\frac{c}{\s}}\;\A w $$
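This closed form can be sanity-checked numerically. The sketch below uses a hypothetical $2\times 2$ positive definite $A$ (not from the question) and confirms both the attained value $\sqrt{\sigma/c}$ and that random perturbations of the candidate do not improve it:

```python
# Numeric check (sketch, hypothetical example data): for Hermitian positive
# definite A, the candidate x = sqrt(c/sigma) * A^{-1} w with
# sigma = w^H A^{-1} w should attain f(x) = sqrt(sigma/c), and random
# perturbations should not exceed that value.
import math
import random

A = [[2.0 + 0j, 1j], [-1j, 3.0 + 0j]]  # Hermitian positive definite
w = [1.0 + 1j, 2.0 - 1j]
c = 4.0

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
Ainv = [[A[1][1] / det, -A[0][1] / det],
        [-A[1][0] / det, A[0][0] / det]]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]

def f(x):
    num = 2 * sum(xi.conjugate() * wi for xi, wi in zip(x, w)).real
    quad = sum(xi.conjugate() * yi for xi, yi in zip(x, matvec(A, x))).real
    return num / (c + quad)

Ainv_w = matvec(Ainv, w)
sigma = sum(wi.conjugate() * yi for wi, yi in zip(w, Ainv_w)).real
x_opt = [math.sqrt(c / sigma) * yi for yi in Ainv_w]

print(f(x_opt), math.sqrt(sigma / c))  # should agree

random.seed(0)
for _ in range(1000):
    x = [xi + 0.1 * complex(random.gauss(0, 1), random.gauss(0, 1))
         for xi in x_opt]
    assert f(x) <= f(x_opt) + 1e-9
```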