Maximizing the product of projections of a vector onto other vectors


I want to get the $N\times1$ complex vector $\mathbf{x}$ which maximizes this real valued function

$f=\mathbf{x}^{H}\left (\mathbf{a}_{1} \mathbf{a}_{1}^{H}\mathbf{x}\mathbf{x}^{H}\mathbf{a}_{2} \mathbf{a}_{2}^{H} \right )\mathbf{x}$, subject to $||\mathbf{x}||=1$

where $\mathbf{a}_{1}$ and $\mathbf{a}_{2}$ are $N\times1$ known complex vectors, and $(\cdot)^{H}$ denotes the conjugate (Hermitian) transpose.

Can anyone help with a closed-form solution or an iterative algorithm?


You may write this as $$f=\left|\mathbf{a}_1^{H}\mathbf{x}\right|^2\left|\mathbf{a}_2^{H}\mathbf{x}\right|^2=\left|(\mathbf{a}_1^{H}\mathbf{x})^{\ast}(\mathbf{a}_2^{H}\mathbf{x})\right|^2.$$ Points where the modulus of the parenthesized quantity on the right is largest are exactly the maxima of the squared version. We are thus looking at the extrema of

$$(\mathbf{a}_1^{H}\mathbf{x})^{\ast}(\mathbf{a}_2^{H}\mathbf{x})=\mathbf{x}^{H}A\,\mathbf{x},$$ where $A=\mathbf{a}_1\mathbf{a}_2^{H}$, i.e. $A_{ij}=a_{1i}a_{2j}^{\ast}$, is a generalized (rank-one) projection matrix.
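A quick numerical sanity check of this factorization may help. The sketch below (my own illustration, not part of the answer) builds $A$ with $A_{ij}=a_{1i}a_{2j}^{\ast}$ and verifies that $f=|\mathbf{x}^{H}A\,\mathbf{x}|^2$ matches the original objective:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4
a1 = rng.standard_normal(N) + 1j * rng.standard_normal(N)
a2 = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x /= np.linalg.norm(x)  # enforce the unit-norm constraint

A = np.outer(a1, a2.conj())  # A_ij = a1_i * conj(a2_j), rank one

# Original objective: f = |a1^H x|^2 * |a2^H x|^2
f_direct = np.abs(np.vdot(a1, x))**2 * np.abs(np.vdot(a2, x))**2
# Factorized form: |x^H A x|^2
f_factored = np.abs(np.vdot(x, A @ x))**2

assert np.isclose(f_direct, f_factored)
```

Since $\mathbf{x}^{H}A\,\mathbf{x}=(\mathbf{a}_1^{H}\mathbf{x})^{\ast}(\mathbf{a}_2^{H}\mathbf{x})$, its squared modulus reproduces $f$ for any $\mathbf{x}$.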

Finding a stationary point of $$\mathbf{x}^{H}A\,\mathbf{x}-\lambda\,\mathbf{x}^{H}\mathbf{x},$$ where $\lambda$ is a Lagrange multiplier enforcing the unit length of $\mathbf{x}$, and differentiating with respect to $\mathbf{x}^{H}$ (treated as independent of $\mathbf{x}$, in the Wirtinger sense), you arrive at the eigenvalue problem $$A\mathbf{x}=\lambda\mathbf{x}.$$

Because you are maximizing the product of projections onto two axes, the solution will naturally be a superposition of these two axes (any component out of their span reduces both projections):

$$\mathbf{x}=X_1\mathbf{a}_1+X_2 \mathbf{a}_2$$

This reduces the problem to a $2\times 2$ eigenvalue problem.
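A minimal numerical sketch of that reduction (my own illustration, with assumed names): it restricts the search to $\mathrm{span}\{\mathbf{a}_1,\mathbf{a}_2\}$ via an orthonormal basis, scans the two remaining real parameters on a grid, and compares the result against $\left(\tfrac{\|\mathbf{a}_1\|\,\|\mathbf{a}_2\|+|\mathbf{a}_1^{H}\mathbf{a}_2|}{2}\right)^2$, the squared numerical radius of the rank-one matrix $\mathbf{a}_1\mathbf{a}_2^{H}$, which I am assuming as the closed-form maximum (it is not stated in the answer above):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6
a1 = rng.standard_normal(N) + 1j * rng.standard_normal(N)
a2 = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Orthonormal basis Q (N x 2) for span{a1, a2}; any unit x in the
# span is x = Q @ c with c a unit vector in C^2.
Q, _ = np.linalg.qr(np.stack([a1, a2], axis=1))

# Up to an irrelevant global phase, a unit c in C^2 can be written
# c = [cos t, sin t * e^{i phi}], so only two real parameters remain.
t = np.linspace(0.0, np.pi / 2, 601)
phi = np.linspace(0.0, 2 * np.pi, 601)
T, P = np.meshgrid(t, phi)
C = np.stack([np.cos(T), np.sin(T) * np.exp(1j * P)], axis=-1)
X = C @ Q.T                      # grid of unit vectors in the span

# f = |a1^H x|^2 * |a2^H x|^2 evaluated over the whole grid
f_vals = np.abs(X @ a1.conj())**2 * np.abs(X @ a2.conj())**2
best_in_span = f_vals.max()

# Assumed closed form (squared numerical radius of a1 a2^H):
closed_form = ((np.linalg.norm(a1) * np.linalg.norm(a2)
                + np.abs(np.vdot(a1, a2))) / 2) ** 2

print(best_in_span, closed_form)  # grid maximum approaches the closed form
```

The grid maximum converges to the closed-form value as the grid is refined, consistent with the claim that the maximizer lies in the span of $\mathbf{a}_1$ and $\mathbf{a}_2$.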