Find orthogonal projection matrix $P$ of rank $r$ which maximizes $\mathbb E_x\left[\frac{\|Px\|}{\|x\|}\right]$


Let $x$ be a random vector on the unit sphere in $\mathbb R^p$ with mean $\mu$ and covariance matrix $\Sigma$. Let $r \in \{1,2,\ldots,p\}$.

Question. Find an orthogonal projection matrix $P \in \mathbb R^{p \times p}$ of rank $r$ which maximizes $\mathbb E_x\left[\|Px\|\right]$.

Notes. I have a strong feeling that the singular-value decomposition of $\Sigma$ has a role to play here...


Edit: solution for modified objective: $E_x[\|Px\|^2]$

As kindly pointed out by J.G. in the comments, modifying the objective to $E_x[\|Px\|^2]$ leads to an easier problem.

Indeed,

$$ \arg\max_P E_x\|Px\|^2 = \arg\max_P E_x[x^TPx] = \arg\max_P \operatorname{trace}\left(P\,E_x[xx^T]\right) = \arg\max_P \operatorname{trace}\left(P(\Sigma + \mu\mu^T)\right), \tag{1} $$

where we used $\|Px\|^2 = x^TP^TPx = x^TPx$ (since $P$ is an orthogonal projection) and $E_x[xx^T] = \Sigma + \mu\mu^T$.

which is a problem with linear objective over the set of rank-$r$ orthogonal projection matrices. Most importantly, the solution depends only on the second-moment matrix $\Sigma + \mu\mu^T$ of the random variable $x$ (on $\Sigma$ alone when $\mu = 0$).

As for an analytic solution to problem (1), one recognizes it as an instance of principal component analysis (PCA): it is solved by taking $P = V_rV_r^T$, where $V_r \in \mathbb R^{p \times r}$ is the matrix whose columns are the leading $r$ eigenvectors (normalized to unit norm) of $\Sigma + \mu\mu^T$ (equivalently, of $\Sigma$ when $\mu = 0$).
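A quick numerical sanity check of this PCA solution (a sketch using numpy; the matrix `M` and all variable names are mine, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
p, r = 5, 2

# Build a random symmetric PSD second-moment matrix M.
A = rng.standard_normal((p, p))
M = A @ A.T

# Leading-r eigenvectors of M (eigh returns eigenvalues in ascending order).
eigvals, eigvecs = np.linalg.eigh(M)
Vr = eigvecs[:, -r:]          # columns: top-r eigenvectors
P = Vr @ Vr.T                 # rank-r orthogonal projection

# trace(P M) attains the sum of the top-r eigenvalues of M,
# the maximum over all rank-r orthogonal projections (Ky Fan).
assert np.isclose(np.trace(P @ M), eigvals[-r:].sum())

# A random rank-r projection does no better.
Q, _ = np.linalg.qr(rng.standard_normal((p, r)))
P_rand = Q @ Q.T
assert np.trace(P_rand @ M) <= np.trace(P @ M) + 1e-9
```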

Answer.

Take, for example, $x$ to be a vector of two iid Gaussian variables with mean $0$ and variance $1$. If you take $r=1$, then every rank-one projection gives the same expectation: by the rotational invariance of the standard Gaussian, an orthogonal change of basis carries any such projection to a fixed one without changing the distribution of $x$.

Now let $x$ be a vector of two independent random variables, where the first is a zero-mean, unit-variance Gaussian and the second is a Rademacher variable (taking values in $\{1,-1\}$ with probability $1/2$ each), which also has mean zero and unit variance. Although the covariance matrix is $I_2$ in both cases, the projections onto the two coordinates now give different expectations: $E|x_1| = \sqrt{2/\pi} \approx 0.798$, while $E|x_2| = 1$.
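This difference is easy to confirm numerically; a minimal Monte Carlo sketch (numpy, variable names mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# x = (g, s): g standard Gaussian, s Rademacher (+1 or -1, each w.p. 1/2).
# Both coordinates have mean 0 and variance 1, so the covariance is I_2,
# yet the two coordinate projections give different values of E||Px||.
g = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)

mean_abs_g = np.abs(g).mean()   # estimates E|x_1| = sqrt(2/pi) ~ 0.798
mean_abs_s = np.abs(s).mean()   # exactly 1 for every sample

print(mean_abs_g, mean_abs_s)
```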