I have the following (real) matrix which I need to be positive-semidefinite,
$P = \begin{bmatrix} P_1 & -\frac{1}{2}(P_1+P_2)\\-\frac{1}{2}(P_1+P_2) & P_2\end{bmatrix} \succeq 0$,
where $P_1, P_2 \in \mathbb{R}^{n\times n}$ and $P_1, P_2 \succeq 0$.
I suspect this holds only when $P_1 = P_2$, but I could not find a way to prove it (I only managed the case $n=1$, via the eigenvalues). I was therefore wondering whether this is indeed the case and, if so, how to prove it.
(I asked this question before but forgot to mention that $P_1$ and $P_2$ are matrices rather than scalars.)
Note first that $P_1$ and $P_2$ are symmetric, as real positive semidefiniteness is conventionally defined for symmetric matrices. Let $\mathbf x=\pmatrix{u+v\\ v}$. Then $\mathbf x^TP\mathbf x=u^TP_1u+u^T(P_1-P_2)v$.
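To verify this identity, expand the quadratic form block by block and use the symmetry of $P_1$ and $P_2$ (so that, e.g., $v^TP_1u = u^TP_1v$):

$$\begin{aligned}
\mathbf x^TP\mathbf x
&= (u+v)^TP_1(u+v) - (u+v)^T\tfrac{1}{2}(P_1+P_2)v - v^T\tfrac{1}{2}(P_1+P_2)(u+v) + v^TP_2v \\
&= u^TP_1u + 2u^TP_1v + v^TP_1v - u^T(P_1+P_2)v - v^T(P_1+P_2)v + v^TP_2v \\
&= u^TP_1u + u^T(P_1-P_2)v.
\end{aligned}$$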
If $P$ is positive semidefinite, then $u^T(P_1-P_2)v$ must vanish for all $u$ and $v$: the expression above is linear in $v$, so replacing $v$ with $tv$ gives $u^TP_1u + t\,u^T(P_1-P_2)v$, which becomes negative for $t$ large enough of the appropriate sign whenever $u^T(P_1-P_2)v\ne0$. Since $u^T(P_1-P_2)v=0$ for all $u,v$ forces $P_1-P_2=0$, we conclude $P_1=P_2$.
Conversely, if $P_1=P_2$, then every $\mathbf x$ can be written in the above form (take $v$ to be the lower block and $u$ the difference of the blocks), and $\mathbf x^TP\mathbf x=u^TP_1u\ge0$, hence $P$ is positive semidefinite.
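As a quick numerical sanity check, here is a NumPy sketch (the helper names `block_P` and `rand_psd` are introduced here for illustration, not part of the question): with $P_1=P_2$ the smallest eigenvalue of $P$ is nonnegative up to rounding, while for random $P_1\ne P_2$ the matrix $P$ is generically indefinite.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

def block_P(P1, P2):
    # Assemble P = [[P1, -(P1+P2)/2], [-(P1+P2)/2, P2]]
    M = -0.5 * (P1 + P2)
    return np.block([[P1, M], [M, P2]])

def rand_psd(n):
    # A @ A.T is symmetric positive semidefinite
    A = rng.standard_normal((n, n))
    return A @ A.T

# Case P1 == P2: P should be PSD (smallest eigenvalue >= 0 up to rounding)
P1 = rand_psd(n)
min_eig_equal = np.linalg.eigvalsh(block_P(P1, P1)).min()

# Case P1 != P2 (almost surely, for random draws): P should fail to be PSD
P2 = rand_psd(n)
min_eig_diff = np.linalg.eigvalsh(block_P(P1, P2)).min()

print(min_eig_equal, min_eig_diff)
```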