Let $(X_1,…,X_n)$ be a random vector with $0<\prod_{j=1}^n\text{Var}(X_j)<∞$.
Let $\mathbf{Q}=(\mathbf{q}_{1},…,\mathbf{q}_{n})=(ρ_{jk}^2)_{n×n}$, where $ρ_{jk}$ is the Pearson correlation coefficient between $X_j$ and $X_k$. How can one prove or disprove
$$\mathbf{1}^\top\mathbf{Q}^+\mathbf{Q}=\mathbf{1}^\top$$
where $\mathbf{Q}^+$ is the Moore-Penrose inverse of $\mathbf{Q}$ and $\mathbf{1}^\top$ is a row vector of ones?
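Before attempting a proof, a quick numerical sanity check (a sketch using NumPy; the duplicated variable is only there to force $\mathbf{Q}$ to be singular, so that $\mathbf{Q}^+ \neq \mathbf{Q}^{-1}$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Empirical data: 50 observations of 4 variables, plus a copy of the first
# variable so that Q has two equal rows and is genuinely singular.
A = rng.standard_normal((50, 4))
A = np.column_stack([A, A[:, 0]])
A -= A.mean(axis=0)                       # center the columns

S = A.T @ A                               # (unnormalized) covariance matrix
d = np.sqrt(np.diag(S))
R = S / np.outer(d, d)                    # Pearson correlation matrix
Q = R ** 2                                # entrywise squared correlations

ones = np.ones(Q.shape[0])
lhs = ones @ np.linalg.pinv(Q) @ Q
print(np.allclose(lhs, ones))             # True
```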
It seems that whenever row or column vectors of $\mathbf{Q}$ are linearly dependent, they must be equal. If this is true, let $\mathbf{H}_{n×n}=(\mathbf{h}_1,…,\mathbf{h}_r,\mathbf{0},…,\mathbf{0})^\top$ be the reduced row echelon form of $\mathbf{Q}$, where $r=\text{rank}(\mathbf{Q})$; each $\mathbf{h}_i$ is then a 0–1 indicator vector of one class of equal rows, so $\mathbf{h}_{j}^\top\mathbf{h}_{k}=0$ for all $j\neq k$.
Then let $j_1,…,j_r$ be the column indices of the leading ones in the nonzero rows of $\mathbf{H}$, and let
$$\mathbf{F}_{n×r}=(\mathbf{q}_{j_1},...,\mathbf{q}_{j_r}),\; \mathbf{G}_{r×n}=(\mathbf{h}_1,...,\mathbf{h}_r)^\top$$
By the rank factorization obtained from the reduced row echelon form, we have $\mathbf{Q}=\mathbf{FG}$.
By the construction of the Moore–Penrose inverse from a rank decomposition, we have
$$\mathbf{Q}^+=\mathbf{G}^\top(\mathbf{GG^\top})^{-1}(\mathbf{F^\top F})^{-1}\mathbf{F}^\top$$
Thus, \begin{equation} \begin{split} & \mathbf{1}^\top\mathbf{Q}^+\mathbf{Q} = \mathbf{1}^\top\mathbf{G}^\top(\mathbf{GG^\top})^{-1}\mathbf{G} \\ & = \mathbf{1}^\top \begin{bmatrix} \mathbf{h}_1 & \cdots & \mathbf{h}_r \end{bmatrix} \begin{bmatrix} \mathbf{h}_1^\top\mathbf{h}_1 & \cdots & \mathbf{h}_1^\top\mathbf{h}_r \\ \vdots & \ddots & \vdots \\ \mathbf{h}_r^\top\mathbf{h}_1 & \cdots & \mathbf{h}_r^\top\mathbf{h}_r \\ \end{bmatrix}^{-1} \begin{bmatrix} \mathbf{h}_1^\top \\ \vdots \\ \mathbf{h}_r^\top \\ \end{bmatrix} \\ & = \mathbf{1}^\top\sum_{i=1}^r (\mathbf{h}_i^\top\mathbf{h}_i)^{-1} \mathbf{h}_i \mathbf{h}_i^\top = \mathbf{1}^\top \end{split} \end{equation}
where $\mathbf{h}_i^\top\mathbf{h}_i$ is the number of ones in $\mathbf{h}_i$, and $\mathbf{h}_i\mathbf{h}_i^\top$ is an $n×n$ 0–1 matrix which, after a common permutation of rows and columns, is block diagonal with a single all-ones block. Since every column of $\mathbf{H}$ is then a standard basis vector, each column of $\sum_{i=1}^r (\mathbf{h}_i^\top\mathbf{h}_i)^{-1}\mathbf{h}_i\mathbf{h}_i^\top$ sums to $1$, which gives the last equality.
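The factorization formula for $\mathbf{Q}^+$ can be checked numerically. Below is a sketch (assuming NumPy) that uses a truncated-SVD rank factorization $\mathbf{Q}=\mathbf{FG}$ rather than the RREF, since RREF is numerically fragile; the formula holds for any full-rank factorization:

```python
import numpy as np

rng = np.random.default_rng(1)

# Any rank factorization Q = F G (F: n x r of full column rank, G: r x n of
# full row rank) gives Q^+ = G^T (G G^T)^{-1} (F^T F)^{-1} F^T.
B = rng.standard_normal((3, 6))
Q = B.T @ B                     # PSD 6x6 matrix of rank 3 (stand-in for Q)

U, s, Vt = np.linalg.svd(Q)
r = int(np.sum(s > 1e-10))      # numerical rank
F = U[:, :r] * s[:r]            # n x r, columns scaled by singular values
G = Vt[:r]                      # r x n, so that Q = F @ G

Qp = G.T @ np.linalg.inv(G @ G.T) @ np.linalg.inv(F.T @ F) @ F.T
print(np.allclose(Qp, np.linalg.pinv(Q)))   # True
```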
Therefore, the question may reduce to proving or disproving that whenever rows of $\mathbf{Q}$ are linearly dependent, they must be equal.
The Schur product theorem tells us that $\mathbf{Q}$ is positive semi-definite, being the Hadamard square of a correlation matrix. Then, by a standard property of the Moore–Penrose pseudoinverse, $\mathbf{Q}^{+}\mathbf{Q} = \mathbf{Q}\mathbf{Q}^{+}$ is the orthogonal projection onto the range of $\mathbf{Q}$, yielding the following equivalence:
\begin{align*} \mathbf{1}^{\top}\mathbf{Q}^{+}\mathbf{Q} = \mathbf{1}^{\top} &\quad\iff\quad \mathbf{1} \in \operatorname{im}\mathbf{Q} = (\ker \mathbf{Q})^{\perp} \\ &\quad\iff\quad (\ker\mathbf{Q}) \perp \mathbf{1} \end{align*}
We will establish the last relation using a probabilistic argument.
Proof. Replacing each $X_i$ by its standardization if necessary, we may assume $\mathsf{Var}(X_i) = 1$ for each $i$. Let $\mathbf{\Sigma}_X$ be the covariance matrix of $X = (X_1, \ldots, X_n)^{\top}$. Also, define the random vectors $\tilde{X}$ and $Y$ by
$$ \tilde{X} \sim \mathcal{N}(\mathbf{0}, \mathbf{\Sigma}_X) \qquad\text{and}\qquad Y = \tilde{X}^{\circ 2} = (\tilde{X}_1^2, \ldots, \tilde{X}_n^2)^{\top}, $$
where ${}^{\circ 2}$ denotes the entrywise (Hadamard) square. Then, as in the proof of the Schur product theorem, we know that the covariance matrix $\mathbf{\Sigma}_Y$ of $Y$ is given by $ \mathbf{\Sigma}_Y = 2\mathbf{\Sigma}_X^{\circ 2} = 2\mathbf{Q} $. So, it suffices to show
$$ (\ker \mathbf{\Sigma}_Y) \perp \mathbf{1}. $$
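The identity $\mathbf{\Sigma}_Y = 2\mathbf{\Sigma}_X^{\circ 2}$ used above can be confirmed by simulation (a Monte Carlo sketch assuming NumPy; the tolerance is loose because of sampling error):

```python
import numpy as np

rng = np.random.default_rng(2)

# For Gaussian X ~ N(0, Sigma), Isserlis' theorem gives
# Cov(X_i^2, X_j^2) = 2 * Sigma_{ij}^2, i.e. Sigma_Y = 2 * Sigma_X**2
# entrywise.
A = rng.standard_normal((3, 3))
S = A @ A.T
d = np.sqrt(np.diag(S))
S = S / np.outer(d, d)                      # unit-variance Sigma_X

X = rng.multivariate_normal(np.zeros(3), S, size=1_000_000)
Y = X ** 2                                  # entrywise squares
Sigma_Y = np.cov(Y, rowvar=False)
print(np.max(np.abs(Sigma_Y - 2 * S**2)) < 0.05)   # True
```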
To this end, assume $\mathbf{v} \in \ker \mathbf{\Sigma}_Y$. Then
\begin{align*} \mathbf{v} \in \ker \mathbf{\Sigma}_Y &\quad\implies\quad 0 = \mathbf{v}^{\top}\mathbf{\Sigma}_Y\mathbf{v} = \mathsf{Var}(\mathbf{v}^{\top}Y) \\ &\quad\iff\quad \mathbf{v}^{\top}Y = \mathsf{E}[\mathbf{v}^{\top}Y] = \mathbf{v}^{\top}\mathbf{1} \quad \text{a.s.} \\ &\quad\iff\quad \mathbf{v}^{\top}(\mathbf{x}^{\circ 2}) = \mathbf{v}^{\top}\mathbf{1} \quad \text{for any } \mathbf{x} \in \operatorname{im}\mathbf{\Sigma}_X, \end{align*}
where the last step holds because the support of $\tilde{X}$ is all of $\operatorname{im}\mathbf{\Sigma}_X$ and $\mathbf{x} \mapsto \mathbf{v}^{\top}(\mathbf{x}^{\circ 2})$ is continuous.
In particular, plugging $\mathbf{x} = \mathbf{0}$ shows that $\mathbf{v}^{\top}\mathbf{1} = 0$ and hence $\mathbf{v} \perp \mathbf{1}$. $\square$
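As a numerical illustration of $(\ker\mathbf{Q}) \perp \mathbf{1}$, here is a NumPy sketch; taking $X_4 = -X_1$ makes $\rho_{14} = -1$, so rows $1$ and $4$ of $\mathbf{Q}$ coincide even though the correlations differ in sign:

```python
import numpy as np

rng = np.random.default_rng(3)

# Degenerate example: X_4 = -X_1, hence rho_{14} = -1 and rows 1, 4 of Q
# are equal, so Q is singular.  The proof predicts ker(Q) is orthogonal to 1.
A = rng.standard_normal((100, 3))
A = np.column_stack([A, -A[:, 0]])
A -= A.mean(axis=0)
S = A.T @ A
d = np.sqrt(np.diag(S))
Q = (S / np.outer(d, d)) ** 2             # squared-correlation matrix

U, s, Vt = np.linalg.svd(Q)
kernel = Vt[s < 1e-8]                     # orthonormal basis of ker(Q)
print(kernel.shape[0] > 0,                # Q really is singular
      np.allclose(kernel @ np.ones(4), 0))  # every kernel vector is ⟂ to 1
```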
Addendum. This argument actually proves a more general statement: