Let $\|\cdot \|_F$ denote the Frobenius norm and $\preceq$ the Loewner order. Suppose $K, Q, R$ are $n \times n$ real symmetric matrices. I am interested in the minimizer $\arg \min_K \|K - Q\|_F^2$ subject to $0 \preceq K \preceq R$.
The motivation for this question is the projection of $Q$ onto $\mathcal{D} = \{K : 0 \preceq K \preceq R\}$. Since $\mathcal{D}$ is a closed convex set, the projection exists and is unique.
This problem becomes easy when $Q$ and $R$ share an eigenbasis. Suppose $Q = U \Lambda U^T$ and $R = U \Sigma U^T$, where $U$ is orthogonal, $\Lambda = \text{Diag}(\lambda_1, \dots, \lambda_n)$, and $\Sigma = \text{Diag}(\sigma_1, \dots, \sigma_n)$ with $\sigma_i \ge 0$. Then $K^* = U \text{Diag}(\phi(\lambda_1), \dots, \phi(\lambda_n)) U^T$, where $\phi(\lambda_i) = \min\{\max\{\lambda_i, 0\}, \sigma_i\}$ clamps each eigenvalue into $[0, \sigma_i]$.
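To make the commuting case concrete, here is a small numerical sketch (my own illustration, not part of the question; all names are ad hoc) that builds $Q$ and $R$ with a shared eigenbasis and applies the clamp $\phi$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random orthogonal eigenbasis U shared by Q and R.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
lam = np.array([-1.0, 0.5, 2.0, 3.0])   # eigenvalues of Q (any signs)
sig = np.array([1.0, 1.0, 1.0, 2.5])    # eigenvalues of R (nonnegative)
Q = U @ np.diag(lam) @ U.T
R = U @ np.diag(sig) @ U.T

# phi clamps each eigenvalue of Q into [0, sigma_i].
phi = np.clip(lam, 0.0, sig)
K_star = U @ np.diag(phi) @ U.T
```

Feasibility ($0 \preceq K^* \preceq R$) can then be checked by confirming that the eigenvalues of $K^*$ and of $R - K^*$ are nonnegative.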
However, is there a natural decomposition that yields an analytical solution when $Q$ and $R$ do not share an eigenbasis?
Any help will be appreciated.
Here's an attempt to address at least the case where $Q, R \succ 0$.
Let $J = R^{-1/2}KR^{-1/2}$. We can express your problem as minimizing $\|R^{1/2}JR^{1/2} - Q\|_F$ subject to $$ 0 \preceq R^{1/2}J R^{1/2} \preceq R \iff 0 \preceq J \preceq I. $$ Dropping the additive constant $\operatorname{Tr}(Q^2)$ (which does not affect the minimizer), we can rewrite the objective function as $$ \|R^{1/2}JR^{1/2} - Q\|_F^2 - \operatorname{Tr}(Q^2) = \\ \operatorname{Tr}([R^{1/2}JR^{1/2}]^2) - 2 \operatorname{Tr}[R^{1/2}JR^{1/2}Q] = \\ \operatorname{Tr}(R^{1/2}JRJR^{1/2}) - 2 \operatorname{Tr}[R^{1/2}JR^{1/2}Q] =\\ \operatorname{Tr}(JRJR) - 2 \operatorname{Tr}[JR^{1/2}QR^{1/2}] = \\ \operatorname{Tr}((JR)^2) - 2 \operatorname{Tr}[J\underbrace{[R^{1/2}QR^{1/2}]}_P]. $$ That is, our objective function is $f(J) = \operatorname{Tr}((JR)^2) - 2\operatorname{Tr}(JP)$, where $P = R^{1/2}QR^{1/2}$.
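As a sanity check on this chain of identities (a verification sketch of my own; `rand_spd` and `sqrtm_sym` are ad hoc helpers, not library calls), one can confirm numerically that $f(J)$ equals the squared distance minus the constant $\operatorname{Tr}(Q^2)$:

```python
import numpy as np

rng = np.random.default_rng(1)

def rand_spd(n):
    # Ad hoc helper: a comfortably positive definite random matrix.
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

def sqrtm_sym(A):
    # Symmetric PSD square root via eigendecomposition.
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.sqrt(w)) @ V.T

n = 5
Q, R = rand_spd(n), rand_spd(n)
Rh = sqrtm_sym(R)
P = Rh @ Q @ Rh

J = rand_spd(n)  # the identity holds for any symmetric J

f = np.trace(J @ R @ J @ R) - 2 * np.trace(J @ P)
dist2 = np.linalg.norm(Rh @ J @ Rh - Q, 'fro') ** 2 - np.trace(Q @ Q)
# f and dist2 agree up to floating-point error.
```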
From there, we can try calculus. In numerator layout, we have $$ \frac{\partial f}{\partial J} = 2(RJR - P). $$ If the constrained minimizer lies in the interior of the feasible region, it must be a stationary point, and it (unsurprisingly) satisfies $$ RJR - P = 0 \implies RJR = P \implies R^{1/2}JR^{1/2} = R^{-1/2}PR^{-1/2} \implies K = Q. $$ In other words, if $Q$ itself is feasible then the projection is $K = Q$; otherwise, the minimizer must lie on the boundary of the region.
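To illustrate the interior case numerically (again a sketch of my own; helper names are made up), one can verify that $J^* = R^{-1/2} Q R^{-1/2}$ solves $RJR = P$ and maps back to $K = Q$:

```python
import numpy as np

rng = np.random.default_rng(2)

def rand_spd(n):
    # Ad hoc helper: a comfortably positive definite random matrix.
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

def sym_power(A, p):
    # Matrix power of a symmetric positive definite matrix via eigh.
    w, V = np.linalg.eigh(A)
    return V @ np.diag(w ** p) @ V.T

n = 4
Q, R = rand_spd(n), rand_spd(n)
Rh, Rih = sym_power(R, 0.5), sym_power(R, -0.5)
P = Rh @ Q @ Rh

# Unconstrained stationary point of f: solves R J R = P.
J_star = Rih @ Q @ Rih
K = Rh @ J_star @ Rh   # maps back to K = Q
```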
A symmetric matrix $M$ is on the boundary of the set $\{M : M \preceq I\}$ iff its largest eigenvalue is exactly $1$. (The lower boundary, where the smallest eigenvalue of $M$ is $0$, can be handled analogously; I focus on the upper constraint here.) The "directions" normal to this boundary at $M$ are spanned by matrices of the form $xx^T$, where $x$ is an eigenvector of $M$ associated with the eigenvalue $1$. The method of Lagrange multipliers tells us that $\frac{\partial f}{\partial J}$ must lie in the span of these normal directions at a boundary minimizer.
This tells us that the solution $J$ must satisfy the following condition: if $x_1,\dots,x_m$ form a basis of the eigenspace of $J$ associated with the eigenvalue $1$, then it must hold that $$ P - RJR = \sum_{j=1}^m \alpha_j \,x_jx_j^\top $$ for some coefficients $\alpha_j \ge 0$ (nonnegativity comes from the KKT conditions for the inequality constraint $J \preceq I$). Equivalently, if $V$ is a matrix whose columns span the eigenspaces of $J$ not associated with $1$, then $$ (P - RJR)V = 0 \iff V^\top(P - RJR)V = 0, $$ where the equivalence uses that $P - RJR$ is positive semidefinite by the display above. I have a hunch that we can find our minimizer by taking $V$ to be the eigenvectors of $P$ associated with eigenvalues that are smaller than $1$.
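Not an analytical answer, but a numerical baseline for testing hunches like this one (my addition, not derived in the answer above: this is Dykstra's alternating-projection algorithm, which converges to the Frobenius projection onto the intersection of the two constraint sets, so candidate closed forms can be compared against it on random instances):

```python
import numpy as np

def proj_psd(M):
    """Frobenius projection onto the PSD cone: clip eigenvalues at 0."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return V @ np.diag(np.maximum(w, 0.0)) @ V.T

def proj_below_R(M, R):
    """Frobenius projection onto {K : K <= R}: reflect through R."""
    return R - proj_psd(R - M)

def dykstra(Q, R, iters=1000):
    # Dykstra's alternating projections onto {K >= 0} and {K <= R};
    # converges to the Frobenius projection of Q onto the intersection.
    X, p1, p2 = Q.copy(), np.zeros_like(Q), np.zeros_like(Q)
    for _ in range(iters):
        Y = proj_psd(X + p1)
        p1 = X + p1 - Y
        X = proj_below_R(Y + p2, R)
        p2 = Y + p2 - X
    return X

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
Q = (A + A.T) / 2          # arbitrary symmetric Q (generally infeasible)
B = rng.standard_normal((4, 4))
R = B @ B.T                # positive semidefinite upper bound

K = dykstra(Q, R)
```

On commuting inputs this reproduces the eigenvalue-clamping formula from the question, which gives a convenient correctness check.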