Consider $f:\mathbb{R}^n \rightarrow \mathbb{R}$ defined as
$$ f(x) := x^\top x + c^\top x $$
for some $c \in \mathbb{R}^n$. Define the (compact) "box"
$$X := \left\{ x \in \mathbb{R}^n \mid x_i \in [ x_i^{\min}, x_i^{\max} ] \ \forall i \in \{1, \dots , n\} \right\},$$
for some $\{ x_i^{\min}, x_i^{\max} \}_{i=1}^{n}$. Since the unconstrained minimizer is $x^\star := \arg\min_{x \in \mathbb{R}^n} f(x) = -c/2$, I am wondering whether the constrained minimizer reads as $$ \arg\min_{x \in X} f(x) = \text{Proj}_{X}\left( -c/2 \right), $$ where $\text{Proj}_X$ denotes the (Euclidean) projection onto $X$.
Yes, it is. Note that your problem decouples completely: you're really solving $n$ separate scalar optimization problems $$\begin{array}{ll}\text{minimize} & x_i^2+c_ix_i \\ \text{subject to} & x^{\min}_i \leq x_i \leq x^{\max}_i\end{array}$$ Note also that $x_i^2+c_ix_i=(x_i+c_i/2)^2-c_i^2/4$. Dropping the constant and then taking the square root of the square changes neither, so you're effectively solving $$\begin{array}{ll}\text{minimize} & |x_i+c_i/2| \\ \text{subject to} & x^{\min}_i \leq x_i \leq x^{\max}_i\end{array}$$ whose solution is the point of $[x_i^{\min}, x_i^{\max}]$ closest to $-c_i/2$, i.e. the scalar projection. It should be even clearer now that your projection approach is correct.
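If you want a quick numerical sanity check, here is a small sketch (NumPy, with made-up bounds and a random $c$). It uses the fact that for a box, the Euclidean projection is just componentwise clipping, and then verifies that no random feasible point beats the projected point:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
c = rng.normal(size=n)
lo = rng.uniform(-2.0, -1.0, size=n)  # x_i^min (arbitrary example bounds)
hi = rng.uniform(1.0, 2.0, size=n)    # x_i^max

def f(x):
    return x @ x + c @ x

# Proj_X(-c/2): projection onto a box is componentwise clipping.
x_star = np.clip(-c / 2.0, lo, hi)

# Sanity check: f(x_star) <= f(x) for a large sample of feasible points.
samples = rng.uniform(lo, hi, size=(100_000, n))
vals = (samples ** 2).sum(axis=1) + samples @ c
assert f(x_star) <= vals.min()
```

Of course this only spot-checks the claim; the decoupling argument above is the actual proof.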