Is there a closed-form solution to the following convex problem?
$$ \min_{x\in\Delta}\quad x^\top c+\|x\|_2 $$
where $\|\cdot\|_2$ is the $L_2$ norm and $\Delta=\{x \mid x\ge0,\ x^\top 1=1\}$ is the probability simplex.
If not, is there one if we further know that $c\ge0$ or $c\le0$?
In general, there will not be a closed-form solution (even under the assumptions you suggest), because this problem is an SOCP of the form: \begin{align*} \min \quad c^\top x+t\\ \text{s.t.} \quad e^\top x=1,\\ \Vert x \Vert_2 \leq t,\\ x \geq 0, \end{align*} where $e$ is the vector of all $1$s. The constraint $x \geq 0$ is what prevents us from obtaining a closed-form solution (you can verify this by staring at the KKT conditions).
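That said, the SOCP is easy to solve numerically. A quick sketch with SciPy's generic SLSQP solver (the setup and the choice $c=(0,1)$ are mine, for illustration; any SOCP solver would do):

```python
import numpy as np
from scipy.optimize import minimize

# Solve min_{x in simplex} c @ x + ||x||_2 with a generic NLP solver.
# No closed form needed -- the problem is numerically easy.
c = np.array([0.0, 1.0])
n = len(c)

res = minimize(
    lambda x: c @ x + np.linalg.norm(x),
    x0=np.full(n, 1.0 / n),                # start at the simplex centre
    method="SLSQP",
    bounds=[(0.0, None)] * n,              # x >= 0
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}],  # e^T x = 1
)
print(res.x, res.fun)   # for this c, the optimum is x = (1, 0) with value 1
```

For $c=(0,1)$ the objective along the simplex is $x_2+\sqrt{(1-x_2)^2+x_2^2}$, minimized at $x_2=0$ with value $1$, which the solver recovers.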
Edit: as pointed out in the comments, we can take the dual of this problem, which is given by: \begin{align*} \max_{\lambda \in \mathbb{R}, \mu \in \mathbb{R}_{+}^n} \quad & \lambda\\ \text{s.t.} \quad & \Vert e\lambda+\mu-c\Vert_2 \leq 1. \end{align*} For a given $\lambda$, a best choice of $\mu_i$ is $\mu_i^*=0$ if $\lambda-c_i \geq 0$ and $\mu_i^*=c_i-\lambda$ otherwise. This reduces the dual to a problem in the single parameter $\lambda$, and we can then recover $x^*$ via the KKT condition $\frac{1}{\Vert x^*\Vert_2}x^*=\mu^*+e\lambda^*-c$ (although you will need to do a line search on $\lambda$).
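Concretely (my own reduction, worth double-checking): substituting the optimal $\mu_i^*$ turns the $i$-th component of $e\lambda+\mu-c$ into $\max(\lambda-c_i,0)$, and the dual constraint is active at the optimum, so $\lambda^*$ solves the monotone scalar equation $\sum_i \max(\lambda-c_i,0)^2=1$. The "line search" is therefore just a bisection. A sketch (the function name is mine):

```python
import numpy as np

def min_norm_over_simplex(c, iters=100):
    """Solve min_{x in simplex} c @ x + ||x||_2 via the scalar dual.

    Substituting the optimal mu into the dual constraint, lambda* is the
    root of sum_i max(lambda - c_i, 0)^2 = 1, which is monotone in lambda,
    so the "line search" reduces to a bisection.
    """
    c = np.asarray(c, dtype=float)
    phi = lambda lam: np.sum(np.maximum(lam - c, 0.0) ** 2) - 1.0
    lo, hi = c.min(), c.min() + 1.0   # phi(lo) = -1 < 0 <= phi(hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if phi(mid) < 0 else (lo, mid)
    lam = 0.5 * (lo + hi)
    g = np.maximum(lam - c, 0.0)      # KKT: g = x*/||x*||_2
    x = g / g.sum()                   # rescale so that e^T x = 1
    return x, lam
```

For $c=(0,1)$ this returns $x^*=(1,0)$ with $\lambda^*=1$, and by strong duality $c^\top x^*+\Vert x^*\Vert_2=\lambda^*$.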
Similarly, if we replace $\Vert x\Vert_2$ with $\Vert x\Vert_2^2$, the optimal value becomes equal to the optimal value of the following quadratic program: \begin{align*} \max_{\lambda \in \mathbb{R}, \mu \in \mathbb{R}_{+}^n} \quad \lambda -\frac{1}{4}\Vert e\lambda+\mu-c\Vert_2^2, \end{align*} and, as before, we can recover an optimal $x^*$ via the KKT condition $x^*=\frac{1}{2}(e\lambda^*+\mu^*-c)$. Unfortunately, this still isn't closed form (again because of the $x\geq 0$ constraint).
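In the squared case the one-dimensional search is again easy (this reduction is mine, worth double-checking): plugging $\mu_i^*=\max(c_i-\lambda,0)$ into the objective and setting the $\lambda$-derivative to zero gives $\sum_i \max(\lambda-c_i,0)=2$, after which $x_i^*=\tfrac{1}{2}\max(\lambda^*-c_i,0)$. This is the familiar Euclidean-projection-onto-the-simplex computation, since the problem equals $\min_{x\in\Delta}\Vert x+c/2\Vert_2^2$ up to a constant. A sketch:

```python
import numpy as np

def min_sqnorm_over_simplex(c, iters=100):
    """Solve min_{x in simplex} c @ x + ||x||_2^2 via the scalar dual.

    Optimality reduces to sum_i max(lambda - c_i, 0) = 2, a monotone
    piecewise-linear equation in lambda, solved here by bisection.
    """
    c = np.asarray(c, dtype=float)
    phi = lambda lam: np.sum(np.maximum(lam - c, 0.0)) - 2.0
    lo, hi = c.min(), c.min() + 2.0    # phi(lo) = -2 < 0 <= phi(hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if phi(mid) < 0 else (lo, mid)
    lam = 0.5 * (lo + hi)
    x = 0.5 * np.maximum(lam - c, 0.0)  # KKT: x* = (e lam* + mu* - c)/2
    return x, lam
```

For $c=(0,1)$: $\lambda^*=3/2$, $x^*=(3/4,1/4)$, and both the primal value $c^\top x^*+\Vert x^*\Vert_2^2$ and the dual value $\lambda^*-\tfrac14\Vert e\lambda^*+\mu^*-c\Vert_2^2$ equal $7/8$.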
You may be able to say a bit more if you make some additional assumptions on $c$.