Proximal operator of the difference of negative entropy and squared norm


Let $\lambda >0$. I am looking for an analytic expression of \begin{equation} \mathrm{prox}_{\lambda\psi} (x), \end{equation} where $\mathrm{prox}$ denotes the proximal operator, and $\psi \colon \mathbb{R}^n \longrightarrow \mathbb{R}$ is defined for every $x=(x_i)_{1\leq i \leq n}\in \mathbb{R}^n$ by \begin{equation} \psi(x) = \left\{\begin{array}{cc} \sum_{i=1}^n x_i \ln (x_i) - \frac{x_i^2}{2} & \text{if } x\in \mathcal{U}, \\+\infty & \text{otherwise}, \end{array}\right. \end{equation} where $\mathcal{U} = \{u \in \mathbb{R}^n : u_i \geq 0, \ \sum_{i=1}^n u_i = 1\}$ is the unit simplex of $\mathbb{R}^n$.

In the case $\lambda=1$, it can be shown (see for instance Example 2.23 of https://arxiv.org/pdf/1808.07526.pdf) that $\mathrm{prox}_{\psi}$ is the softmax operator: the quadratic terms $-\frac{1}{2}\|u\|^2$ and $\frac{1}{2}\|u-x\|^2$ combine so that the minimization reduces to an entropy-regularized linear problem over the simplex. In the case $\lambda\neq 1$ this cancellation no longer occurs, and I have been unable to adapt the proof.

Any help is welcome!