Assume that we have a uniformly distributed $X \sim \mathcal{U}(-\alpha,\alpha)$ and a random variable $Y$ with a yet-to-be-determined distribution defined on the same interval as $X$, i.e. $[-\alpha,\alpha]$.
For $Z = X + Y$, which distribution of $Y$ maximizes the mutual information $I(Z;Y)$ if $X$ and $Y$ are assumed to be stochastically independent?
My attempt so far (using the independence of $X$ and $Y$ and the translation invariance of differential entropy): $$ \begin{align} I(Z;Y)&=H[Z]-H[Z|Y]\\ &=H[Z]-H[X+Y|Y]\\ &=H[Z]-H[X|Y]\\ &=H[Z]-H[X]\\ &=H[Z]-\log(2\alpha) \end{align} $$
Furthermore, the range of $Z$ is given by $[-2\alpha,2\alpha]$, enabling us to upper bound $H[Z]$ by the entropy of the uniform distribution $\mathcal{U}(-2\alpha,2\alpha)$, hence $H[Z] \leq \log(4\alpha)$ must hold. This, in turn, implies $I(Z;Y) \leq \log(4\alpha) - \log(2\alpha) = \log(2)$.
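As a hedged numerical illustration (not part of the original question), note that an "obvious" choice such as $Y \sim \mathcal{U}(-\alpha,\alpha)$ does *not* attain this bound: then $Z$ is triangular on $[-2\alpha,2\alpha]$ and $I(Z;Y)=h(Z)-\log(2\alpha)=\tfrac12$ nat, strictly below $\log 2$. A quick Monte Carlo sketch (with a histogram-based entropy estimate, $\alpha=1$) confirms this:

```python
# Sketch: if Y were also uniform on [-alpha, alpha], Z = X + Y is triangular
# on [-2*alpha, 2*alpha]; then I(Z;Y) = h(Z) - log(2*alpha) = 1/2 nat,
# strictly below the log(2) upper bound derived above.
import numpy as np

rng = np.random.default_rng(0)
alpha, n = 1.0, 1_000_000

x = rng.uniform(-alpha, alpha, size=n)
y = rng.uniform(-alpha, alpha, size=n)  # suboptimal choice of Y, for contrast
z = x + y

# Histogram-based estimate of the differential entropy h(Z) in nats.
counts, edges = np.histogram(z, bins=200, range=(-2 * alpha, 2 * alpha))
p = counts / n
width = edges[1] - edges[0]
h_z = -np.sum(p[p > 0] * np.log(p[p > 0] / width))

mi = h_z - np.log(2 * alpha)  # I(Z;Y) = h(Z) - h(X), since h(Z|Y) = h(X)
print(mi)                      # ~0.5 nats, below log(2) ~ 0.693
```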
I found the solution. Take $f_Y$ to be an equal-weight mixture of Dirac masses at $\pm\alpha$, $$f_Y(y)=\tfrac{1}{2}\,\delta(y+\alpha)+\tfrac{1}{2}\,\delta(y-\alpha).$$ Convolving with the uniform density of $X$ gives $$f_Z(z)=\tfrac{1}{2}f_X(z-\alpha)+\tfrac{1}{2}f_X(z+\alpha)=\frac{1}{4\alpha}\quad\text{for } z\in[-2\alpha,2\alpha],$$ so $Z \sim \mathcal{U}(-2\alpha,2\alpha)$ and the upper bound $I(Z;Y)=\log(2)$ is attained.
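The claim can be sanity-checked by simulation. The sketch below (a hedged illustration with $\alpha=1$, not part of the original argument) draws $Y=\pm\alpha$ with probability $\tfrac12$ each, estimates $h(Z)$ from a histogram, and checks that $I(Z;Y)=h(Z)-\log(2\alpha)$ approaches $\log 2$:

```python
# Monte Carlo check: with Y placing mass 1/2 at each of +/- alpha,
# Z = X + Y should be uniform on [-2*alpha, 2*alpha], so that
# I(Z;Y) = h(Z) - h(X) = log(4*alpha) - log(2*alpha) = log(2).
import numpy as np

rng = np.random.default_rng(0)
alpha, n = 1.0, 1_000_000

x = rng.uniform(-alpha, alpha, size=n)
y = rng.choice([-alpha, alpha], size=n)  # equal-weight Dirac mixture
z = x + y

# Histogram-based estimate of the differential entropy h(Z) in nats.
counts, edges = np.histogram(z, bins=200, range=(-2 * alpha, 2 * alpha))
p = counts / n
width = edges[1] - edges[0]
h_z = -np.sum(p[p > 0] * np.log(p[p > 0] / width))

mi = h_z - np.log(2 * alpha)  # I(Z;Y) = h(Z) - h(Z|Y) = h(Z) - h(X)
print(mi, np.log(2))           # the estimate should be close to log(2)
```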