Consider a memoryless channel in which the input signal $X$ is corrupted by an additive noise $Z$, independent of $X$. The noise $Z$ is uniformly distributed over the interval $(-\frac{a}{2}, \frac{a}{2})$, where $a$ is a positive real number, and the input $X$ is constrained to the interval $(-\frac{1}{2}, \frac{1}{2})$. The output of the channel is given by
$$Y = X + Z.$$
Our goal is to find the channel capacity, i.e. the maximum of the mutual information $I(X; Y)$ over the input distribution $p(x)$. Since the noise entropy $h(Z)$ is fixed, this reduces to maximizing the output entropy $h(Y)$, so our problem is equivalent to
$$\max_{p(x)} h(Y) \quad \text{s.t.} \quad X \in [-\frac{1}{2}, \frac{1}{2}], \; Z \in [-\frac{a}{2}, \frac{a}{2}], Y = Z+X.$$
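The reduction to maximizing $h(Y)$ can be made explicit: since $Z$ is independent of $X$,
$$I(X;Y) = h(Y) - h(Y \mid X) = h(Y) - h(X + Z \mid X) = h(Y) - h(Z) = h(Y) - \log a,$$
and $\log a$ does not depend on $p(x)$. Moreover, since $Y$ is supported on an interval of length $1 + a$, we always have $h(Y) \le \log(1+a)$, so the capacity is at most $\log\frac{1+a}{a}$, with equality iff $Y$ can be made uniform.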
It's obvious that $Y \in [-\frac{1}{2}-\frac{a}{2}, \frac{1}{2}+\frac{a}{2}]$. In some special cases, such as $a = 1/m$ for an integer $m$, it is easy to make $Y$ uniform: choose $X$ to be a discrete distribution whose noise windows tile this interval with no overlap, e.g. $X$ uniform on the $m+1$ points $-\frac{1}{2}, -\frac{1}{2}+a, \dots, \frac{1}{2}$. But for a general $a$, I have no idea how to do it.
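A quick numerical sanity check of this tiling construction (a minimal sketch; the variable names, the choice $m = 4$, and the histogram entropy estimate are mine, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4                     # so a = 1/m; m is an illustrative choice
a = 1.0 / m
n = 1_000_000

# X uniform over the m + 1 atoms -1/2, -1/2 + a, ..., 1/2 (spacing a),
# so the noise windows of width a tile [-1/2 - a/2, 1/2 + a/2] with no overlap
atoms = -0.5 + a * np.arange(m + 1)
x = rng.choice(atoms, size=n)
z = rng.uniform(-a / 2, a / 2, size=n)
y = x + z

# Y should be uniform on [-1/2 - a/2, 1/2 + a/2], an interval of width 1 + a
lo, hi = -0.5 - a / 2, 0.5 + a / 2
counts, edges = np.histogram(y, bins=50, range=(lo, hi))
p = counts / n
width = edges[1] - edges[0]

# histogram estimate of the differential entropy h(Y); the target is log(1 + a)
h_est = -np.sum(p[p > 0] * np.log(p[p > 0])) + np.log(width)
print(h_est, np.log(1 + a))
```

The estimate lands on $\log(1+a) = \log\frac{m+1}{m}$, i.e. the capacity $\log\frac{1+a}{a} = \log(m+1)$ is achieved in this case.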
There are some simple ideas:

- At least we have the lower bound $h(Y) \ge h(Y \mid X) = h(Z) = \log a$.
- If $X$ is a continuous r.v., the pdf of $Y$ is the convolution of the pdfs of $X$ and $Z$; imagine $Z$ as a window of width $a$ sliding across the density of $X$. For $a < 1$, we can follow the idea of the trivial case ($a = 1/m$), and since I want to maximize the entropy, I hope $Y$ can be made uniform; the problem is that some residual part may be left over.
- For $a > 1$, the optimal output distribution seems to look like a trapezoid with curved sides.
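The trapezoidal shape can be seen numerically by convolving the two box densities, e.g. for a uniform input (an illustrative sketch; taking $X$ uniform on $(-\frac{1}{2},\frac{1}{2})$, the value $a = 2$, and the grid step are my choices, and uniform $X$ is not claimed to be optimal):

```python
import numpy as np

a = 2.0        # an illustrative value with a > 1
dt = 1e-3      # grid step for the numerical convolution

# box densities of X ~ Uniform(-1/2, 1/2) and Z ~ Uniform(-a/2, a/2)
fx = np.ones(int(round(1.0 / dt)))
fz = np.full(int(round(a / dt)), 1.0 / a)

# the density of Y = X + Z is the convolution of the two box densities:
# a trapezoid supported on [-(1 + a)/2, (1 + a)/2]
fy = np.convolve(fx, fz) * dt

# differential entropy h(Y) for this uniform input, versus the uniform bound
h_y = -np.sum(fy[fy > 0] * np.log(fy[fy > 0])) * dt
bound = np.log(1 + a)   # entropy of the uniform density on the same support
print(h_y, bound)       # h_y sits strictly below the bound
```

This shows the gap to the uniform bound $\log(1+a)$ for a uniform input; the question is how much of that gap a better input distribution can close.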