Let's say you want a prior distribution $p$ for a random variable $X \in \mathbb{R}$ that is known to have mean $\mu$ and standard deviation $\sigma$. This distribution should maximize the differential entropy
$$h(X) = - \int p_X(x) \log_2 p_X(x) \mathrm{d}x$$
My thoughts
The normal distribution has the probability density function
$$f(x \; | \; \mu, \sigma^2) = \frac{1}{\sqrt{2\sigma^2\pi} } \; e^{ -\frac{(x-\mu)^2}{2\sigma^2} }$$
This means the entropy is
$$ \begin{align} c &:= \frac{1}{\sqrt{2\sigma^2\pi} }\\ h(X) &= - c \int e^{ -\frac{(x-\mu)^2}{2\sigma^2} } \left (\log_2(c) - \log_2(e) \cdot \frac{(x-\mu)^2}{2\sigma^2} \right ) \mathrm{d} x\\ &= - \log_2(c) \int c \, e^{ -\frac{(x-\mu)^2}{2\sigma^2} } \mathrm{d} x + c \, \frac{\log_2(e)}{2\sigma^2} \int e^{ -\frac{(x-\mu)^2}{2\sigma^2} } \cdot (x-\mu)^2 \mathrm{d} x\\ &= - \log_2(c) + c \, \frac{\log_2(e)}{2\sigma^2} \cdot \frac{\sigma^2}{c}\\ &= \frac{1}{2}\log_2(2\pi\sigma^2) + \frac{1}{2}\log_2(e) = \frac{1}{2}\log_2\left(2\pi e \sigma^2\right), \end{align} $$
using that the density integrates to $1$ and that $\int e^{ -\frac{(x-\mu)^2}{2\sigma^2} } (x-\mu)^2 \, \mathrm{d} x = \sigma^2 / c$ (the second central moment of the normal distribution is $\sigma^2$),
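Not a proof, but as a quick sanity check one can compare the closed-form value $\tfrac{1}{2}\log_2(2\pi e \sigma^2)$ against a direct numerical integration of $-\int f \log_2 f \, \mathrm{d}x$ (the values of $\mu$ and $\sigma$ below are arbitrary examples):

```python
import numpy as np

# Arbitrary example parameters, not taken from the question.
mu, sigma = 1.5, 2.0

# Fine grid covering essentially all of the probability mass (+-12 sigma).
x = np.linspace(mu - 12 * sigma, mu + 12 * sigma, 200_001)
dx = x[1] - x[0]

# Normal density f(x | mu, sigma^2).
f = np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# Differential entropy in bits via a Riemann sum of -f * log2(f).
h_numeric = np.sum(-f * np.log2(f)) * dx

# Closed form: 0.5 * log2(2 * pi * e * sigma^2).
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed)  # the two values agree closely
```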
but I don't think this computation helps to prove that the normal distribution maximizes the entropy among all distributions with mean $\mu$ and standard deviation $\sigma$.
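For what it's worth, here is a sketch of one standard way to finish the argument, using the nonnegativity of the Kullback–Leibler divergence instead of calculus of variations. Let $p$ be any density with mean $\mu$ and standard deviation $\sigma$, and let $f$ be the normal density above. Then
$$ 0 \le D(p \,\|\, f) = \int p(x) \log_2 \frac{p(x)}{f(x)} \, \mathrm{d}x = -h_p(X) - \int p(x) \log_2 f(x) \, \mathrm{d}x. $$
Since $\log_2 f(x) = \log_2(c) - \log_2(e) \cdot \frac{(x-\mu)^2}{2\sigma^2}$, the cross term $-\int p(x) \log_2 f(x) \, \mathrm{d}x$ depends on $p$ only through its mean and variance, so it takes the same value for $p$ as for $f$ itself, where it equals $h_f(X)$. Hence $h_p(X) \le h_f(X)$, with equality if and only if $p = f$ almost everywhere.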