While working on a bipartite entanglement entropy problem (the context isn't particularly relevant), I stumbled on an approximation I found surprising. I had arrived at the following messy result:
$f(x) = -\cos^2(x) \log(\cos^2(x)) -\sin^2(x) \log(\sin^2(x))$
where $\log$ is the natural log, so the entropy is measured in nats. I noticed that this function is approximated surprisingly well by:
$h(x) = \log(2) | \sin(2x) |^{2^\gamma}$
where $\gamma \approx 0.5772156649...$ is the Euler–Mascheroni (oily macaroni) constant.
I estimated the exponent with a very quick brute-force optimization in scipy, minimizing mean squared error, so it may not be exact. Overall, this seemingly unrelated function does a surprisingly good job (desmos visual here) of approximating the original, more complicated one. The approximation also seems much better near the peaks at $\log(2)$ than near the troughs at $0$. Any insight into why?
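The brute-force fit can be sketched roughly as follows (the grid, bounds, and use of `minimize_scalar` are my own choices, not necessarily what was originally run):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# f(x): the entropy curve, in nats
def f(x):
    c2, s2 = np.cos(x) ** 2, np.sin(x) ** 2
    return -c2 * np.log(c2) - s2 * np.log(s2)

# candidate h(x; a) = log(2) * |sin(2x)|^a
def h(x, a):
    return np.log(2) * np.abs(np.sin(2 * x)) ** a

# mean squared error on a grid over one period, nudged off the
# endpoints to avoid 0 * log(0)
xs = np.linspace(1e-6, np.pi / 2 - 1e-6, 10_000)
res = minimize_scalar(lambda a: np.mean((f(xs) - h(xs, a)) ** 2),
                      bounds=(1.0, 2.0), method="bounded")
print(res.x)  # close to 1.494, i.e. roughly 2**0.5772...
```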

Since the maximum of the function is at $x=\frac \pi 4$, perform a series expansion about that point to obtain $$f(x)=\log(2)-2 \left(x-\frac{\pi }{4}\right)^2+\frac{4}{3} \left(x-\frac{\pi }{4}\right)^4+O\left(\left(x-\frac{\pi }{4}\right)^8\right)$$ (the coefficient of $\left(x-\frac{\pi }{4}\right)^6$ happens to vanish, which is why the error term is $O\left(\left(x-\frac{\pi }{4}\right)^8\right)$).
Do the same with $$h(x)=\log(2) \sin^a(2x)$$
$$h(x)=\log (2)-2a\log(2) \left(x-\frac{\pi }{4}\right)^2 +\frac{2a (3 a-2)}{3} \log(2) \left(x-\frac{\pi }{4}\right)^4 +O\left(\left(x-\frac{\pi }{4}\right)^6\right)$$
Matching the quadratic coefficients gives $a=\frac 1 {\log(2)}\approx 1.4427$.
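Both expansions, including the vanishing $u^6$ coefficient of $f$, can be verified symbolically; a quick sympy sketch, writing $u = x - \frac \pi 4$ (and using $\sin(2x)=\cos(2u)$):

```python
import sympy as sp

u, a = sp.symbols('u a', real=True)

# f expanded around x = pi/4, with u = x - pi/4
f_u = (-sp.cos(sp.pi/4 + u)**2 * sp.log(sp.cos(sp.pi/4 + u)**2)
       - sp.sin(sp.pi/4 + u)**2 * sp.log(sp.sin(sp.pi/4 + u)**2))
f_ser = sp.series(f_u, u, 0, 8).removeO()

# h = log(2) sin^a(2x); at x = pi/4 + u this is log(2) cos^a(2u)
h_ser = sp.series(sp.log(2) * sp.cos(2*u)**a, u, 0, 6).removeO()

print(sp.expand(f_ser))  # log(2) - 2 u^2 + (4/3) u^4, no u^6 term
print(sp.expand(h_ser))
```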
Now compare the squared $L^2$ errors (using the definition of $f(x)$)
$$\Phi_n=\int_0^{\frac \pi 2} \Big(f(x)-\log(2) \sin^{a_n}(2x)\Big)^2\,dx$$
$$a_1=\frac 1 {\log(2)} \qquad \implies \qquad \Phi_1=6.776\times 10^{-5}$$
$$a_2=2^\gamma \qquad \implies \qquad \Phi_2=1.600\times 10^{-5}$$
Now, using optimization, $$a_{\text{opt}}=1.49392\qquad \implies \qquad \Phi_{\text{opt}}=1.592\times 10^{-5}$$
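These three values of $\Phi$ can be reproduced numerically; a minimal sketch with `scipy.integrate.quad` (the endpoint nudge and optimization bounds are my own choices):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

def f(x):
    c2, s2 = np.cos(x) ** 2, np.sin(x) ** 2
    return -c2 * np.log(c2) - s2 * np.log(s2)

def Phi(a):
    # squared L2 error on (0, pi/2); endpoints nudged to avoid 0 * log(0)
    g = lambda x: (f(x) - np.log(2) * np.sin(2 * x) ** a) ** 2
    return quad(g, 1e-12, np.pi / 2 - 1e-12)[0]

gamma = 0.5772156649015329
print(Phi(1 / np.log(2)))  # ~ 6.78e-5  (Phi_1)
print(Phi(2 ** gamma))     # ~ 1.60e-5  (Phi_2)
opt = minimize_scalar(Phi, bounds=(1.3, 1.7), method="bounded")
print(opt.x, opt.fun)      # ~ 1.4939, ~ 1.59e-5
```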
So indeed, your exponent $2^\gamma$ is very nearly the best!
Just for fun (thanks to the Inverse Symbolic Calculator, ISC): $$a_{\text{opt}}\sim \cos \left(\frac{\pi }{5}\right)+J_0(1)^{\sqrt{2}}$$ where $J_0$ is the Bessel function of the first kind.
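This closed-form coincidence is easy to check numerically, e.g. with `scipy.special.j0`:

```python
import numpy as np
from scipy.special import j0  # Bessel function of the first kind, J_0

a_opt = 1.49392
isc_guess = np.cos(np.pi / 5) + j0(1.0) ** np.sqrt(2)
print(isc_guess)  # agrees with a_opt to about 5 decimal places
```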