Problem statement: I am trying to find an upper and a lower bound on $D$:
$D = \int_0^\infty \frac{1}{r^2}\int_0^{2\pi} g(r,\theta)\,\ln g(r,\theta)\; f(r,\theta)\, d\theta\, dr$
where $g(r,\theta ) = Q\left( { - (r\cos \theta + 1)} \right)Q\left( { - (r\sin \theta + 0.5)} \right)$ with $0 \le g(r,\theta ) \le 1$ (it is a product of probabilities), $Q(x) = \frac{1}{\sqrt{2\pi}}\int_x^\infty e^{-\frac{t^2}{2}}\,dt$, and $r \ge 0$.
$f(r,\theta)$ is the unknown joint probability density function of $r$ and $\theta$.
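To make the definitions concrete (and to sanity-check that $0 \le g \le 1$, so $g\ln g \le 0$), the Gaussian tail $Q$ and the product $g$ can be coded directly via the complementary error function, using the standard identity $Q(x) = \tfrac{1}{2}\operatorname{erfc}(x/\sqrt{2})$:

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = (1/sqrt(2*pi)) * int_x^inf e^{-t^2/2} dt,
    computed via the complementary error function: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def g(r, theta):
    """g(r, theta) = Q(-(r*cos(theta) + 1)) * Q(-(r*sin(theta) + 0.5))."""
    return Q(-(r * math.cos(theta) + 1.0)) * Q(-(r * math.sin(theta) + 0.5))

# Spot-check: g is a product of two probabilities, so 0 <= g <= 1,
# and therefore g*ln(g) <= 0 wherever g > 0.
for r in (0.0, 0.5, 2.0, 10.0):
    for k in range(8):
        theta = 2.0 * math.pi * k / 8.0
        val = g(r, theta)
        assert 0.0 <= val <= 1.0
```

The sign observation ($g\ln g \le 0$ pointwise) already shows that $D \le 0$ for any valid pdf $f$, which may be a useful starting point for the bounds.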
Solution: Before starting to solve the problem, let me explain my understanding.
$f(r,\theta)$ is the joint pdf. To my understanding, it is hard to find upper and lower bounds on $D$ when the joint distribution is unknown. Can we derive an inequality on $D$ for independent $r$ and $\theta$, i.e. $f(r,\theta) = f(r)f(\theta)$?
If the independence assumption in step (1) holds, we then need to integrate $g(r,\theta)\ln g(r,\theta)$ over $\theta$ from $0$ to $2\pi$. Can we say that a uniform (or some other) distribution on $\theta$ yields $\int_0^{2\pi} g(r,\theta)\ln g(r,\theta)\, f(\theta)\, d\theta = h(r)$, which would make it easier to find upper and lower bounds on $D$?
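As a numerical sketch of this step, here is $h(r)$ computed by a simple midpoint rule, under the illustrative assumption that $\theta$ is uniform, $f(\theta) = \frac{1}{2\pi}$ (this choice of $f(\theta)$ is an assumption, not given in the problem):

```python
import math

def Q(x):
    # Gaussian tail probability via the complementary error function
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def g(r, theta):
    return Q(-(r * math.cos(theta) + 1.0)) * Q(-(r * math.sin(theta) + 0.5))

def h(r, n=2000):
    """h(r) = int_0^{2pi} g*ln(g) * f(theta) dtheta with the ASSUMED
    uniform density f(theta) = 1/(2*pi), via the midpoint rule."""
    dtheta = 2.0 * math.pi / n
    total = 0.0
    for k in range(n):
        gv = g(r, (k + 0.5) * dtheta)
        if gv > 0.0:  # g*ln(g) -> 0 as g -> 0, so skip the limit case
            total += gv * math.log(gv)
    return total * dtheta / (2.0 * math.pi)
```

Since $-x\ln x \le 1/e$ on $[0,1]$, this $h(r)$ is pinned to $[-1/e,\,0]$ for every $r$, which is one concrete inequality the uniform assumption buys.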
Now $D$ can be written as $D = \int_0^\infty \frac{1}{r^2}\, h(r)\, dr$. By setting the derivative of $\frac{1}{r^2}h(r)$ to zero and solving for $r$, we can locate the extrema of the integrand and try to bound $D$ from above and below.
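The extremum search proposed above can at least be explored numerically. The sketch below scans $\varphi(r) = h(r)/r^2$ on a grid (again under the assumed uniform $f(\theta)$) and brackets stationary points via sign changes of a finite-difference derivative; note that $\varphi(r) \to -\infty$ as $r \to 0$ because $h(0) < 0$, so the grid starts away from the origin and the extremum idea alone cannot bound $D$ near $r = 0$:

```python
import math

def Q(x):
    # Gaussian tail probability via the complementary error function
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def g(r, theta):
    return Q(-(r * math.cos(theta) + 1.0)) * Q(-(r * math.sin(theta) + 0.5))

def h(r, n=1000):
    # average of g*ln(g) over theta, i.e. ASSUMED uniform f(theta) = 1/(2*pi)
    dtheta = 2.0 * math.pi / n
    total = 0.0
    for k in range(n):
        gv = g(r, (k + 0.5) * dtheta)
        if gv > 0.0:
            total += gv * math.log(gv)
    return total * dtheta / (2.0 * math.pi)

def phi(r):
    # the integrand of D under the uniform-theta assumption
    return h(r) / (r * r)

# Scan phi on a grid; a sign change in consecutive finite differences
# brackets a stationary point of the integrand.
rs = [0.1 + 0.1 * i for i in range(60)]
vals = [phi(r) for r in rs]
stationary = [0.5 * (rs[i] + rs[i + 1])
              for i in range(1, len(rs) - 1)
              if (vals[i] - vals[i - 1]) * (vals[i + 1] - vals[i]) < 0.0]
print("candidate stationary points:", stationary)
```

The divergence of $\varphi$ at $r = 0$ also suggests checking whether $\int_0^\infty \frac{1}{r^2}h(r)\,dr$ even converges unless $f(r)$ vanishes fast enough near the origin.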
I am not sure whether this is the right approach; I have never seen this or a similar type of problem, especially when the pdf $f(r,\theta)$ is unknown. Any hint, suggestion, or solving strategy would be much appreciated.