Here is the definition of mutual information:
$$I(X;Y) = \int_Y \int_X p(x,y) \log{ \left(\frac{p(x,y)}{p(x)\,p(y)} \right) } \, \mathrm d x \, \mathrm d y$$
where $X$ and $Y$ are two random variables, $p(x)$ and $p(y)$ are their PDFs, and $p(x,y)$ is the joint PDF.
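As a concrete (discrete) analogue of this integral, here is a small sketch that computes $I(X;Y)$ for a hypothetical $2\times 2$ joint distribution by summing over all cells; the specific joint table is made up for illustration:

```python
import numpy as np

# Hypothetical joint distribution p(x, y); rows index x, columns index y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

# Discrete analogue of the double integral: sum p(x,y) log(p(x,y)/(p(x)p(y)))
# over all (x, y) pairs, in nats.
mi = sum(p_xy[i, j] * np.log(p_xy[i, j] / (p_x[i] * p_y[j]))
         for i in range(2) for j in range(2))
print(mi)
```

Note that the result is a single number determined entirely by the table `p_xy`; nothing in the computation is a function of a realization of $X$ or $Y$.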
I am wondering what the derivative of $I(X;Y)$ with respect to $X$ or $Y$ is. Namely,
$$\frac{\mathrm d}{\mathrm dX} I(X;Y) = \, ?$$ $$\frac{\mathrm d}{\mathrm dY} I(X;Y) = \, ?$$
Thanks.
The notation $I(X;Y)$ does not mean that the mutual information is a function of the variable $X$. In that sense, it is just a number, so its derivative is zero. If you want an analogy, think of the expectation $E(X)$ of a random variable: it does not "depend on" $X$; it is a number, and it would not make sense to ask about $\mathrm d E(X)/\mathrm d X$.
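To make the analogy concrete (a toy example): for a Bernoulli variable,
$$X \sim \mathrm{Bernoulli}(\theta) \quad\Longrightarrow\quad E(X) = 0 \cdot (1-\theta) + 1 \cdot \theta = \theta,$$
a number determined by the parameter $\theta$, not by any particular realization of $X$. Asking for $\mathrm d E(X)/\mathrm d X$ has no meaning, while $\mathrm d E(X)/\mathrm d \theta = 1$ is perfectly well defined. The same holds for $I(X;Y)$: it is determined by the densities, not by values of $X$ or $Y$.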
In another sense, you could say that it is a function (or, more precisely, a functional) of the probability densities, but that is not what you are after.