In the definition of Fisher information, do we consider the joint probability distribution? I have referred to Thomas M. Cover's Elements of Information Theory, where the authors state:
$$FI(\theta) = E_{\theta}\left[\left(\frac{\partial}{\partial \theta} \ln f(X; \theta)\right)^2\right]$$
Does $f(X; \theta)$ refer to the joint probability distribution or the conditional probability distribution?