Problem:
Suppose $\hat k$ is an estimator of a parameter $k$; it may be the maximum-likelihood estimator or come from some other method. We know that it is unbiased.
Let $L$ be the likelihood function and $\ell = \ln L$.
Find $\operatorname{Cov}\left( \frac{d \ell}{d k},\hat k\right)$.
My attempt:
$$\operatorname{Cov}\left( \frac{d \ell}{d k},\hat k\right) = E\left( \frac{d \ell}{d k} \cdot \hat k\right) - E\left( \frac{d \ell}{d k}\right) \cdot E( \hat k)$$
Since $\hat k$ is unbiased, $E( \hat k) = k$. Also, $\frac{d \ell}{d k}$ is the score function $s(k)$, and we know that $E(s(k)) = 0$. Hence,
$$E( \frac{d \ell}{d k} \cdot \hat k) - E( \frac{d \ell}{d k}) \cdot E( \hat k) =E( \frac{d \ell}{d k} \cdot \hat k) $$
I decided to move $\hat k$ inside the derivative:
$$=E\left( \frac{d (\ell \cdot \hat k)}{d k}\right)$$
Since the expectation is linear, I figured I could pull the derivative out of it:
$$= \frac{d E(\ell \cdot \hat k)}{d k}$$
However, I am still stuck. Am I doing something wrong?
Note: a friend of mine did manage to solve this, so the problem contains enough information to be solved.
My answer:
As all the commenters noted, the likelihood function makes little sense if the estimator is obtained by, say, the method of moments. I would assume that in this case they are uncorrelated, so the covariance is zero.
Now let $\hat k$ be the MLE.
Since $\hat k$ is unbiased, replace the true parameter $k$ in the log-likelihood with $E[\hat k]$. Then, if you swap the derivative and the expectation, you get something like the derivative of $\ell$ evaluated at $\hat k$. The MLE maximizes the log-likelihood, so that derivative is zero. The remaining steps are trivial.
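For comparison, here is the standard direct computation (the same interchange of derivative and integral used in the proof of the Cramér–Rao bound; I am assuming the usual regularity conditions that justify differentiating under the integral sign). It works for any unbiased estimator, MLE or not, because $\hat k$ is a function of the data alone and the only $k$-dependence sits in the density $L$:

$$\begin{aligned}
\operatorname{Cov}\left(\frac{d\ell}{dk},\hat k\right)
&= E\left(\frac{d\ell}{dk}\,\hat k\right)
 = \int \hat k \,\frac{1}{L}\,\frac{\partial L}{\partial k}\, L \,dx
 = \int \hat k \,\frac{\partial L}{\partial k}\, dx \\
&= \frac{d}{dk}\int \hat k\, L\, dx
 = \frac{d}{dk}\,E(\hat k)
 = \frac{dk}{dk} = 1 .
\end{aligned}$$

Note that the interchange happens at the level of the integral, where the integrand's $k$-dependence is explicit. This is also why swapping $\frac{d}{dk}$ with $E(\cdot)$ directly, as in the attempt above, is delicate: the expectation itself is taken with respect to a density that depends on $k$.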