Likelihood Function is a random variable


I am taking my first statistics course and we are discussing how to find a good estimate of an unknown parameter $\theta$ given a sample $X_{1},\dots,X_{n}\sim F_{\theta}$. We talked about the likelihood function $L(\theta)$, and the lecturer claimed that the maximizer of $L(\theta)$ is a random variable, giving the example of two consecutive coin tosses, with possible outcomes $(T,T),(T,H),(H,T),(H,H)$. Assuming the probability of getting $T$ is an unknown parameter $p$, the likelihoods of these outcomes are $p^2$, $p(1-p)$, $p(1-p)$, and $(1-p)^2$ respectively, and we maximize over $p$ to find which value of $p$ makes the observed sample most likely. As a result we had

$\hat{p}=\begin{cases} 1 & \left(T,T\right)\\ \frac{1}{2} & (H,T)\,or\,(T,H)\\ 0 & \left(H,H\right) \end{cases}$

But I didn't really understand how to see this as a random variable. I mean, if I look at $P(\hat{p}=1)=p^2$, it doesn't really make sense, since $p$ is an unknown parameter. Now we are discussing which of two given estimators (two random variables) gives a better estimate (mean squared error, etc.), and we are using the fact that the estimators are random variables; but since I don't get why they are random variables, this creates a problem for my understanding. Can someone explain why this is a random variable, or whether I am misunderstanding something? Thanks


There are 2 best solutions below

BEST ANSWER

A random variable $\displaystyle X\colon \Omega \to \mathbb R$ is a measurable function from the set of possible outcomes $\displaystyle \Omega$ to the real line.

So every outcome of your random experiment (an element of the sample space $\Omega$) is mapped to a real-valued estimate of the unknown parameter $\theta$ by maximizing the likelihood function. That mapping from outcomes to real numbers is exactly what a random variable is.
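To make this concrete, here is a minimal simulation sketch (the function name `mle_from_tosses` and the choice `p_true = 0.3` are my own illustration, not from the question): repeating the two-toss experiment many times shows that $\hat{p}$ takes the values $0$, $\tfrac12$, $1$ with relative frequencies close to $(1-p)^2$, $2p(1-p)$, $p^2$.

```python
import random
from collections import Counter

def mle_from_tosses(tosses):
    """The MLE of p (probability of tails) is the sample fraction of tails."""
    return tosses.count("T") / len(tosses)

random.seed(0)
p_true = 0.3  # hypothetical "true" parameter; unknown to the statistician

# Each repetition of the experiment (two tosses) yields one value of p-hat,
# so p-hat is a function of the random outcome -- i.e. a random variable.
estimates = Counter()
for _ in range(10_000):
    tosses = ["T" if random.random() < p_true else "H" for _ in range(2)]
    estimates[mle_from_tosses(tosses)] += 1

for value, count in sorted(estimates.items()):
    print(f"P(p-hat = {value}) is approximately {count / 10_000:.3f}")
```

With $p = 0.3$ the empirical frequencies come out near $0.49$, $0.42$, and $0.09$, matching $(1-p)^2$, $2p(1-p)$, and $p^2$.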

ANSWER

Since $\theta$ is an unknown parameter, any non-constant function of $\theta$ is a random variable.

$L(\theta)$ is just a function of $\theta$!
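As a quick illustration of this point (a grid-search sketch of my own, not part of the answer), the likelihood function itself changes with the observed sample, and so its maximizer $\hat{p}$ changes too:

```python
def likelihood(p, sample):
    """Likelihood of the sample when P(T) = p, tosses independent."""
    prob = 1.0
    for toss in sample:
        prob *= p if toss == "T" else (1 - p)
    return prob

# Maximize the likelihood over a grid of candidate p values, once per
# possible observed sample: different data give different maximizers.
grid = [i / 1000 for i in range(1001)]
mles = {}
for sample in [("T", "T"), ("T", "H"), ("H", "H")]:
    mles[sample] = max(grid, key=lambda p: likelihood(p, sample))
    print(sample, "->", mles[sample])
```

This recovers the case analysis from the question: $\hat{p}=1$ for $(T,T)$, $\hat{p}=\tfrac12$ for $(T,H)$, and $\hat{p}=0$ for $(H,H)$.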