I'm trying to learn the subject of Maximum Likelihood Estimation by myself.
I'm facing one of the first questions in the textbook which is:
`The waiting time (in minutes) on a queue to the dentist is the random variable $X$ with the following pdf:
$$f(x) = \begin{cases} 2\theta xe^{-\theta x^2} & x > 0\\ 0 & x \leq 0 \end{cases}$$
- Find a maximum likelihood estimator for $\theta$ based on the waiting times of $n$ people waiting for the dentist, $X_1, X_2, \dots, X_n$. (The formula.)
- Find the maximum likelihood estimate for a sample of 3 people who waited 20, 50, and 30 minutes. (The exact number.)
Again, I am not interested in the exact answers but in a way of looking at and thinking about questions like this.
I tried sketching the graph of $f(x)$ as $f(\theta, x)$, but I found it difficult.
In general, the likelihood is $$ L(\theta; x_1, x_2, \dots, x_n)=f(x_1, x_2, \dots, x_n ;\theta)=\prod_{i=1}^n f(x_i; \theta) $$ where I assumed independence (which is not explicitly stated here).
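For this particular pdf (assuming all observed $x_i > 0$), the product simplifies nicely:
$$ L(\theta; x_1, x_2, \dots, x_n)=\prod_{i=1}^n 2\theta x_ie^{-\theta x_i^2}=2^n\theta^n\left(\prod_{i=1}^n x_i\right)e^{-\theta\sum_{i=1}^n x_i^2}. $$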
What you want to sketch then is the likelihood. Or the log-likelihood, which is very common to work with (since the logarithm is strictly increasing, maximizing $\log L$ is equivalent to maximizing $L$). In this case, that is $$ \log L(\theta; x_1, x_2, \dots, x_n)=\log\left(\prod_{i=1}^n 2\theta x_ie^{-\theta x_i^2}\right)=n\log2+n\log\theta+\sum_{i=1}^n\log x_i-\theta\sum_{i=1}^nx_i^2. $$ To find the maximum likelihood estimator, you differentiate this with respect to $\theta$, set it to 0, and solve for $\theta$. If it's a global maximum, it's your MLE. Once you get used to this method, it is very formulaic.
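As a worked sketch of that differentiation step (writing $\ell(\theta)$ for the log-likelihood above):
$$ \frac{\partial \ell}{\partial\theta}=\frac{n}{\theta}-\sum_{i=1}^n x_i^2 = 0 \quad\Longrightarrow\quad \hat\theta=\frac{n}{\sum_{i=1}^n x_i^2}, $$
and since $\partial^2\ell/\partial\theta^2=-n/\theta^2<0$ for all $\theta>0$, this stationary point is indeed a global maximum.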
To find the estimate when you have data, you simply set $n=3$ and plug your $x_i$'s into your estimator.
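As a sanity check on the recipe above, here is a small sketch (using NumPy and SciPy, which the question does not mention) that maximizes the negative log-likelihood numerically and compares the result with the closed-form estimator $\hat\theta = n/\sum x_i^2$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Waiting-time data (minutes) from the textbook example
x = np.array([20.0, 50.0, 30.0])
n = len(x)

def neg_log_likelihood(theta):
    # -log L(theta) = -(n log 2 + n log theta + sum(log x_i) - theta sum(x_i^2))
    return -(n * np.log(2) + n * np.log(theta)
             + np.sum(np.log(x)) - theta * np.sum(x**2))

# Numerical maximization over theta > 0 (bounds chosen generously)
res = minimize_scalar(neg_log_likelihood, bounds=(1e-9, 1.0),
                      method="bounded", options={"xatol": 1e-10})

# Closed-form MLE obtained by setting the derivative to zero
theta_closed = n / np.sum(x**2)

print(res.x, theta_closed)  # the two values should agree
```

The two results agreeing is a useful habit-forming check: once you have a candidate formula for an MLE, a quick numerical optimization of the (log-)likelihood will catch sign errors and algebra slips.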