As the title suggests, I'm really struggling to derive the likelihood function of the Poisson distribution (mostly because I'm having a hard time understanding the concept of likelihood at all).
I've watched a couple of videos and understand that the likelihood function is the product of the PMF or PDF over the sample, but I can't get much further than that. The question is as follows:
"Random variables $X_1, \dots, X_n$ are independent and identically distributed (IID) from a $Poisson(θ)$ distribution. Suppose a random sample $x = (x_1, \dots, x_n)$ has been observed.
(a) Write down the likelihood function $L(θ)$ based on the observed sample."
If anyone could help show me the process for deriving the likelihood function I would really appreciate it. Thanks!

For a Poisson random variable $X$, the probability mass function (PMF) is given by: $$P(X=x|\theta)=f(x)=e^{-\theta} \frac{\theta^x}{x!},\ \ x\in \{0,1,2,\ldots\},\ \theta>0$$
As for the likelihood function, considering an i.i.d. (independent and identically distributed) sample $x_1, x_2,\ldots,x_n$, from a Poisson variable,
$$L(\theta|x_1,x_2,\ldots,x_n)=P(X=x_1|\theta)P(X=x_2|\theta)\cdots P(X=x_n|\theta)$$
$$L(\theta|x_1,x_2,\ldots,x_n)=e^{-\theta} \frac{\theta^{x_1}}{x_1!}\cdots e^{-\theta} \frac{\theta^{x_n}}{x_n!}=e^{-n\theta}\frac{\theta^{x_1+x_2+\ldots+x_n}}{x_1!x_2!\cdots x_n!}$$ $$L(\theta|x_1,x_2,\ldots,x_n)=e^{-n\theta}\frac{\theta^{\sum_{i=1}^n x_i}}{\prod_{i=1}^n x_i!}$$
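As a numerical sanity check (a small Python sketch, not part of the original derivation; the sample values and $\theta$ below are made up), the product of the individual PMFs agrees with the closed-form expression:

```python
import math

def poisson_pmf(x, theta):
    # P(X = x | theta) = e^{-theta} * theta^x / x!
    return math.exp(-theta) * theta**x / math.factorial(x)

def likelihood_product(xs, theta):
    # L(theta) as the raw product of individual PMFs
    L = 1.0
    for x in xs:
        L *= poisson_pmf(x, theta)
    return L

def likelihood_closed_form(xs, theta):
    # L(theta) = e^{-n*theta} * theta^{sum x_i} / prod(x_i!)
    n, s = len(xs), sum(xs)
    denom = math.prod(math.factorial(x) for x in xs)
    return math.exp(-n * theta) * theta**s / denom

xs = [2, 0, 3, 1, 4]   # hypothetical observed sample
theta = 1.7            # arbitrary trial value of the parameter
print(likelihood_product(xs, theta))
print(likelihood_closed_form(xs, theta))  # same value up to rounding
```

For larger samples the raw product underflows quickly, which is one practical reason to work with the log-likelihood instead.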
Now, for the log-likelihood, just apply the natural log to the last expression:
$$\ln L(\theta|x_1,x_2,\ldots,x_n)=-n\theta + \left(\sum_{i=1}^n x_i\right)\ln \theta - \ln(\prod_{i=1}^n x_i!).$$
If your problem is finding the maximum likelihood estimator $\hat \theta$, differentiate this expression with respect to $\theta$ and set it equal to zero. Since $\theta$ does not appear in the last term, $$\frac{d}{d\theta}\ln L(\theta|x_1,x_2,\ldots,x_n)=-n+\frac{\sum_{i=1}^n x_i}{\theta}=0,$$ and solving for $\theta$ gives $$\hat \theta=\frac{\sum_{i=1}^n x_i}{n},$$ the sample mean.
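To see the maximization concretely, here is a small Python sketch (my own illustration, with a made-up sample) that evaluates the log-likelihood on a grid of $\theta$ values and checks that the maximizing grid point sits at the sample mean:

```python
import math

def log_likelihood(xs, theta):
    # ln L(theta) = -n*theta + (sum x_i) * ln(theta) - ln(prod x_i!)
    n, s = len(xs), sum(xs)
    const = sum(math.lgamma(x + 1) for x in xs)  # ln(prod x_i!) via lgamma
    return -n * theta + s * math.log(theta) - const

xs = [2, 0, 3, 1, 4]       # hypothetical observed sample
mle = sum(xs) / len(xs)    # candidate MLE: the sample mean

# Grid search over theta in (0, 6]; the best grid point should be
# the one closest to the sample mean.
grid = [0.1 * k for k in range(1, 61)]
best = max(grid, key=lambda t: log_likelihood(xs, t))
print(mle, best)
```

The constant term $\ln\!\left(\prod x_i!\right)$ shifts the whole curve but does not move its maximum, which matches the observation above that it drops out when differentiating.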