So I worked out how to find the maximum likelihood estimator for
$f(x; \alpha) = \alpha x^{-\alpha-1}$ for $x > 1,$ where $\alpha > 0$ is a parameter. The likelihood is:
$$L(\alpha) = \prod_{i=1}^n \alpha x_i^{-\alpha-1} = \alpha^n \prod_{i=1}^n x_i^{-\alpha-1}$$
Then:
$$\log L(\alpha) = n\log(\alpha) + \sum_{i=1}^n (-\alpha-1) \log(x_i)$$
So:
$$\log L(\alpha) = n\log(\alpha) + (-\alpha-1) \sum_{i=1}^n \log(x_i),$$
and setting $\frac{d}{d\alpha}\log L(\alpha) = \frac{n}{\alpha} - \sum_{i=1}^n \log(x_i) = 0$ gives $\hat{\alpha} = n\big/\sum_{i=1}^n \log(x_i)$.
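As a quick sanity check on the first estimator (a hypothetical numerical sketch, not part of the original derivation): since the density integrates to $1$ on $(1,\infty)$ with CDF $F(x) = 1 - x^{-\alpha}$, we can sample by inverse transform as $x = u^{-1/\alpha}$ for $u \sim U(0,1)$ and compare $\hat{\alpha} = n/\sum_i \log x_i$ to the true $\alpha$:

```python
import math
import random

# Simulate from f(x; alpha) = alpha * x^(-alpha - 1) on x > 1
# via inverse-CDF sampling: F(x) = 1 - x^(-alpha) => x = u^(-1/alpha).
random.seed(0)
alpha_true = 2.5
n = 100_000
xs = [random.random() ** (-1.0 / alpha_true) for _ in range(n)]

# Closed-form MLE: alpha_hat = n / sum(log x_i)
alpha_hat = n / sum(math.log(x) for x in xs)
print(alpha_hat)  # should be close to alpha_true = 2.5
```

With $n$ this large, $\hat{\alpha}$ should land within a few hundredths of the true value.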
However I'm not sure how to find the likelihood estimator for
$$f(x;\theta) = \frac{1}{2(\theta x)^{1/2}} \text{ for } 0<x\le\theta$$
Any advice would be much appreciated!
Note that for the second problem, if $\theta < x$ then $f(x;\theta)=0$. In particular, $$ \sup_{\theta>0}f(x;\theta)=\sup_{\theta\geq x} f(x;\theta)=\frac{1}{2\sqrt{x}}\sup_{\theta\geq x} \frac{1}{\sqrt{\theta}}=\frac{1}{2\sqrt{x}}\cdot\frac{1}{\sqrt{x}}=f(x;x). $$ Thus $\hat{\theta}(x)= x$ for a sample of size $1$.

For a sample of size $n$, the same idea applies: the likelihood $$L(\theta)=\prod_{i=1}^n \frac{1}{2\sqrt{\theta x_i}}\propto \theta^{-n/2}$$ vanishes unless $\theta\geq x_i$ for every $i$, and is decreasing in $\theta$ on the admissible region, so it is maximized at the smallest admissible value. Hence $\hat{\theta}(\mathbf{x})=\max(x_1,\dotsc,x_n)$.
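The second estimator can also be checked numerically (a hypothetical sketch with assumed values, not from the answer above): the CDF of $f(x;\theta)=\frac{1}{2\sqrt{\theta x}}$ on $(0,\theta]$ is $F(x)=\sqrt{x/\theta}$, so inverse-transform sampling gives $x = \theta u^2$ for $u \sim U(0,1)$, and the MLE $\hat{\theta}=\max_i x_i$ should sit just below the true $\theta$:

```python
import random

# Simulate from f(x; theta) = 1 / (2 * sqrt(theta * x)) on (0, theta]
# via inverse-CDF sampling: F(x) = sqrt(x / theta) => x = theta * u^2.
random.seed(0)
theta_true = 3.0
n = 10_000
xs = [theta_true * random.random() ** 2 for _ in range(n)]

# MLE derived above: the sample maximum
theta_hat = max(xs)
print(theta_hat)  # should be just below theta_true = 3.0
```

Note that $\hat{\theta}=\max_i x_i$ always slightly underestimates $\theta$, since every observation satisfies $x_i \le \theta$; the gap shrinks as $n$ grows.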