Assume k is known.
> $$ f_Y(y;\theta) = \theta k^\theta \bigg(\frac{1}{y}\bigg)^{\theta + 1} \;, \quad y \ge k; \quad \theta \ge 1$$
I'm not sure about these domain conditions; if someone could finish this off, that would be great.
$$ \begin{aligned} L(\theta) &= \prod_{i=1}^{n}\theta k^\theta \bigg(\frac{1}{y_i}\bigg)^{\theta + 1} = \theta^n k^{n\theta} \bigg(\prod_{i=1}^{n}\frac{1}{y_i}\bigg)^{\theta + 1} \\ \text{Let } T = \ln[L(\theta)] &= n\ln(\theta) + n\theta\ln(k)+(\theta+1)\sum_{i=1}^{n}\ln\bigg(\frac{1}{y_i}\bigg) \\ &= n\ln(\theta) + n\theta\ln(k) - (\theta + 1)\sum_{i=1}^{n}\ln(y_i) \end{aligned} $$
I'm not really sure how to apply these domain limits here, or which relevant concepts I haven't been introduced to yet. Can anyone check this and finish it off?
Since you wrote $L(\theta)$ rather than $L(\theta,k),$ should we take it you mean $k$ is known and not to be estimated? If you do estimate $k,$ note that $T$ is an increasing function of $k,$ so you just need to make $k$ as large as the constraints will allow. The constraints are that $k$ must be $\le$ all observations, and hence $\le$ the minimum observation. So $\hat k = \min_i y_i$ is the MLE of $k.$
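As a quick numerical sanity check of this step (the sample below is made up), we can evaluate the log-likelihood $T$ at a fixed $\theta$ for several feasible values of $k$ and confirm it is increasing, so the constrained maximizer is the sample minimum:

```python
import math

# Hypothetical sample; here k is treated as unknown.
y = [2.0, 3.0, 5.0, 8.0]
n = len(y)

def log_lik(theta, k):
    """T = n ln(theta) + n*theta*ln(k) - (theta + 1) * sum ln(y_i)."""
    return (n * math.log(theta) + n * theta * math.log(k)
            - (theta + 1) * sum(math.log(yi) for yi in y))

theta = 1.5           # any fixed theta >= 1
ks = [1.0, 1.5, 2.0]  # feasible values, all <= min(y)
vals = [log_lik(theta, k) for k in ks]
assert vals[0] < vals[1] < vals[2]  # T increases in k

k_hat = min(y)  # so the MLE of k is the minimum observation
```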
You have $$ \frac{\partial T}{\partial\theta} = \frac n \theta + n\ln k - \sum_{i=1}^n \ln y_i. $$ This is $\ge0$ if $\theta\le \dfrac 1 {\frac 1 n \sum_{i=1}^n \ln y_i - \ln k}$ and is $\le 0$ if $\theta\ge{}$that number. (Note the denominator is positive, since $y_i \ge k$ for every $i.$) So $$ \hat\theta = \frac{n}{\sum_{i=1}^n \ln y_i - n\ln k} = \frac{n}{\sum_{i=1}^n \ln(y_i/k)} $$ is the MLE of $\theta.$
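A small numerical check of the stationarity condition, using a made-up sample: compute $\hat\theta = n\big/\sum_i \ln(y_i/k)$ and verify that $\partial T/\partial\theta$ vanishes there.

```python
import math

y = [2.0, 4.0, 8.0]
n = len(y)
k = min(y)  # MLE of k from the previous step (here 2.0)

# theta_hat = n / sum ln(y_i / k)
theta_hat = n / sum(math.log(yi / k) for yi in y)

def dT_dtheta(theta):
    """Derivative of the log-likelihood: n/theta + n ln(k) - sum ln(y_i)."""
    return n / theta + n * math.log(k) - sum(math.log(yi) for yi in y)

# The derivative is (numerically) zero at the candidate maximizer.
assert abs(dT_dtheta(theta_hat)) < 1e-9
# For this sample, sum ln(y_i/k) = 3 ln 2, so theta_hat = 1/ln 2 ≈ 1.4427.
```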