Let $X_1,\,\ldots,\,X_n$ be independent random variables, each with probability density function $$f(x;\,\theta)=\frac{2x}{\theta^2}\qquad \text{for }0<x<\theta.$$
I want to find the maximum likelihood estimator of $\theta$ using the likelihood function. My lecturer presents the solution as follows: $$L(\theta;\,x) = \prod_{i=1}^n \frac{2x_i}{\theta^2} = \frac{2^n}{\theta^{2n}}\prod_{i=1}^n x_i\propto\theta^{-2n}\mathbb{I}(\theta>x_{(n)}),$$ where $x_{(n)}=\max_i x_i$ and $\mathbb{I}$ denotes the indicator function:
$$\mathbb{I}(A) = \begin{cases}1,&\text{if $A$ is true,}\\0,&\text{if $A$ is false.}\end{cases}$$
What I'm confused about is how we get this proportionality. I can't see how the product turns into the indicator function, or where the condition inside the indicator comes from. Can someone explain this? Thanks.
For every $\mathbf x=(x_1,\ldots,x_n)$, $$L(\theta;\,\mathbf x) = \prod_{i=1}^n \frac{2x_i}{\theta^2}\mathbb{I}(\theta>x_i) = K(\mathbf x)\,\theta^{-2n}\,\mathbb{I}(\theta>m(\mathbf x)),$$ where $m(\mathbf x)$ and $K(\mathbf x)$ depend on $\mathbf x$ but not on $\theta$, since $$m(\mathbf x)=\max\limits_ix_i,\qquad K(\mathbf x)=2^n\prod_{i=1}^nx_i.$$ In this sense, $$L(\theta;\,\mathbf x) \propto \theta^{-2n}\mathbb{I}(\theta>m(\mathbf x)),$$ hence, to maximize $L(\,\cdot\,;\mathbf x)$, one should find $\theta\gt m(\mathbf x)$ such that $\theta^{-2n}$ is maximal.
No such maximizer exists because of the strict $\gt$ sign in $\mathbb{I}(\theta>m(\mathbf x))$: the function $\theta^{-2n}$ grows as $\theta$ decreases toward $m(\mathbf x)$, but the value $\theta=m(\mathbf x)$ itself is excluded. One could have used instead the densities $$f(x;\theta)=\frac{2x}{\theta^2}\mathbb{I}(\theta\geqslant x),$$ which agree with the original ones except on a set of measure zero, leading to the MLE $$\hat\theta(\mathbf x)=m(\mathbf x)=\max\limits_ix_i.$$
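A quick numerical sketch of the argument above (the true parameter $\theta=3$ and sample size $n=200$ are my own choices for illustration): sample from $f(x;\theta)=2x/\theta^2$ by inverse-CDF sampling, since $F(x)=x^2/\theta^2$ on $(0,\theta)$ gives $X=\theta\sqrt{U}$, then check that the log-likelihood is maximized at the sample maximum.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 3.0   # hypothetical true parameter, chosen for this demo
n = 200

# Inverse-CDF sampling: F(x) = x^2 / theta^2 on (0, theta), so X = theta * sqrt(U).
u = rng.uniform(size=n)
x = theta_true * np.sqrt(u)

def log_likelihood(theta, x):
    """Log of prod_i (2 x_i / theta^2) I(theta >= x_i); -inf when theta < max(x)."""
    if theta < x.max():
        return -np.inf
    return len(x) * np.log(2.0) + np.log(x).sum() - 2 * len(x) * np.log(theta)

# Any theta below max(x) has zero likelihood, and theta^{-2n} is decreasing,
# so the smallest admissible theta wins: the MLE is the sample maximum.
theta_hat = x.max()

# Evaluate the log-likelihood on a grid starting at theta_hat: it should be
# largest at the very first grid point.
grid = np.linspace(theta_hat, theta_hat + 2.0, 500)
lls = [log_likelihood(t, x) for t in grid]
print(theta_hat, grid[int(np.argmax(lls))])
```

Note that $\hat\theta(\mathbf x)=\max_i x_i$ is always slightly below the true $\theta$, since every observation lies in $(0,\theta)$; this is the usual downward bias of maximum-type estimators.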