I am wondering whether MLE really requires a proper probability distribution. Let's take a look at the following situation:
likelihood: $$L(\theta \mid x) = \prod_{n=1}^{N}{f(x_n \mid \theta)}$$ or log-likelihood: $$\ell(\theta \mid x) = \sum_{n=1}^{N}{ \log\bigl( f(x_n \mid \theta) \bigr)}$$
Now, let's say that $f(x \mid\theta)$ is not a proper probability distribution, i.e. $$\int_{-\infty}^{\infty}{f(x \mid \theta)}\,dx \neq 1$$
then I can normalize it as follows:
$$f_{\text{norm}}(x \mid \theta) = \frac{f(x \mid \theta)}{\int_{-\infty}^{\infty}{f(x \mid \theta)}\,dx}$$
where $K = \int_{-\infty}^{\infty}{f(x \mid \theta)}\,dx$ is assumed to be a constant that does not depend on $\theta$. Then, when I maximize the likelihood:
$$\underset{\theta}{\operatorname{argmax}}\, L(\theta \mid x)$$
the $K$ is irrelevant in the maximization: since $\log f_{\text{norm}}(x \mid \theta) = \log f(x \mid \theta) - \log K$, the two objectives differ only by the constant shift $N \log K$, so maximizing with or without $K$ gives the same $\theta$.
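This invariance is easy to check numerically. The sketch below (my own illustration, not from the question) uses a Gaussian kernel scaled by an arbitrary $\theta$-independent constant $K = 5$ as the improper $f$, and compares the maximizer against that of the proper $\mathcal{N}(\theta, 1)$ density:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=200)  # synthetic data

# Improper "density": a Gaussian kernel scaled by a theta-independent K = 5,
# so it integrates to 5 * sqrt(2*pi), not 1.
def log_lik_unnorm(theta):
    return np.sum(np.log(5.0) - 0.5 * (x - theta) ** 2)

# Proper N(theta, 1) log-density: the same kernel divided by its integral.
def log_lik_norm(theta):
    return np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (x - theta) ** 2)

t_unnorm = minimize_scalar(lambda t: -log_lik_unnorm(t)).x
t_norm = minimize_scalar(lambda t: -log_lik_norm(t)).x
print(t_unnorm, t_norm)  # both coincide with the sample mean
```

Both maximizers agree because the objectives differ only by the additive constant $N(\log 5 + \tfrac12 \log 2\pi)$.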
Does the above prove that MLE doesn't care whether it operates on a proper distribution function, as long as $f(x \mid \theta) \geq 0$?
Does this imply that any non-negative measure of distance can be used for MLE (that is, $f(x \mid \theta)$ can be skewed, stretched, have multiple modes, etc.)? Is that correct?
EDIT:
From the comments I have realized one thing: $f(x \mid \theta) \leq M$, where $M$ is finite. This relates to the question, where I assume I take a proper pdf and stretch it (not multiply it), add modes, or skew it. This is not as specific as it may seem, because what actually happens is that I take a Normal distribution, make its mean $\mu$ a function of another variable, $\mu(z)$, and use MLE over the variable $z$.
The "EDIT" does not extend the problem; it only explains and justifies the assumption.