Using the method of maximum likelihood, find the estimators for the $\mathcal N(m,1)$ normal distribution and for $\mathcal U(\theta, 1)$, $\theta<0$.
From what I understand, if the parameter is negative the problem is done a little differently than if it were positive... I don't see how or why, because normally the estimator is $$\hat m=\frac{1}{n}\sum_{k=1}^{n}X_k$$ for $\mathcal N(m,1).$
For $\mathcal U(\theta, 1)$, $\theta<0$, I was thinking that $$L\left(\theta\mid{\bf x}\right)=\frac{1}{(1-\theta)^n}$$ (on the set where $\theta<x_1,\ldots,x_n<1$), and since $\theta<x_1,\ldots,x_n$ the estimator should be $$\min\{0,Y_1\}$$ where $$Y_1=\min_{1 \leq k \leq n} x_k.$$ I'm also supposed to check whether this estimator $\min\{0,Y_1\}$ is centered (unbiased). (I was told that this estimator is correct, I'm just not sure about $L\left(\theta\mid{\bf x}\right).$)
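To check numerically whether $\min\{0,Y_1\}$ is centered, a quick Monte Carlo sketch (in Python; the choice $\theta=-1$, $n=5$ is hypothetical) compares the average of the estimator with the true $\theta$:

```python
import random

def simulate_estimator(theta, n, reps=20000, seed=0):
    """Monte Carlo average of min{0, Y_1} over samples of size n from U(theta, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        y1 = min(rng.uniform(theta, 1.0) for _ in range(n))  # Y_1 = sample minimum
        total += min(0.0, y1)
    return total / reps

# Hypothetical values: theta = -1, n = 5. If the estimator were centered, the
# average would be close to -1; it comes out noticeably larger, consistent with
# min{0, Y_1} being biased upward for small n.
avg = simulate_estimator(-1.0, 5)
print(avg)
```

This is unsurprising: $Y_1=\theta+(1-\theta)B$ with $B\sim\mathrm{Beta}(1,n)$, so $E[Y_1]=\theta+\frac{1-\theta}{n+1}>\theta$.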
I'd express the likelihood by saying $$ L(\theta) = \begin{cases} \dfrac 1 {(1-\theta)^n} & \text{if }\theta\le Y_1, \\[6pt] 0 & \text{if }\theta > Y_1. \end{cases} $$ As $\theta$ increases, so does $L(\theta)$, until $\theta$ reaches $Y_1$, after which $L(\theta)$ is $0$. Hence the unconstrained MLE is $Y_1$; imposing the constraint $\theta\le 0$ gives $\min\{0,Y_1\}$.
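That argmax can be confirmed with a crude grid search (a Python sketch; the sample `xs` and the grid are hypothetical):

```python
def likelihood(theta, xs):
    """L(theta) = (1 - theta)^(-n) on {theta <= Y_1}, and 0 otherwise."""
    y1 = min(xs)
    if theta > y1:
        return 0.0
    return (1.0 - theta) ** (-len(xs))

xs = [-0.3, 0.4, -0.1, 0.8]                      # hypothetical sample; Y_1 = -0.3
grid = [-2.0 + 0.001 * k for k in range(2001)]   # candidate theta values in [-2, 0]
best = max(grid, key=lambda t: likelihood(t, xs))
# L increases on (-inf, Y_1] and vanishes beyond it, so the grid argmax lands
# at the grid point at (or just below) Y_1 = min{0, Y_1} = -0.3.
print(best)
```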
For the "normal" problem, as in the case when the parameter space is the whole real line, the density becomes $$ (x_1,\ldots,x_n) \mapsto c\cdot \exp\left( \frac {-1} 2 \left(n(\bar x - m)^2 + ns^2\right) \right). $$ As a function of $m\in(0,\infty)$ with $x_1,\ldots,x_n$ fixed this is $$ m \mapsto L(m) = c\cdot \exp\left( \frac {-1} 2 \left(n(\bar x - m)^2 + ns^2\right) \right). $$ The problem now is to find the value of $m$ within the parameter space that makes $L(m)$ as big as possible. Since $m$ appears only in $(\bar x - m)^2$, and $L(m)$ is a decreasing function of $(\bar x-m)^2$, the problem is to find the value of $m$ in the parameter space that makes $(\bar x - m)^2$ as small as possible. If $\bar x<0$, that is done simply by making $m=\bar x$, since in that case $(\bar x-m)^2=0$. However, if $\bar x\ge 0$, then there is no minimizing value of $m$ in the space $(-\infty,0)$. However there is one in $(-\infty,0]$. And that is the value of $m$ that makes $m$ as close as possible to $\bar x$ subject to the constraint that $m\in(-\infty,0]$. The value of $m\in(-\infty,0]$ that is closest to $\bar x\ge0$ is $0$. Hence that is the MLE if the parameter space is $(-\infty,0]$ and $\bar x\ge 0$.