We draw a sample $D = \{x_1,\ldots,x_n\}$ of points from a diamond-shaped region whose diagonals both have length $2\theta$.
I have the following density function: $f_{\theta }\left( x\right) = \dfrac{1}{2\theta^2}$ if $||x||_1 = |x_1| + |x_2| \leq \theta$ (here $x_1$ and $x_2$ denote the two coordinates of the point $x$ in the plane containing the diamond) and $0$ otherwise.
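To see where the constant comes from: the diamond is a rhombus with both diagonals equal to $2\theta$, so its area is half the product of the diagonals, and $\frac1{2\theta^2}$ is exactly the reciprocal of that area, which makes the density integrate to $1$:

$$\operatorname{area}\left\{(x_1,x_2):|x_1|+|x_2|\le\theta\right\}=\frac{(2\theta)(2\theta)}{2}=2\theta^2,\qquad \int_{|x_1|+|x_2|\le\theta}\frac{1}{2\theta^2}\,dx_1\,dx_2=1.$$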
The density function says that a point $x$ lies inside the diamond exactly when its $1$-norm is at most $\theta$. I want to find the value of the parameter $\theta$ that makes it most likely that this holds for all the points in my sample $D$.
We want to find $\theta_{MLE} = \underset{\theta \in \mathbb{R}}{\operatorname{arg\,max}}\; f_{\theta }\left( x_1,x_2,x_3,\ldots,x_n\right)$ (we assume independence).
To make the calculation easier, I take the log so that the product becomes a sum:
$\theta_{MLE} = \underset{\theta \in \mathbb{R}}{\operatorname{arg\,max}} \sum_{i=1}^{n} \log f_{\theta }\left( x_i\right)$
I let $L(\theta) = \log f_{\theta }\left( x_i\right)$ (the contribution of a single point $x_i$).
Then I do the following partial derivative: $\dfrac{\partial L(\theta)}{\partial \theta} = -\dfrac{1}{\theta^3}$
Now, I let $f'_{\theta }\left( x\right) = -\dfrac{1}{\theta^3}$ if the same condition $||x||_1 \leq \theta$ holds, and $0$ otherwise.
Can I assume that
$\theta_{MLE} = \underset{\theta \in \mathbb{R}}{\operatorname{arg\,max}} \sum_{i=1}^{n} \log f'_{\theta }\left( x_i\right)$ is the maximum likelihood estimate of the parameter $\theta$? If not, what went wrong in my approach?
The reason I'm using $\operatorname{arg\,max}$ is that we are looking for the value of $\theta$ that maximizes the likelihood over all the points in $D$ at once: the likelihood of the whole sample is largest when the combined contribution of its points is largest, and $\operatorname{arg\,max}$ picks out the $\theta$ achieving that.
For $\theta>0$, suppose the random vector $(X,Y)$ has density
$$f_\theta(x,y)=\begin{cases}\frac1{2\theta^2}&,\text{ if }|x|+|y|\le \theta \\ 0 &, \text{ otherwise} \end{cases}$$
This describes a uniform distribution on the region $\left\{(x,y)\in \mathbb R^2:|x|+|y|\le \theta\right\}$, which is what you call the diamond shaped area.
Suppose you want to find the maximum likelihood estimator (MLE) of $\theta$ based on a random sample of $n$ paired observations $(X_1,Y_1),\ldots,(X_n,Y_n)$ drawn from the distribution $f_{\theta}$. This means the vectors $(X_1,Y_1),\ldots,(X_n,Y_n)$ are independent and identically distributed with common density $f_\theta$.
The likelihood function given this sample is then the joint density of $(X_1,Y_1),\ldots,(X_n,Y_n)$:
\begin{align} L(\theta)&=\prod_{i=1}^n f_\theta(x_i,y_i) \\&=\begin{cases}\frac1{(2\theta^2)^n} &,\text{ if }|x_1|+|y_1|\le \theta,\ldots,|x_n|+|y_n|\le \theta \\ 0 &, \text{ otherwise}\end{cases} \\&=\begin{cases}\frac1{(2\theta^2)^n} &,\text{ if }\max\limits_{1\le i\le n}(|x_i|+|y_i|)\le \theta \\ 0 &, \text{ otherwise}\end{cases} \end{align}
Wherever it is positive, $L(\theta)$ is a strictly decreasing function of $\theta$, so it attains its maximum at the smallest value of $\theta$ consistent with the sample; that value, $\hat\theta=\max\limits_{1\le i\le n}(|x_i|+|y_i|)$, is your MLE. Differentiation is not required for solving this optimization problem; in fact the likelihood is not differentiable at $\theta=\max\limits_{1\le i\le n}(|x_i|+|y_i|)$.
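A small numerical sketch (with a hypothetical sample; all names here are made up for illustration) shows the shape of the likelihood: it is zero for any $\theta$ below the largest observed $1$-norm and strictly decreasing above it, so the maximum sits exactly at that largest norm.

```python
import random

def likelihood(theta, sample):
    # Joint density of the sample: zero unless every point lies in the
    # diamond |x| + |y| <= theta, otherwise (1 / (2 theta^2))^n.
    if any(abs(x) + abs(y) > theta for x, y in sample):
        return 0.0
    return (1.0 / (2.0 * theta ** 2)) ** len(sample)

def mle_theta(sample):
    # The MLE is the largest L1 norm observed in the sample.
    return max(abs(x) + abs(y) for x, y in sample)

# Draw 50 points uniformly on the diamond with true theta = 1,
# by rejection sampling from the enclosing square.
rng = random.Random(0)
theta_true = 1.0
sample = []
while len(sample) < 50:
    x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
    if abs(x) + abs(y) <= theta_true:
        sample.append((x, y))

theta_hat = mle_theta(sample)
# The likelihood vanishes just below theta_hat and decreases above it.
assert likelihood(theta_hat - 1e-9, sample) == 0.0
assert likelihood(theta_hat, sample) > likelihood(theta_hat + 0.1, sample) > 0.0
```

Note that `theta_hat` can never exceed `theta_true` (the sample maximum is at most $\theta$), which is why the estimator sits on the boundary of the admissible parameter values.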
The support of the parent distribution $f_\theta$ depends on the unknown parameter $\theta$, so you cannot simply ignore it: having observed the sample, the joint support of $(X_1,Y_1),\ldots,(X_n,Y_n)$ effectively determines which values of the parameter are admissible.
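One way to keep track of this is to carry the support condition along as an indicator factor inside the likelihood, rather than discarding it before differentiating:

$$L(\theta)=\frac{1}{(2\theta^2)^n}\,\mathbf 1\left\{\theta\ge \max_{1\le i\le n}(|x_i|+|y_i|)\right\},\qquad \hat\theta_{\mathrm{MLE}}=\max_{1\le i\le n}\left(|X_i|+|Y_i|\right).$$

The indicator is the part of the likelihood that your differentiation step dropped, and it is precisely the part that determines where the maximum occurs.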