$$f(x \mid \theta, \sigma) = \dfrac{1}{\sigma}e^{-(x-\theta)/\sigma},$$
where $x > \theta$, $-\infty < \theta < \infty$, $\sigma > 0$.
I took two cases: first when $\sigma$ is unknown, and second when $\theta$ is unknown. I then merged these two and found that the MLE of $\theta$ is $x_{(1)}$, the minimum order statistic, and that the MLE of $\sigma$ is $\dfrac{\sum_{i=1}^{n}(x_i-\theta)}{n}$, into which I substituted $\theta=x_{(1)}$. Did I do it right? Please, someone tell me.
The approach you have used is fine: what you are essentially doing is obtaining the MLE equation for each parameter in terms of the other, and then solving these as simultaneous equations to get the result. This can be framed most easily as a standard calculus optimisation, following the usual working for obtaining MLEs from IID observations. Given $n$ observations with sample mean $\bar{x}$, the log-likelihood for your problem is:
$$\ell_\boldsymbol{x}(\theta, \sigma) = - n \ln \sigma - \sum_{i=1}^n \frac{x_i - \theta}{\sigma} = - n \ln \sigma - \frac{n}{\sigma}(\bar{x} - \theta) \quad \quad \text{for } \min x_i \geqslant \theta.$$
The corresponding score equations are:
$$\begin{aligned} \frac{\partial \ell_\boldsymbol{x}}{\partial \theta}(\theta, \sigma) &= \frac{n}{\sigma} \quad \quad \quad \quad \quad \quad \quad \quad \text{for } \min x_i > \theta, \\[8pt] \frac{\partial \ell_\boldsymbol{x}}{\partial \sigma}(\theta, \sigma) &= \frac{n}{\sigma^2} \Big[ (\bar{x} - \theta) - \sigma \Big] \quad \quad \text{for } \min x_i > \theta. \end{aligned}$$
From the first equation we see that the log-likelihood is strictly increasing in $\theta$, so it is maximised at the boundary of the feasible region, giving $\hat{\theta} = \min x_i$. Setting the second score equation to zero then gives $\hat{\sigma} = \bar{x} - \hat{\theta} = \bar{x} - \min x_i$, which matches your answer, since $\frac{1}{n}\sum_{i=1}^{n}(x_i - \hat{\theta}) = \bar{x} - \min x_i$.
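As a quick numerical sanity check, here is a minimal Python sketch (my own illustration, not part of the derivation; the simulation setup, seed, and variable names are assumptions) that simulates from the shifted exponential and confirms that the closed-form estimators recover the true parameters, and that $\hat{\sigma}$ maximises the log-likelihood along a grid in $\sigma$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, sigma_true, n = 2.0, 1.5, 100_000

# X = theta + sigma * E with E ~ Exponential(1) has the stated density.
x = theta_true + sigma_true * rng.exponential(scale=1.0, size=n)

# Closed-form MLEs derived above.
theta_hat = x.min()                # minimum order statistic
sigma_hat = x.mean() - theta_hat   # sample mean minus the sample minimum

def loglik(theta, sigma):
    """Log-likelihood ell_x(theta, sigma); -inf outside the feasible region."""
    if theta > x.min() or sigma <= 0:
        return -np.inf
    return -n * np.log(sigma) - np.sum(x - theta) / sigma

# Grid check: with theta fixed at theta_hat, sigma_hat should maximise loglik.
sigmas = np.linspace(0.5 * sigma_hat, 2.0 * sigma_hat, 1001)
sigma_grid = sigmas[np.argmax([loglik(theta_hat, s) for s in sigmas])]

print(f"theta_hat = {theta_hat:.4f}  (true {theta_true})")
print(f"sigma_hat = {sigma_hat:.4f}  (true {sigma_true})")
print(f"grid argmax over sigma = {sigma_grid:.4f}")  # approximately sigma_hat
```

With a large sample both estimates land very close to the true values; note that $\hat{\theta} = \min x_i$ is biased upward in finite samples, since the sample minimum always exceeds $\theta$.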