Let $X_{1},X_{2},\ldots,X_{n}$ be a sample from a population with density $p(x,\theta)$ given by \begin{align*} p(x,\theta) = \frac{1}{\sigma}\exp\left\{-\left(\frac{x-\mu}{\sigma}\right)\right\} \end{align*}
if $x\geq \mu$ and $0$ otherwise. Here $\theta = (\mu,\sigma)$ with $-\infty < \mu < \infty$ and $\sigma > 0$.
(a) Show that $\min\{X_{1},X_{2},\ldots,X_{n}\}$ is sufficient for $\mu$ when $\sigma$ is fixed.
(b) Find a one-dimensional sufficient statistic for $\sigma$ when $\mu$ is fixed.
(c) Exhibit a two-dimensional sufficient statistic for $\theta$.
MY ATTEMPT
(b) First, let us write the likelihood for this sample, treating $\mu$ as fixed and keeping track of the support: \begin{align*} L(\textbf{x},\sigma) = \prod_{i=1}^{n}\frac{1}{\sigma}\exp\left\{-\left(\frac{x_{i}-\mu}{\sigma}\right)\right\}I(x_{i}\geq\mu) = \frac{1}{\sigma^{n}}\exp\left\{-\frac{1}{\sigma}\left(\sum_{i=1}^{n}x_{i} - n\mu\right)\right\}I(x_{(1)}\geq\mu) \end{align*}
Since $\mu$ is fixed, the indicator does not involve the unknown parameter, so we can factor $L(\textbf{x},\sigma) = h(\textbf{x})g_{\sigma}(T(\textbf{x}))$, where \begin{align*} h(\textbf{x}) = I(x_{(1)}\geq\mu)\quad\text{and}\quad g_{\sigma}(T(\textbf{x})) = \frac{1}{\sigma^{n}}\exp\left\{-\frac{1}{\sigma}\left(\sum_{i=1}^{n}x_{i} - n\mu\right)\right\} \end{align*}
Therefore the statistic $T(\textbf{x}) = \sum_{i=1}^{n}x_{i}$ is sufficient for $\sigma$.
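As a quick sanity check of this factorization (a sketch, not part of the original argument, with illustrative values of $\mu$ and $\sigma$ chosen arbitrarily), one can verify numerically that the joint likelihood equals $h(\textbf{x})\,g_{\sigma}(T(\textbf{x}))$ and that $g_{\sigma}$ depends on the data only through $T(\textbf{x}) = \sum_i x_i$:

```python
import math
import random

# Illustrative fixed values (assumptions, not from the problem)
mu, sigma = 1.0, 2.0
random.seed(0)

# Sample from the shifted exponential: X = mu + Exp(rate = 1/sigma)
x = [mu + random.expovariate(1.0 / sigma) for _ in range(5)]
n = len(x)

# Joint likelihood as a product over observations
L = math.prod((1.0 / sigma) * math.exp(-(xi - mu) / sigma) for xi in x)

# Factorized form: h(x) = I(min(x) >= mu), and g depends on x only via T(x) = sum(x)
h = 1.0 if min(x) >= mu else 0.0
T = sum(x)
g = sigma ** (-n) * math.exp(-(T - n * mu) / sigma)

assert math.isclose(L, h * g), "factorization should reproduce the likelihood"
```

Two samples with the same value of $T$ would give the same $g$, which is exactly what sufficiency of $T$ for $\sigma$ expresses.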
But I do not know whether this is right, nor do I know how to approach the other two items. Can somebody help me out? Thanks in advance!
By the Neyman–Fisher factorization theorem, $T(\boldsymbol{x})$ is a sufficient statistic for $\mu$ if and only if there exist two nonnegative functions $g(T(\boldsymbol{x}), \mu)$ and $h(\boldsymbol{x})$ such that:
$$L(\boldsymbol{x}, \mu) = g(T(\boldsymbol{x}), \mu)h(\boldsymbol{x})$$
Note that I am treating $\mu$ as the only parameter, since $\sigma$ is known. We have:
$$L(\boldsymbol{x}, \mu) = \frac{1}{\sigma^{n}}\exp \left \{ -\frac{1}{\sigma}\sum_{i = 1}^{n}x_{i} \right \}\exp\left \{ \frac{n\mu}{\sigma} \right \}I(x_{(1)} \geq \mu)$$
Hence, letting $T(\boldsymbol{x}) = \min \left \{ X_{1},\ldots,X_{n} \right \} \equiv X_{(1)}$ and:
$$g(T(\boldsymbol{x}), \mu) = \exp\left \{ \frac{n\mu}{\sigma} \right \}I(x_{(1)} \geq \mu)$$ and: $$h(\boldsymbol{x}) = \frac{1}{\sigma^{n}}\exp \left \{ -\frac{1}{\sigma}\sum_{i = 1}^{n}x_{i} \right \}$$
we can conclude that the minimum is a sufficient statistic for $\mu$. In general, when writing the joint density, remember to include its support. In particular, if the variables are iid and each must be at least as large as the parameter, then the support of the joint density only requires the minimum to be at least the parameter, since that condition implies it for all of them. Symmetrically, if the support of each (iid) variable requires it to be at most the parameter, then in the joint density only the maximum needs to be at most the parameter. These two facts are useful for finding minimal sufficient statistics as well.
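The support logic above, that $\prod_{i} I(x_{i}\geq\mu)$ collapses to $I(x_{(1)}\geq\mu)$ and, dually, $\prod_{i} I(x_{i}\leq\mu)$ collapses to $I(x_{(n)}\leq\mu)$, can be checked numerically. This is just an illustrative sketch with an arbitrary choice of $\mu$ and of the sampling distribution:

```python
import random

random.seed(1)
mu = 0.5  # arbitrary illustrative parameter value

for _ in range(1000):
    x = [random.uniform(-1.0, 2.0) for _ in range(4)]

    # Lower-bound support: product of I(x_i >= mu) vs. I(min(x) >= mu)
    lower_all = all(xi >= mu for xi in x)
    lower_min = min(x) >= mu
    assert lower_all == lower_min

    # Upper-bound support: product of I(x_i <= mu) vs. I(max(x) <= mu)
    upper_all = all(xi <= mu for xi in x)
    upper_max = max(x) <= mu
    assert upper_all == upper_max
```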