Assume that $X_{1}, \ldots, X_{n}$ are independent and identically distributed exponential with parameter $\lambda>0$, so that each $X_i$ has PDF
$$ f(x \mid \lambda)=\left\{\begin{array}{ll} \lambda e^{-\lambda x}, & \text { if } x>0 \\ 0, & \text { otherwise. } \end{array}\right. $$
Then $Y=\sum_{i=1}^{n} X_{i}$ follows a $\mathrm{Gamma}(n,\lambda)$ distribution.
Compute the joint probability density of $(X_1, Y)$.
Question:
Are $X_1$ and $Y$ independent?
$X_1$ and $Y$ are not independent. Since $Y - X_1 = \sum_{i = 2}^n X_i$, and $X_i$ and $X_j$ are independent for $i \ne j$, we have that $X_1$ and $Y - X_1$ are independent. This means $$ \text{Cov} \left( X_1, Y - X_1 \right) = 0. $$ But then $$ \text{Cov} \left( X_1, Y \right) = \text{Cov} \left( X_1, Y - X_1 \right) + \text{Cov} \left( X_1, X_1 \right) = \text{Var}\left[ X_1 \right] \ne 0, $$ so $X_1$ and $Y$ are not independent.
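The covariance argument above is easy to check numerically. Here is a minimal Monte Carlo sketch using only the Python standard library; the choices $\lambda = 1$ and $n = 5$ are arbitrary. We should see $\text{Cov}(X_1, Y) \approx \text{Var}(X_1) = 1/\lambda^2$, and therefore $\text{Cov}(X_1, Y - X_1) \approx 0$.

```python
import random

# Monte Carlo sanity check: with lam = 1 and n = 5 (arbitrary choices),
# Cov(X_1, Y) should be close to Var(X_1) = 1/lam^2 = 1,
# while Cov(X_1, Y - X_1) should be close to 0.
random.seed(0)
lam, n, trials = 1.0, 5, 200_000

x1s, ys = [], []
for _ in range(trials):
    xs = [random.expovariate(lam) for _ in range(n)]
    x1s.append(xs[0])
    ys.append(sum(xs))

mean_x1 = sum(x1s) / trials
mean_y = sum(ys) / trials
cov_x1_y = sum((a - mean_x1) * (b - mean_y) for a, b in zip(x1s, ys)) / trials
var_x1 = sum((a - mean_x1) ** 2 for a in x1s) / trials

print(f"Cov(X1, Y) ~ {cov_x1_y:.3f}")                 # near Var(X1) = 1
print(f"Var(X1)    ~ {var_x1:.3f}")                   # near 1/lam^2 = 1
print(f"Cov(X1, Y - X1) ~ {cov_x1_y - var_x1:.3f}")   # near 0
```

Since $\text{Cov}(X_1, Y - X_1) = \text{Cov}(X_1, Y) - \text{Var}(X_1)$, the last line reuses the two quantities already computed.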
To compute the joint density of $(X_1, Y)$, condition on $X_1$. It is intuitive that for fixed $x$, $f_{Y \mid X_1} (y \mid x)$ will be the probability density function of a Gamma distribution with parameters $n - 1, \lambda$, translated $x$ units to the right. This is because
$$\mathbf{P} \left( Y \le y \mid X_1 = x \right) = \mathbf{P} \left( \sum_{i = 1}^n X_i \le y \, \big\rvert \, X_1 = x \right) = \mathbf{P} \left( \sum_{i = 2}^n X_i \le y - x \right)$$
and $\sum_{i = 2}^n X_i$, being a sum of $n - 1$ i.i.d. exponentials, follows a gamma distribution with parameters $n - 1, \lambda$. Hence,
$$f_{Y \mid X_1} \left( y \mid x \right) = \frac{\lambda^{n - 1} (y - x)^{n - 2} e^{-\lambda(y - x)}}{\Gamma(n - 1)}, \qquad y > x.$$
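As a sanity check on this conditional density, we can verify numerically that it integrates to $1$ over $(x, \infty)$ and that its mean is $x + (n-1)/\lambda$, the shifted mean of a $\mathrm{Gamma}(n-1, \lambda)$ variable. This is a sketch with arbitrary choices $n = 4$, $\lambda = 1$, $x = 2$:

```python
import math

# Numerical check of f(y | x) = lam^(n-1) (y-x)^(n-2) exp(-lam(y-x)) / Gamma(n-1)
# for y > x: it should integrate to 1 and have mean x + (n-1)/lam.
# n = 4, lam = 1.0, x = 2.0 are arbitrary illustrative choices.
n, lam, x = 4, 1.0, 2.0

def f_cond(y):
    t = y - x
    return lam ** (n - 1) * t ** (n - 2) * math.exp(-lam * t) / math.gamma(n - 1)

# Trapezoid rule on [x, x + 40]; the tail beyond is negligible.
steps, upper = 40_000, x + 40.0
h = (upper - x) / steps
grid = [x + i * h for i in range(steps + 1)]
total = h * (sum(f_cond(y) for y in grid)
             - 0.5 * (f_cond(grid[0]) + f_cond(grid[-1])))
mean = h * (sum(y * f_cond(y) for y in grid)
            - 0.5 * (grid[0] * f_cond(grid[0]) + grid[-1] * f_cond(grid[-1])))

print(total)  # should be close to 1.0
print(mean)   # should be close to x + (n - 1)/lam = 5.0
```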
Then the joint probability density function is
$$f_{X_1, Y} (x, y) = f_{Y \mid X_1} (y \mid x)\, f_{X_1}(x) = \frac{\lambda^n (y - x)^{n - 2} e^{-\lambda y}}{\Gamma(n - 1)}$$
for $y > x > 0$, and $0$ otherwise.
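As a final consistency check, integrating the joint density over $x \in (0, y)$ should recover the $\mathrm{Gamma}(n, \lambda)$ marginal density of $Y$ stated at the start. A numerical sketch, again with arbitrary choices $n = 4$, $\lambda = 1$, and evaluation point $y = 3$:

```python
import math

# Check that integrating f(x, y) = lam^n (y-x)^(n-2) e^(-lam y) / Gamma(n-1)
# over x in (0, y) recovers the Gamma(n, lam) density of Y at y.
# n = 4, lam = 1.0, y = 3.0 are arbitrary illustrative choices.
n, lam, y = 4, 1.0, 3.0

def f_joint(x):
    return lam ** n * (y - x) ** (n - 2) * math.exp(-lam * y) / math.gamma(n - 1)

# Trapezoid rule over x in [0, y].
steps = 30_000
h = y / steps
grid = [i * h for i in range(steps + 1)]
marginal = h * (sum(f_joint(x) for x in grid)
                - 0.5 * (f_joint(0.0) + f_joint(y)))

# Gamma(n, lam) density of Y evaluated at y.
gamma_pdf = lam ** n * y ** (n - 1) * math.exp(-lam * y) / math.gamma(n)

print(marginal, gamma_pdf)  # the two values should agree closely
```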