The random sample we have is $X_1,\cdots, X_n$, i.i.d. with an exponential distribution with parameter $1$. I want to find the conditional distribution of $X_1$ given $S=X_1+\cdots+X_n$. I know that the distribution of $S$ is gamma with parameters $(n,1)$, but I don't know how to find the conditional distribution.
Can anyone compute this conditional expected value?
Since $S_n = X_1+X_2+\ldots+X_n=X_1+Y_{n-1}$, where $Y_{n-1}=X_2+\cdots+X_n$ is independent of $X_1$ and $Y_{n-1}\sim\Gamma_{n-1,1}$, the joint pdf of $X_1$ and $Y_{n-1}$ is $$f_{X_1,Y_{n-1}}(x,y) = e^{-x} \frac{1}{(n-2)!}y^{n-2}e^{-y}, \quad x,y>0.$$
Now $S_n=X_1+Y_{n-1}$ and $X_1=X_1$, so $(X_1,S_n)$ is a linear transform of the pair $(X_1, Y_{n-1})$. The inverse transform is $X_1=X_1$ and $Y_{n-1}=S_n-X_1$.
The joint pdf of $(X_1, Y_{n-1})$ can be transformed to the joint pdf of $(X_1,S_n)$ by $$ f_{X_1,S_n}(u,v) = f_{X_1,X_1+Y_{n-1}}(u,v) = f_{X_1,Y_{n-1}}(x(u,v),\;y(u,v))\;|J| = f_{X_1,Y_{n-1}}(u,\;v-u)\;|J|, $$ where $J$ is the Jacobian of the inverse transform $x=u$, $y=v-u$:
$$J=\begin{vmatrix} \dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v} \\ \dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v} \\ \end{vmatrix} =\begin{vmatrix} 1 & 0 \\ -1 & 1 \\ \end{vmatrix} = 1. $$
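The Jacobian computation above can be double-checked symbolically. A minimal sketch, assuming SymPy is available (the variable names mirror the notation in the derivation):

```python
import sympy as sp

u, v = sp.symbols('u v')
x = u          # inverse transform: x = u
y = v - u      #                    y = v - u

# Jacobian matrix of (x, y) with respect to (u, v)
J = sp.Matrix([[sp.diff(x, u), sp.diff(x, v)],
               [sp.diff(y, u), sp.diff(y, v)]])
print(J.det())  # 1, so |J| = 1
```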
Therefore,
$$ f_{X_1,S_n}(u,v) = f_{X_1,Y_{n-1}}(u,\;v-u) = e^{-u} \frac{1}{(n-2)!}(v-u)^{n-2}e^{-(v-u)} $$ $$ = \frac{1}{(n-2)!}(v-u)^{n-2}e^{-v} \quad \text{for } 0< u < v. $$ Next, consider the conditional pdf: $$ f_{X_1|S_n}(u|v) = \dfrac{f_{X_1,S_n}(u,v)}{f_{S_n}(v)} = \dfrac{\frac{1}{(n-2)!}(v-u)^{n-2}e^{-v}}{\frac{1}{(n-1)!}v^{n-1}e^{-v}} = (n-1)\left(1-\frac{u}{v}\right)^{n-2}\cdot \frac1v $$ for $0<u<v$. This conditional distribution is a scaled Beta distribution: if $X$ is Beta distributed with parameters $(1,n-1)$, then $Z=vX$ has the above pdf. It can also be described as a four-parameter Beta distribution with parameters $(1,n-1,0,v)$.
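As a numerical sanity check of the derived density, a minimal sketch assuming NumPy (the values $n=5$, $v=3$ and the grid size are arbitrary illustration choices): the density should integrate to $1$, and its mean should equal $v/n$, the mean of a $\mathrm{Beta}(1,n-1)$ variable scaled by $v$.

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule (written out to avoid the np.trapz/np.trapezoid rename)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

# Conditional pdf derived above: f(u|v) = (n-1) * (1 - u/v)^(n-2) / v on 0 < u < v.
n, v = 5, 3.0          # arbitrary illustration values
u = np.linspace(0.0, v, 200_001)
pdf = (n - 1) * (1.0 - u / v) ** (n - 2) / v

print(trapezoid(pdf, u))      # ≈ 1: a valid density
print(trapezoid(u * pdf, u))  # ≈ v/n = 0.6: the mean of v * Beta(1, n-1)
```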
If you are really after the conditional expectation rather than the conditional distribution, this route is unnecessarily complicated, and you should follow the approach of @Math1000.
Let $(X_n)$ be a sequence of i.i.d. random variables and put $S_n = \sum_{i=1}^n X_i$. Whatever the distribution of $X_1$, the $X_i$ are exchangeable, so $\mathbb E[X_j\mid S_n] = \mathbb E[X_1\mid S_n]$ for every $j$. By the definition of conditional expectation and linearity, $$ S_n = \mathbb E[S_n\mid S_n] = \sum_{j=1}^n\mathbb E[X_j\mid S_n] = n\,\mathbb E[X_1\mid S_n], $$ and hence $$ \mathbb E[X_1\mid S_n] = \frac 1n S_n. $$
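The identity $\mathbb E[X_1\mid S_n] = S_n/n$ can be illustrated by a quick Monte Carlo experiment. A minimal sketch assuming NumPy (the values of $n$, the window $v\pm\varepsilon$, and the seed are arbitrary choices): among samples whose sum lands in a narrow window around $v$, the average of $X_1$ should be close to $v/n$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000
x = rng.exponential(scale=1.0, size=(trials, n))  # rows of i.i.d. Exp(1) draws
s = x.sum(axis=1)                                 # S_n for each row

# Condition on S_n falling in a narrow window around v; within that slice,
# E[X_1 | S_n] = S_n / n predicts a sample mean of X_1 close to v/n.
v, eps = 3.0, 0.05
mask = np.abs(s - v) < eps
print(x[mask, 0].mean(), v / n)  # both close to 0.6
```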