Probability generating function and a discrete random variable


A discrete random variable $X$ has probability generating function $G_X(t)$. If $Y=aX+b$, show that the probability generating function of $Y$ is given by $G_Y(t)=t^bG_X(t^a)$. Hence prove that $E(Y)=aE(X)+b$ and that $\operatorname{Var}(Y)=a^2\operatorname{Var}(X)$.

I'm currently learning probability generating functions and I am confused with this question. I've attempted to express $G_Y(t)$ using its definition, but I'm not sure where to go from there. I'm not even sure I know how to approach this question. I appreciate any help- thank you!



Recall that $$G_X(t) = \operatorname{E}[t^X].$$ Thus $$G_Y(t) = \operatorname{E}[t^{aX+b}] = \operatorname{E}[(t^a)^X t^b] = t^b \operatorname{E}[(t^a)^X] = t^b G_X(t^a),$$ as claimed, since $t^b$ involves no randomness and is therefore a constant with respect to the expectation. For the second part, how is the probability generating function related to the moments of a random variable?
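To spell out that hint (a sketch of the standard route): differentiate the identity and evaluate at $t=1$, using $G_X(1)=1$, $G_X'(1)=E(X)$ and $G_X''(1)=E\big(X(X-1)\big)$. First, $$G_Y'(t)=bt^{b-1}G_X(t^a)+at^{a+b-1}G_X'(t^a),$$ so $$E(Y)=G_Y'(1)=b+aG_X'(1)=aE(X)+b.$$ Differentiating once more and setting $t=1$ gives $$G_Y''(1)=b(b-1)+a(a+2b-1)E(X)+a^2E\big(X(X-1)\big),$$ and substituting into $\operatorname{Var}(Y)=G_Y''(1)+G_Y'(1)-\big(G_Y'(1)\big)^2$ with $G_Y'(1)=aE(X)+b$, every term involving $b$ cancels, leaving $$\operatorname{Var}(Y)=a^2\big(E(X^2)-E(X)^2\big)=a^2\operatorname{Var}(X).$$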


(Stumbled upon this question by chance. Apologies for bringing up this again.)
We define the probability generating function of $X$ as $G_X(t)=\sum_{k=0}^{\infty}P(X=k)t^k$, and likewise $G_Y(t)=\sum_{k=0}^{\infty}P(Y=k)t^k$. Assume $a$ and $b$ are non-negative integers, so that $Y=aX+b$ is again supported on the non-negative integers, namely $Y\in \{b, a+b, 2a+b,\ldots\}$. Then $$\begin{align} G_Y(t) & =\sum_{k=0}^{\infty}P(Y=k)t^k =\sum_{n=0}^{\infty}P(Y=a n +b)t^{a n +b}\\ & =\sum_{n =0}^{\infty}P(X=n)t^{a n +b} = t^{b}\sum_{n =0}^{\infty} P(X=n)\left(t^{a}\right)^{n} = t^{b}G_X(t^{a}). \end{align}$$
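As a quick numerical sanity check (not a proof), the identity and the two moment formulas can be verified for a concrete distribution. The choice $X\sim\operatorname{Binomial}(3,0.4)$ with $a=2$, $b=1$ below is arbitrary:

```python
import math

# Toy example: X ~ Binomial(3, 0.4), Y = a*X + b with a = 2, b = 1
n, p = 3, 0.4
pmf = {k: math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
a, b = 2, 1

def G_X(t):
    # G_X(t) = sum_k P(X = k) t^k
    return sum(pk * t**k for k, pk in pmf.items())

def G_Y(t):
    # Y takes the value a*k + b with probability P(X = k)
    return sum(pk * t**(a * k + b) for k, pk in pmf.items())

# The identity G_Y(t) = t^b * G_X(t^a) should hold for every t
for t in [0.3, 0.7, 1.0, 1.5]:
    assert abs(G_Y(t) - t**b * G_X(t**a)) < 1e-12

# Moments of the binomial: E(X) = np, Var(X) = np(1-p)
EX, VarX = n * p, n * p * (1 - p)
EY, VarY = a * EX + b, a**2 * VarX

# Check E(Y) and Var(Y) directly against the pmf of Y
EY_direct = sum((a * k + b) * pk for k, pk in pmf.items())
VarY_direct = sum((a * k + b) ** 2 * pk for k, pk in pmf.items()) - EY_direct**2
assert abs(EY - EY_direct) < 1e-12
assert abs(VarY - VarY_direct) < 1e-12
```

Every check passes, consistent with $E(Y)=aE(X)+b$ and $\operatorname{Var}(Y)=a^2\operatorname{Var}(X)$.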