I have to compute the following (joint) MGFs for random variables (vectors) with the following (joint) PMFs, but I have no clue how this works. I have no background in probability theory, and reading the theory did not get me far.
1) $f_X(k) = \frac{1}{k(k+1)}$, $k \ge 1$
$E[e^{tX}] = \sum_{k=1}^{\infty} \frac{e^{kt}}{k(k+1)} = 1 + (e^{-t}-1)\log(1-e^t)$ for $t < 0$, by writing $\frac{1}{k(k+1)} = \frac{1}{k} - \frac{1}{k+1}$ and using the Taylor series of $\log(1-x)$
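As a numerical sanity check, the closed form (equivalently $1+(e^{-t}-1)\log(1-e^t)$, which is the same expression after expanding) can be compared against partial sums of the series. A small Python sketch; the $t$ values are arbitrary choices:

```python
import math

def mgf_series(t, terms=500):
    # Partial sum of E[e^{tX}] = sum_{k>=1} e^{kt} / (k(k+1)); converges for t < 0
    return sum(math.exp(k * t) / (k * (k + 1)) for k in range(1, terms + 1))

def mgf_closed(t):
    # Closed form 1 + (e^{-t} - 1) * log(1 - e^t), valid for t < 0
    return 1 + (math.exp(-t) - 1) * math.log(1 - math.exp(t))

for t in (-0.5, -1.0, -2.0):
    assert abs(mgf_series(t) - mgf_closed(t)) < 1e-9
```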
2) $f_X(k) = \binom{n+k-1}{k} p^n (1-p)^k$, $k \ge 0$
3) $f_{X,Y}(i, k) = (e-1)\,e^{-(2k+1)}\,k^i/i!$, $i, k \ge 0$
4) $f_{X,Y}(i, k) = (1-\alpha)(\beta-\alpha)\,\alpha^i \beta^{k-i-1}$, $0 \le k \le i$, where $0 < \alpha < \min\{1, \beta\}$
Any help would be appreciated. Thanks in advance.
Hint on 2)
If a coin is tossed repeatedly with probability $p$ of landing heads, and $X$ denotes the number of tails that show up before the $n$-th head, then $X$ has PMF $f_X$.
Now be aware that we can write:$$X=X_1+X_2+\cdots+X_n$$where $X_1$ stands for the number of tails that show up before the first head, $X_2$ for the number of tails that show up between the first and the second head, et cetera.
Then the $X_i$ are iid with the geometric distribution $P(X_i=k)=p(1-p)^k$, $k \ge 0$.
Now if $g(t)$ denotes the MGF of $X_1$, then since the $X_i$ are independent we can find the MGF of $X$ from:$$\mathbb Ee^{tX}=\mathbb Ee^{tX_1+\cdots+tX_n}=\prod_{i=1}^n\mathbb Ee^{tX_i}=\prod_{i=1}^n g(t)=g(t)^n$$
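For completeness (this step is not in the hint, but it closes the loop): summing the geometric series gives, for $(1-p)e^t<1$,$$g(t)=\mathbb Ee^{tX_1}=\sum_{k=0}^{\infty}e^{tk}\,p(1-p)^k=\frac{p}{1-(1-p)e^t},\qquad\text{so}\qquad\mathbb Ee^{tX}=\left(\frac{p}{1-(1-p)e^t}\right)^n.$$

This identity can be checked numerically against the PMF in 2); a Python sketch, where the values of $n$, $p$, $t$ are arbitrary choices:

```python
import math

def nbinom_mgf_direct(t, n, p, terms=1000):
    # E[e^{tX}] summed directly from f_X(k) = C(n+k-1, k) p^n (1-p)^k
    return sum(math.comb(n + k - 1, k) * p**n * (1 - p)**k * math.exp(t * k)
               for k in range(terms))

def geom_mgf(t, p):
    # MGF of one geometric block X_i with P(X_i = k) = p(1-p)^k,
    # valid while (1 - p) * e^t < 1
    return p / (1 - (1 - p) * math.exp(t))

n, p, t = 3, 0.4, -0.2
assert abs(nbinom_mgf_direct(t, n, p) - geom_mgf(t, p)**n) < 1e-9
```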