On page 107 of the book 'The Concrete Tetrahedron' by Manuel Kauers and Peter Paule,
a hypergeometric probability distribution is given: consider an urn containing $N$ balls, $m$ green ones and $N - m$ blue ones. If we select a ball at random, we will clearly hit a green ball with probability $m/N$ and a blue ball with probability $(N - m)/N$. In general, if we select $n$ balls, then the probability of getting $k$ green balls is
\begin{gather}\frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}\end{gather} This much I understand. Next he gives the mean
\begin{gather}\sum_{k=0}^n k\frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}=\frac{nm}{N}\end{gather}
I understand why the mean is calculated that way, but I don't understand how he gets the right-hand side so easily, and in particular why he says Zeilberger's algorithm will do the last step for us. What is Zeilberger about it? I read his description of Zeilberger's algorithm earlier in the book, along with an example or two, but could hardly understand any of it. Perhaps an explanation in terms of this identity would help. Before Zeilberger, was there any other way to get the right-hand side in closed form for arbitrary $m, N, n$?
Next he calculates the variance
\begin{gather}\sum_{k=0}^n\left(k-\frac{nm}{N}\right)^2\frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}=\frac{mn(N-n)(N-m)}{N^2(N-1)}\end{gather} and again says that Zeilberger's algorithm saves us from having to perform any hand calculations. How does that work, and again, before Zeilberger was there any other way to get the right-hand side in closed form for arbitrary $m, N, n$?
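(Not that this answers the Zeilberger question, but both identities are easy to sanity-check with exact rational arithmetic. Here is a quick Python sketch; the function name `hyper_mean_var` is my own, not from the book.)

```python
from fractions import Fraction
from math import comb

def hyper_mean_var(N, m, n):
    """Mean and variance of the hypergeometric distribution,
    computed directly from the definition with exact rationals."""
    probs = [Fraction(comb(m, k) * comb(N - m, n - k), comb(N, n))
             for k in range(n + 1)]
    mean = sum(k * p for k, p in enumerate(probs))
    var = sum((k - mean) ** 2 * p for k, p in enumerate(probs))
    return mean, var

# Compare against the closed forms for a few parameter choices.
for (N, m, n) in [(10, 4, 3), (20, 7, 5), (52, 13, 6)]:
    mean, var = hyper_mean_var(N, m, n)
    assert mean == Fraction(n * m, N)
    assert var == Fraction(m * n * (N - n) * (N - m), N**2 * (N - 1))
```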
I'm not sure about Zeilberger's algorithm, but we can calculate the mean and variance as follows.
If $X$ is a hypergeometric random variable with the parameters defined as above then \begin{align*} E[X] &= \sum_{k=0}^n k\frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}\\ &= \sum_{k=1}^n k\frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}\\ &= \sum_{k=1}^n k\frac{\frac{m(m-1)!}{(m-k)!k(k-1)!}\binom{N-m}{n-k}}{\frac{N(N-1)!}{(N-n)!n(n-1)!}}\\ &= \sum_{k=1}^n n\frac{m}{N}\frac{\binom{m-1}{k-1}\binom{N-m}{n-k}}{\binom{N-1}{n-1}}\\ &= n\frac{m}{N}\sum_{k=1}^n\frac{\binom{m-1}{k-1}\binom{N-m}{n-k}}{\binom{N-1}{n-1}}\\ &= n\frac{m}{N}. \end{align*} The last equality follows from the summation equalling $1$ since it is the sum over all the probabilities.
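The key rewriting step above is $k\binom{m}{k}\big/\binom{N}{n} = \frac{nm}{N}\binom{m-1}{k-1}\big/\binom{N-1}{n-1}$ (the common factor $\binom{N-m}{n-k}$ cancels on both sides). It can be checked exactly term by term; a small Python sketch, with helper names of my own:

```python
from fractions import Fraction
from math import comb

def lhs(N, m, n, k):
    # k * C(m,k) / C(N,n): the summand before the rewrite
    return Fraction(k * comb(m, k), comb(N, n))

def rhs(N, m, n, k):
    # (nm/N) * C(m-1,k-1) / C(N-1,n-1): the summand after the rewrite
    return Fraction(n * m, N) * Fraction(comb(m - 1, k - 1), comb(N - 1, n - 1))

# The identity holds for every k >= 1.
N, m, n = 20, 7, 5
assert all(lhs(N, m, n, k) == rhs(N, m, n, k) for k in range(1, n + 1))
```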
Then $Var[X] = E[X^2] - E[X]^2 = E[X(X-1) + X] - E[X]^2 = E[X(X-1)] + E[X](1-E[X])$. Now \begin{align*} E[X(X-1)] &= \sum_{k=0}^n k(k-1)\frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}\\ &=\sum_{k=2}^n k(k-1)\frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}\\ &=\sum_{k=2}^n (k-1)\frac{m\binom{m-1}{k-1}\binom{N-m}{n-k}}{\binom{N}{n}}\\ &=\sum_{k=2}^n \frac{m(m-1)\binom{m-2}{k-2}\binom{N-m}{n-k}}{\binom{N}{n}}\\ &=\sum_{k=2}^n \frac{m(m-1)\binom{m-2}{k-2}\binom{N-m}{n-k}}{\frac{N}{n}\frac{N-1}{n-1}\binom{N-2}{n-2}}\\ &= \frac{m(m-1)n(n-1)}{N(N-1)} \end{align*} so \begin{align*}Var[X] &= \frac{m(m-1)n(n-1)}{N(N-1)} + \frac{mn}{N}\left(1- \frac{mn}{N}\right)\\ &= \frac{Nm(m-1)n(n-1) + mnN(N-1) - m^2n^2(N-1)}{N^2(N-1)}\\ &= \frac{mn(N(m-1)(n-1) + N(N-1) - mn(N-1))}{N^2(N-1)}\\ &= \frac{mn(N^2 - Nm - Nn + mn)}{N^2(N-1)}\\ &= \frac{mn(N-m)(N-n)}{N^2(N-1)}. \end{align*}
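The intermediate closed form $E[X(X-1)] = \frac{m(m-1)n(n-1)}{N(N-1)}$ used above can likewise be verified exactly for specific parameters; a small Python check (the function name is my own):

```python
from fractions import Fraction
from math import comb

def second_factorial_moment(N, m, n):
    """E[X(X-1)] computed directly from the hypergeometric pmf."""
    return sum(Fraction(k * (k - 1) * comb(m, k) * comb(N - m, n - k), comb(N, n))
               for k in range(n + 1))

# Matches the closed form m(m-1)n(n-1)/(N(N-1)) from the derivation.
for (N, m, n) in [(10, 4, 3), (20, 7, 5), (52, 13, 6)]:
    assert second_factorial_moment(N, m, n) == Fraction(m * (m - 1) * n * (n - 1), N * (N - 1))
```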