I was reading the proof of this theorem at http://ocw.mit.edu/courses/mathematics/18-443-statistics-for-applications-fall-2003/lecture-notes/lec23.pdf
They show that $\frac{v_j-np_j}{\sqrt{np_j}} \stackrel{D}{\longrightarrow} N(0,1-p_j)$. However, I don't understand why
$\sum_{j=1}^r \frac{(v_j-np_j)^2}{np_j} \stackrel{D}{\longrightarrow} \sum_{i=1}^r Z_i^2$
holds.
I know that if $X_n \stackrel{D}{\longrightarrow} X$, then $f(X_n) \stackrel{D}{\longrightarrow} f(X)$ for every continuous function $f$ (the continuous mapping theorem), so $\frac{(v_j-np_j)^2}{np_j} \stackrel{D}{\longrightarrow} Z_j^2$ for each $j$. But I also know that $X_n \stackrel{D}{\longrightarrow} X$ and $Y_n \stackrel{D}{\longrightarrow} Y$ do not in general imply $X_n+Y_n \stackrel{D}{\longrightarrow} X+Y$.
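As a sanity check (my own simulation, not from the lecture notes — the sample size, cell probabilities, and repetition count below are arbitrary choices), Pearson's theorem says this sum converges to a $\chi^2_{r-1}$ distribution, and a quick NumPy experiment agrees with that:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 2000                         # observations per experiment (arbitrary)
p = np.array([0.2, 0.3, 0.5])    # cell probabilities (arbitrary)
r = len(p)
reps = 20000                     # number of repeated experiments

# Draw multinomial counts v_1, ..., v_r and form Pearson's statistic
# sum_j (v_j - n p_j)^2 / (n p_j) for each repetition.
v = rng.multinomial(n, p, size=reps)            # shape (reps, r)
stat = ((v - n * p) ** 2 / (n * p)).sum(axis=1)

# A chi-square distribution with r-1 degrees of freedom has
# mean r-1 and variance 2(r-1); the empirical moments should be close.
print(stat.mean())   # should be near r - 1 = 2
print(stat.var())    # should be near 2(r - 1) = 4
```

The empirical mean and variance land near $r-1$ and $2(r-1)$, consistent with a $\chi^2_{r-1}$ limit — which is exactly what I can't derive from the marginal convergence alone.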
If $X_n$ and $Y_n$ are independent for each $n$ (and the limits $X$ and $Y$ are taken independent), then it is true. By independence, the characteristic function of $X_n + Y_n$ factors: $E[e^{iu(X_n + Y_n)}] = E[e^{iuX_n}]\,E[e^{iuY_n}]$.
Since $X_n \stackrel{D}{\longrightarrow} X$ and $Y_n \stackrel{D}{\longrightarrow} Y$, their characteristic functions converge pointwise (Lévy's continuity theorem), so
$$\lim_{n \to \infty} E[e^{iu(X_n + Y_n)}] = E[e^{iuX}]\,E[e^{iuY}] = E[e^{iu(X + Y)}],$$
where the last equality uses the independence of $X$ and $Y$.
Since the characteristic function of $X_n + Y_n$ converges pointwise to the characteristic function of $X + Y$, Lévy's continuity theorem lets you conclude that $$X_n + Y_n \stackrel{D}{\longrightarrow} X+Y$$
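To see this numerically, here is a small sketch with an example of my own choosing: $X_n$ and $Y_n$ are independent standardized $\mathrm{Binomial}(n, 1/2)$ variables, so each converges in distribution to $N(0,1)$, and the argument above gives $X_n + Y_n \stackrel{D}{\longrightarrow} N(0,2)$. The empirical characteristic function of the sum can be compared directly against $e^{-u^2}$, the characteristic function of $N(0,2)$:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 5000        # "large n" in the sequences X_n, Y_n (arbitrary)
reps = 100000   # Monte Carlo sample size
p = 0.5

# Independent standardized binomials: (B - np) / sqrt(np(1-p)) -> N(0,1)
scale = np.sqrt(n * p * (1 - p))
X = (rng.binomial(n, p, reps) - n * p) / scale
Y = (rng.binomial(n, p, reps) - n * p) / scale
S = X + Y

# If S is approximately N(0,2), then E[exp(iuS)] should be close
# to exp(-u^2 * 2 / 2) = exp(-u^2).
u = 1.0
emp_cf = np.exp(1j * u * S).mean()
print(S.mean(), S.var())               # near 0 and 2
print(abs(emp_cf - np.exp(-u**2)))     # near 0
```

The empirical characteristic function at $u=1$ matches $e^{-1}$ up to Monte Carlo noise, mirroring the factorization step in the proof.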