I have a question about the convergence of infinite products.
In *Real Mathematical Analysis*, page 198, question 63,
part a) sets $a_k = (-1)^k/\sqrt{k}$, and we need to show that the series $\sum\limits_{k=1}^\infty a_k$ converges but the product $\prod\limits_{k=1}^\infty(1+a_k)$ diverges.
The convergence of the series follows from the alternating series test. For the product part, though: when $k=1$, $a_1 = -1$, so $1+a_1 = 0$, which means every partial product (and hence the infinite product itself) is $0$. Does this zero value count as divergence here?
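As a side check of my own (not from the textbook): under the usual convention, an infinite product is said to converge only if the tail formed by its nonzero factors tends to a *nonzero* limit, so one can set the $k=1$ factor aside and look at the partial products from $k=2$ on. A quick numerical sketch suggests those drift to $0$ as well, consistent with $\log(1+a_k) \approx a_k - \tfrac{1}{2k}$, whose $-\tfrac{1}{2k}$ part sums to $-\infty$:

```python
import math

def partial_product_a(n):
    """Partial product of (1 + (-1)^k / sqrt(k)) for k = 2..n,
    skipping the k = 1 factor, which is exactly 0."""
    p = 1.0
    for k in range(2, n + 1):
        p *= 1.0 + (-1) ** k / math.sqrt(k)
    return p

# Partial products shrink toward 0 as n grows.
for n in (10, 100, 1000, 10000, 100000):
    print(n, partial_product_a(n))
```

So even ignoring the zero factor, the tail product tends to $0$, which still counts as divergence under that convention.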
part b) uses $b_k = e_k/k + (-1)^k/\sqrt{k}$, where $e_k$ is $0$ when $k$ is odd and $1$ when $k$ is even. If this is how $b_k$ is defined, then the infinite product $\prod\limits_{k=1}^\infty(1+b_k)$ will also be $0$: when $k=1$, $e_1 = 0$, so $b_1 = -1$ and $1+b_1 = 0$. But part b) asks us to show that the series diverges while the infinite product converges.
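Again ignoring the zero factor at $k=1$, here is a numerical sketch of my own (not part of the exercise) of the partial sums and the partial products taken from $k=2$. It is consistent with the claimed behaviour: the partial sums of $b_k$ keep growing (the $\sum e_k/k$ part is a harmonic-type series), while the partial products appear to settle toward a nonzero limit:

```python
import math

def e(k):
    # e_k = 0 for odd k, 1 for even k, as defined in the problem.
    return 1 if k % 2 == 0 else 0

def b(k):
    return e(k) / k + (-1) ** k / math.sqrt(k)

def partial_sum_b(n):
    return sum(b(k) for k in range(1, n + 1))

def partial_product_b(n):
    # Skip the k = 1 factor, which is 1 + b_1 = 0.
    p = 1.0
    for k in range(2, n + 1):
        p *= 1.0 + b(k)
    return p

# Sums grow without bound; products level off.
for n in (100, 1000, 10000, 100000):
    print(n, partial_sum_b(n), partial_product_b(n))
```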
So can someone tell me where my mistake is? Thanks!
Here is the question from the textbook.
