I was asked to use R or another programming language to plot $\sum_{j=0}^{k} P(X \geq j),$ where $P(X\geq x) = \frac{1}{1+x},$ as a function of $k.$ What happens when $k$ gets large?
I have no idea how to plot this in R. I'm assuming that as $k$ gets large, the probability function goes to 0, since the denominator is increasing?
Let $Q_k = \sum_{j=0}^k \frac{1}{j+1}= \frac{1}{0+1} + \frac{1}{1+1} + \cdots + \frac{1}{k+1}.$ Then I suppose you are intended to plot $Q_k$ against $k,$ for a few dozen values of $k.$
The proper mathematical answer as to what happens with increasing $k$ is given in the Comment of @Winther, but this seems to be an elementary computational or programming task. I don't see what you are expected to learn about probability from doing this particular exercise, but R can be very useful in a probability course. (At least plotting 100 points shows that $Q_k$ does not go to 0 with increasing $k$.)
Here is sample R code written at the most fundamental level I can manage. (There are more clever ways to program this that take better advantage of the structure of R. Modify as appropriate to your level.)
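A minimal loop-based sketch along these lines (the names `k.max` and `Q` are my own choices):

```r
# Compute Q_k = 1/(0+1) + 1/(1+1) + ... + 1/(k+1) for k = 0, ..., 100
k.max <- 100
Q <- numeric(k.max + 1)          # Q[i] holds Q_{i-1}, since R vectors are 1-indexed
Q[1] <- 1                        # Q_0 = 1/(0+1) = 1
for (k in 1:k.max) {
  Q[k + 1] <- Q[k] + 1/(k + 1)   # add the next term 1/(k+1)
}
plot(0:k.max, Q, xlab = "k", ylab = expression(Q[k]))
```

The plot makes the point of the exercise visible: the values keep creeping upward (these are partial sums of the harmonic series), rather than tending to 0.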
Table of the first few values:
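One compact way to produce such a table is with `cumsum`, which computes all the partial sums at once (a more idiomatic alternative to the explicit loop above):

```r
# First few values of Q_k as a two-column table
k <- 0:9
Q <- cumsum(1/(k + 1))        # Q_k = sum of 1/(j+1) for j = 0, ..., k
round(cbind(k, Q.k = Q), 4)
```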