I have $f(1)=0$ and $f'(x)=1/x$. Considering the Taylor series of $f(1+x)$ centered at $x=0$, I need to show that it converges to $f(1+x)$.
I got $f(1+x) = \sum (-1)^{n}x$. I don't know what to do from here, and I am not sure what "the Taylor series of $f(1+x)$ converges to $f(1+x)$" means.
You say:

> I got $f(1+x) = \sum (-1)^{n}x$.

I am guessing that is not what you have done. From what you know about $f(1+x)$ — namely $f(1)=0$ and $f'(x)=1/x$ — you can build a related series, known as the Taylor series centered at $0$ (or Maclaurin series for short). If you do this correctly, you get $$\sum_{n=1}^{\infty}(-1)^{n-1}\frac{x^n}{n}$$

But you do not yet have the right to say that $f(1+x)$ *equals* this series; in fact, for some $x$ it doesn't. For $x$-values in the appropriate window, which turns out to be $(-1,1]$, the series does converge to the same value that $f(1+x)$ takes. This is what you are being asked to prove: that for $x\in(-1,1]$, $\sum_{n=1}^{\infty}(-1)^{n-1}\frac{x^n}{n}$ converges to $f(1+x)$. Once we all understand this, we can be loose and write $f(1+x)=\sum_{n=1}^{\infty}(-1)^{n-1}\frac{x^n}{n}$ for $x\in(-1,1]$.
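As a quick sanity check (not part of the proof), you can compare partial sums of the series against the function itself: since $f(1)=0$ and $f'(x)=1/x$ force $f(x)=\ln x$, we have $f(1+x)=\ln(1+x)$ and can compare with `math.log`. The helper name `partial_sum` below is my own:

```python
import math

def partial_sum(x, n_terms):
    """Partial sum of sum_{n=1}^{N} (-1)^{n-1} x^n / n."""
    return sum((-1) ** (n - 1) * x ** n / n for n in range(1, n_terms + 1))

# Inside (-1, 1) the partial sums close in on ln(1+x) quickly;
# at the endpoint x = 1 convergence is much slower (alternating harmonic series).
for x in (0.5, -0.5, 1.0):
    print(x, partial_sum(x, 10000), math.log(1 + x))
```

Note how the gap at $x=1$ shrinks only like $1/N$, which is a hint that the endpoint needs a separate, more careful argument.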
The way to prove such things is to understand and use Taylor's inequality.
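For reference, here is the standard statement of Taylor's inequality, specialized in the comment to $g(x)=f(1+x)$ (this is the general tool, not a full proof for your problem):

```latex
% Taylor's inequality: if |g^{(n+1)}(t)| \le M for all t between 0 and x,
% then the remainder after the degree-n Taylor polynomial at 0 satisfies
\[
  |R_n(x)| \;=\; \Bigl| g(x) - \sum_{k=0}^{n} \frac{g^{(k)}(0)}{k!}\,x^k \Bigr|
  \;\le\; \frac{M}{(n+1)!}\,|x|^{n+1}.
\]
% Here g(x) = f(1+x), so g'(x) = 1/(1+x) and, for k \ge 1,
% g^{(k)}(x) = (-1)^{k-1}\,(k-1)!/(1+x)^{k}.
```

With these derivatives, for $x\in[0,1]$ the bound gives $|R_n(x)|\le x^{n+1}/(n+1)\to 0$. Be warned that for $x\in(-1,0)$ the Lagrange-style bound degrades near $-1$, and one typically switches to the integral form of the remainder (or a similar refinement) to cover the whole interval.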