I'm trying to show
$$\int_0^1 f(x)\;dx=0$$ where $f(x)=1$ if $x=\frac{1}{n}$ for some $n\in\mathbb{N}^+$, and $f(x)=0$ otherwise. Here is my attempt:
Proceed by induction. For the base case, $\int_{1}^1 f(x)\;dx=0$ trivially. Now suppose $\int_{\frac{1}{n}}^1f(x)\;dx=0$. Then $\int_{\frac{1}{n+1}}^1f(x)\;dx=\int_{\frac{1}{n+1}}^{\frac{1}{n}}f(x)\;dx+\int_{\frac{1}{n}}^1f(x)\;dx=\int_{\frac{1}{n+1}}^{\frac{1}{n}}f(x)\;dx$ by the induction hypothesis. Now consider a partition $P$ of $\Big[\frac{1}{n+1},\frac{1}{n}\Big]$. Since $f$ is nonzero only at the two endpoints of this interval, $f$ is nonzero on at most two subintervals of $P$, and each such subinterval contributes at most $1\cdot||P||$ to any Riemann sum. Let $\epsilon>0$ and $\delta=\epsilon/2$. Then $$||P||<\delta\implies|0-R(f,P)|=R(f,P)\leq 2\cdot||P||<2\delta=\epsilon.$$ Hence, $\int_{\frac{1}{n+1}}^{\frac{1}{n}}f(x)\;dx=0$.
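As a sanity check (not part of the proof), here is a short Python sketch, with helper names of my own choosing, that computes the upper Darboux sum of $f$ on $[\frac{1}{n+1},\frac{1}{n}]$ exactly with rational arithmetic. Only the two subintervals containing the endpoints can contribute, which matches the bound $R(f,P)\leq 2\cdot||P||$:

```python
import math
from fractions import Fraction

def contains_reciprocal(lo, hi):
    """Does [lo, hi], with 0 < lo <= hi, contain 1/k for a positive
    integer k?  Equivalently: is there an integer k with 1/hi <= k <= 1/lo?"""
    return math.floor(1 / lo) >= math.ceil(1 / hi)

def upper_darboux(a, b, m):
    """Upper Darboux sum of f on [a, b] (with 0 < a < b) over m equal
    subintervals: sup f = 1 on a subinterval iff it contains some 1/k."""
    h = (b - a) / m
    return sum(h for i in range(m)
               if contains_reciprocal(a + i * h, a + (i + 1) * h))
```

For example, with $a=\frac14$, $b=\frac13$ and any fine enough partition, only the two subintervals containing $\frac14$ and $\frac13$ contribute, so the upper sum is exactly $2\cdot||P||$ and shrinks with the mesh.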
Now here is the part I'm not sure about. Can I then say
$$\int_0^1 f(x)\;dx=\int_0^0 f(x)\;dx + \sum_{n=1}^\infty\int_{\frac{1}{n+1}}^{\frac{1}{n}}f(x)\;dx=0$$ since each $\int_{\frac{1}{n+1}}^{\frac{1}{n}}f(x)\;dx=0$?
As you suspect, interchanging series and (Riemann) integrals is a delicate matter. Let me suggest a safer and shorter alternative.
You have correctly proven that for all $n \in \Bbb N \setminus \{0\}$
$$\int \limits _{\frac 1 n} ^1 f(x) \ \Bbb d x = 0 .$$
Keeping in mind that $f \le 1$ and that $\int _a ^b f \le \int _a ^b g$ whenever $f \le g$ on $[a,b]$, we may then write that
$$\int \limits _0 ^1 f(x) \ \Bbb d x = \lim _{n \to \infty} \int \limits _0 ^1 f(x) \ \Bbb d x = \lim _{n \to \infty} \Bigg( \int \limits _0 ^{\frac 1 n} f(x) \ \Bbb d x + \underbrace{ \int \limits _{\frac 1 n} ^1 f(x) \ \Bbb d x } _{=0} \Bigg) = \lim _{n \to \infty} \int \limits _0 ^{\frac 1 n} f(x) \ \Bbb d x \le \\ \lim _{n \to \infty} \int \limits _0 ^{\frac 1 n} 1 \ \Bbb d x = \lim _{n \to \infty} \frac 1 n = 0 .$$
Since $f \ge 0$, we also have $\int _0 ^1 f(x) \ \Bbb d x \ge 0$, so the integral is exactly $0$.
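This squeeze can also be illustrated numerically (an illustration only, with a helper name of my own): the upper Darboux sums of $f$ over all of $[0,1]$ shrink as the partition is refined, consistent with $\int_0^1 f = 0$. Note that the first subinterval $[0,h]$ always contributes $h$, since it contains $\frac 1 k$ for every $k \ge \frac 1 h$:

```python
import math

def upper_darboux_01(m):
    """Upper Darboux sum over [0, 1] of f (f(x) = 1 at x = 1/k, else 0),
    using m equal subintervals of width h = 1/m.

    [0, h] contains 1/k for every k >= m, so it always contributes h.
    A subinterval [lo, hi] with lo > 0 contributes h iff it contains
    some 1/k, i.e. iff floor(1/lo) >= ceil(1/hi)."""
    h = 1.0 / m
    total = h  # the first subinterval [0, h]
    for i in range(1, m):
        lo, hi = i * h, (i + 1) * h
        if math.floor(1 / lo) >= math.ceil(1 / hi):
            total += h
    return total
```

Refining from $100$ to $10000$ subintervals drives the upper sum toward $0$ (on the order of $\sqrt h$, since only the subintervals in $[0,\sqrt h\,]$ plus roughly $1/\sqrt h$ isolated reciprocals can contribute).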