I'm teaching a measure theory class. I think one of the main motivations for the development of the Lebesgue integral is that the space $L^1(\mathbb{R})$ of integrable functions on $\mathbb{R}$ is complete under the $L^1$ norm. You can't get this if you stick to the Riemann integral.
For further motivation, I would like to find some interesting applications of this fact; preferably elementary.
For instance, is there some interesting function $f$ that one can construct by writing down a sequence of approximations $f_n$, showing that the sequence is $L^1$-Cauchy, and letting $f$ be its limit? Ideally, it would not be obvious how to construct $f$ otherwise.
I can think of lots of interesting applications of the completeness of $L^2$ (e.g. Fourier series) but not so much $L^1$.
[I have edited my answer. I don't think this is the best possible response but it is somewhat reasonable.]
Many motivate the Lebesgue integral over the Riemann integral by stating (loosely) that the former has "better limit theorems." That is, I think, a bit misguided. They have exactly the same limit theorems (from one point of view), except that the Lebesgue integral integrates more functions. Better yet, any limit theorem for the Riemann integral usually carries the extra assumption that the limit function is integrable, while the Lebesgue limit theorem delivers the integrability of the limit function as a conclusion.
The question, however, is to sell the Lebesgue integral on the basis that $L_1(\mathbb{R})$ is complete while $R_1(\mathbb{R})$ is not. This is similar to the way we sell the transition from $\mathbb{Q}$ to $\mathbb{R}$. So I have two pitches on this that might work for some students.
A. In a search for motivation it might be best to ask the master. Lebesgue said quite specifically that his motivation for defining and studying a generalization of the Riemann integral was an example, published by Volterra in 1881, of a bounded derivative that is not Riemann integrable.
If there is a differentiable Lipschitz function $F:[0,1]\to\mathbb{R}$ for which it is totally illegal to write $$\int_0^1 F'(t)\,dt = F(1)-F(0),$$ then something is very seriously wrong with your integration theory. No eighteenth-century mathematician would have hesitated to write this identity, and the Riemann integral forbids it!
To see what is happening with Volterra's example, consider the sequence of functions $$f_n(x)=\frac{F(x+1/n)-F(x)}{1/n},$$ which is uniformly bounded (by the Lipschitz constant of $F$) and converges pointwise to $F'$.
Now, according to the standard theory of the Lebesgue integral, any uniformly bounded sequence of functions on $[0,1]$ that converges pointwise is Cauchy in $L_1([0,1])$. So the sequence $\{f_n\}$ converges to a function $f$ in that space. Since it also converges pointwise to $F'$, we know that $f=F'$ almost everywhere. A direct computation then shows that $$F(1)-F(0)= \lim_{n\to\infty}\int_0^1 f_n(t)\,dt = \int_0^1 F'(t)\,dt. $$
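The direct computation can be sketched as follows (assuming, for convenience, that $F$ is extended past $1$ by the constant value $F(1)$ so that the shifted integral makes sense):

$$\int_0^1 f_n(t)\,dt = n\int_{1/n}^{1+1/n} F(s)\,ds - n\int_0^1 F(s)\,ds = n\int_1^{1+1/n} F(s)\,ds - n\int_0^{1/n} F(s)\,ds \longrightarrow F(1)-F(0),$$

since each term is the average of the continuous function $F$ over an interval of length $1/n$ shrinking to the point $1$, respectively $0$. Meanwhile $\int_0^1 f_n\,dt \to \int_0^1 F'\,dt$ because $f_n\to F'$ in $L_1([0,1])$.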
[Skeptical student says: It's a cheat. We didn't use the Cauchy sequence to find the function $f$, we were already given a construction of $F$ and $F'$ anyway. That's like giving a Cauchy sequence $\{q_n\}$ from $\mathbb{Q}$ that converges to $\sqrt2$. I already know $\sqrt2$ and I don't need a Cauchy sequence to tell me what it is.]
B. Try again. Let's give the following problem to some mathematicians from the 18th, 19th, and 20th centuries: suppose $F(x)=\sum_{k=1}^\infty f_k(x)$ converges uniformly on $[0,1]$, each $f_k$ is continuously differentiable, and the partial sums of $\sum_{k=1}^\infty f'_k$ form a Cauchy sequence in $L_1([0,1])$. What can we say about $F'$?
18th century mathematician: I don't know what "uniformly" means, and who is "Cauchy"? But the answer is clear using term-by-term differentiation, which has never failed me: $$F'(x) = \sum_{k=1}^\infty f'_k(x).$$
19th century mathematician: Nonsense. Every student knows that you simply must have uniform convergence of the differentiated series. Uniform convergence of the series $ \sum_{k=1}^\infty f_k(x)$ says nothing about either the pointwise convergence or uniform convergence of $ \sum_{k=1}^\infty f'_k(x).$
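[For the record, a standard example behind the 19th century mathematician's warning: the series $$\sum_{k=1}^\infty \frac{\sin(k^2x)}{k^2}$$ converges uniformly by the Weierstrass $M$-test, since its terms are bounded by $1/k^2$, yet the term-by-term differentiated series $$\sum_{k=1}^\infty \cos(k^2x)$$ fails even to converge pointwise; at $x=0$ it is $1+1+1+\cdots$.]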
20th century mathematician: Not at all. We have a Cauchy sequence of continuous functions in $L_1([0,1])$. That provides exactly the function
$$ \sum_{k=1}^\infty f'_k, $$ defined as an element of $L_1([0,1])$, which will prove to be my derivative $F'$. With a little more work (standard in our century) I can also prove that, at almost every point $x$, $$F'(x) = \sum_{k=1}^\infty f'_k(x). $$
So we have a nice differentiation formula in the space $L_1([0,1])$, $$F' = \sum_{k=1}^\infty f'_k, $$ as well as a pointwise almost-everywhere formula $$F'(x) = \sum_{k=1}^\infty f'_k(x) .$$
[Skeptical student says: Not impressed! I like the 18th century guy.]