Okay, I can more or less understand why the average slope of a curve is defined in this fashion: Let $f$ be a continuous function on an interval $[a,b]$. Then we define its average over $[a,b]$ to be the number $$ f_{\operatorname{avg}} := {1 \over b-a} \int_a^b f(t)\, dt $$ Let $y = F(x)$ be a curve. Fix a starting point $x_0$ and an ending point $x_1=x_0+\Delta x$. The slope at a point $z$ is the derivative $F'(z)$, so by the fundamental theorem of calculus the "average slope" should mean \begin{equation*} (F')_{\operatorname{avg}} = {1 \over x_1 - x_0}\int_{x_0}^{x_1} F'(t)\, dt = {F(x_1) - F(x_0) \over x_1-x_0}. \end{equation*}
To summarize: "Avg. slope = sum all slopes of all tangent lines over an interval $[a,b]$ and divide by the number of slopes over that same interval." But the right side of the equation somehow bothers me. I don't see intuitively how the average slope (the sum of all slopes divided by the number of slopes) equals the slope of the function $F(x)$ between the two endpoints $x_0$ and $x_1$.
The right side is the slope of the secant line drawn between $(x_0,F(x_0))$ and $(x_1,F(x_1))$. I would take that as the definition of average slope. Averaging an infinite number of local slopes is more problematic, whereas the right side depends on only two values of the function.
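You can check this numerically: sample the local slope $F'$ at many points in the interval and take the ordinary arithmetic mean; it matches the secant slope to high precision. A minimal sketch, using $F(x)=\sin x$ on $[0,2]$ as an assumed example function:

```python
import math

# Example (not from the original post): F(x) = sin(x), so F'(x) = cos(x).
F = math.sin
dF = math.cos

x0, x1 = 0.0, 2.0
n = 100_000  # number of sampled tangent slopes

# Arithmetic mean of n sampled slopes, taken at midpoints of equal subintervals.
# This is exactly a Riemann sum for (1/(x1-x0)) * integral of F' over [x0, x1].
avg_slope = sum(dF(x0 + (k + 0.5) * (x1 - x0) / n) for k in range(n)) / n

# Slope of the secant line through the two endpoints.
secant_slope = (F(x1) - F(x0)) / (x1 - x0)

print(avg_slope, secant_slope)  # the two agree to many decimal places
```

The "sum of slopes divided by number of slopes" phrasing is literally what `avg_slope` computes; as $n \to \infty$ it converges to the integral average, which the fundamental theorem of calculus collapses to the two-point secant slope.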