This shouldn't be too hard, but I'm stuck. Suppose $f$ is a concave function on the interval $[a,b]$, meaning $$\lambda f(x) + (1-\lambda) f(y) \leq f(\lambda x + (1-\lambda) y)$$ for every $x,y \in [a,b]$ and every $\lambda \in [0,1]$. I want to prove that for any $p, q, r \in [a,b]$ with $q \geq r \geq 0$ we have: $$f(p + q) + f(p - q) \leq f(p + r) + f(p - r)$$
This inequality comes up in a paper that I'm reading on random walks, and in that context the function $f$ is piecewise linear. So I'm not willing to assume that $f$ is differentiable, but piecewise smooth is fine if it helps (I don't think it will).
First, let's find the specific $\lambda$ that "averages" the endpoints $p\pm q$ to give $p\pm r$, i.e. solve for $\lambda$:
\begin{align} \lambda (p+q) + (1-\lambda)(p-q) = p \pm r \\ \implies \lambda_{\pm} = \frac {q \pm r}{2q} \tag{1} \end{align}
Now, since $q\geq r\geq 0$ implies $p \pm r\in[p-q,\,p+q]$ (so $\lambda_\pm \in [0,1]$), concavity gives
\begin{align} f(p\pm r) &= f( \lambda_\pm(p+q) + (1-\lambda_\pm)(p-q) )\\ &\geq \lambda_\pm f(p+q) + (1-\lambda_\pm)f(p-q) \\ \end{align}
Summing these and substituting the values in $(1)$,
\begin{align} f(p+r)+f(p-r) &\geq (\lambda_++\lambda_-)f(p+q) + (2-\lambda_+-\lambda_-)f(p-q) \\ &= f(p+q) + f(p-q), \end{align}
since $\lambda_+ + \lambda_- = \frac{q+r}{2q} + \frac{q-r}{2q} = 1$.
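As a sanity check (not part of the proof), here is a quick numerical test. The function `f` below is a hypothetical example of a piecewise-linear concave function, built as a minimum of affine functions; the sampling ensures $q \geq r \geq 0$:

```python
import random

def f(x):
    # Concave and piecewise linear: a pointwise minimum of affine functions.
    return min(2 * x + 1, -x + 4, 0.5 * x + 3)

random.seed(0)
for _ in range(10_000):
    p = random.uniform(-5, 5)
    q = random.uniform(0, 5)
    r = random.uniform(0, q)  # guarantees q >= r >= 0
    # The claimed inequality (with a tiny tolerance for float rounding):
    assert f(p + q) + f(p - q) <= f(p + r) + f(p - r) + 1e-9
print("ok: inequality held on all samples")
```

Any other concave `f` (e.g. `-abs(x)`) works equally well here, since the argument above never used more than concavity.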
**Aside**
You shouldn't be assuming $p,q,r\in[a,b]$, but rather that $p\pm q\in[a,b]$; otherwise $f(p+q)$ need not even be defined. For example, with $[a,b]=[0,1]$ and $p=q=1$, all of $p,q,r$ lie in $[a,b]$, yet $p+q=2$ does not.