I'm not an expert in operator theory (so I'll be rather informal, sorry), but I would like some advice about a problem I have. Let $f \in C^{\infty}(\mathbb{R})$ be such that $f(x) > 0$, $f^{(1)}(x) < 0$, $f^{(2)}(x) > 0$ for all $x \in \mathbb{R}$. Let also $\left\{ x_i \right\}_{i \in \mathbb{Z}}$ be a sequence such that $x_{i+1} - x_{i} = h$. I define the following difference equation:
$$ \beta_{i+2} + 4\beta_{i+1} + \beta_{i} = 6 f(x_i) $$
I'm not interested in the specific solution of the equation, but in some of its properties. In particular, I'm trying to study the relationship between $\beta_{i+1}$ and $\beta_i$, and more specifically the sign of $\Delta \beta_i = \beta_{i+1} - \beta_{i}$. If I define $E^k \beta_i = \beta_{i+k}$ (which implies $\Delta = E - I$), I can rewrite the original equation as
$$ \begin{multline} \left( E^2 + 4 E + I \right) \beta_i = 6 f(x_i) \Rightarrow \beta_i = 6 \left( E^2 + 4 E + I \right)^{-1} f(x_i) \Rightarrow \\ \Delta \beta_i = 6 (E - I)(E^2 + 4E + I)^{-1} f(x_i) \end{multline} $$
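The shift calculus can be made concrete on a finite array; here is a minimal sketch (the sample sequence $\beta_i = i^2$ is an arbitrary illustrative choice, not part of the problem):

```python
# Tiny sketch of the shift calculus: E^k beta_i = beta_{i+k}, Delta = E - I.
# The sequence beta_i = i^2 is an arbitrary illustrative choice.
beta = [i ** 2 for i in range(10)]

def E(i, k=1):
    """Shift operator: E^k applied to beta at index i."""
    return beta[i + k]

def delta(i):
    """Forward difference Delta = E - I."""
    return E(i) - beta[i]

print(delta(3))  # beta_4 - beta_3 = 16 - 9 = 7
```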
I read in a book a while ago that, under some conditions, it is possible to define a bijection between this kind of operator and rational functions of a complex variable, so I've tried to reduce my problem to the study of
$$ g(E) = 6 (E - I)(E^2 + 4E + I)^{-1} \leftrightarrow g(z) = \frac{6 (z - 1)}{(z^2 + 4z + 1)}. $$
Decomposing $g(z)$ into partial fractions, I observe that
$$ \begin{multline} g(z) = \frac{A}{z-z_0} + \frac{B}{z - z_1} \leftrightarrow g(E) = \\ A\left(E-z_0 I\right)^{-1} + B\left(E - z_1 I\right)^{-1} = \\ A'\left(I - c_0 E\right)^{-1} + B' \left(I - c_1 E \right)^{-1} \end{multline} $$ where $c_0 = 1/z_0$, $c_1 = 1/z_1$, $A' = -A/z_0$ and $B' = -B/z_1$.
This would mean that $$ \begin{multline} \Delta \beta_i = \sum_{j=0}^{+\infty} \left( A' c_0^j + B' c_1^j \right)E^j f(x_i) \Rightarrow \\ \Delta \beta_i = \sum_{j=0}^{+\infty} \left( A' c_0^j + B' c_1^j \right)f(x_{i+j}) \Rightarrow \\ \Delta \beta_i = \sum_{j=0}^{+\infty} \left( A' c_0^j + B' c_1^j \right) \sum_{k=0}^{+\infty} \frac{f^{(k)}(x_{i})}{k!} (jh)^k \Rightarrow \\ \Delta \beta_i = \sum_{j=0}^{+\infty} \sum_{k=0}^{+\infty} \left( A' c_0^j + B' c_1^j \right) \frac{f^{(k)}(x_{i})}{k!} (jh)^k \end{multline} $$
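The partial-fraction step can at least be sanity-checked numerically as a scalar power series. The sketch below is my own: the test point $z = 0.1$ and the truncation at 200 terms are arbitrary choices, and this only probes the Taylor series of $g(z)$ inside its radius of convergence $|z| < 2-\sqrt{3}$, not the convergence of the operator series itself (note $|c_0| = 1/(2-\sqrt{3}) > 1$, so the operator expansion needs care, as acknowledged below):

```python
# Numerical sketch: check g(z) = A/(z - z0) + B/(z - z1) and the power
# series with coefficients A' c0^j + B' c1^j at a sample point z = 0.1
# (my arbitrary choice, inside the radius of convergence |z| < 2 - sqrt(3)).
import math

z0 = -2 + math.sqrt(3)        # roots of z^2 + 4z + 1
z1 = -2 - math.sqrt(3)
A = 6 * (z0 - 1) / (z0 - z1)  # residues of g at z0 and z1
B = 6 * (z1 - 1) / (z1 - z0)
c0, c1 = 1 / z0, 1 / z1       # note |c0| = 1/(2 - sqrt(3)) > 1
Ap, Bp = -A / z0, -B / z1     # A/(z - z0) = Ap / (1 - c0 z), etc.

def g(z):
    return 6 * (z - 1) / (z * z + 4 * z + 1)

z = 0.1
partial = A / (z - z0) + B / (z - z1)
series = sum(Ap * (c0 * z) ** j + Bp * (c1 * z) ** j for j in range(200))
print(abs(partial - g(z)), abs(series - g(z)))  # both tiny
```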
Assuming the double summation converges, and calling its sum $S_i$, I finally have
$$ \beta_{i+1} = \beta_{i} + S_i $$
So I could study the sign of the difference by studying the sign of $S_i$, the series sum.
Does this approach make any sense, or is it too convoluted for the problem? Assuming the approach makes sense, can you suggest what to look at in the literature to get a better background on the subject?
Of course there are several problems in what I've said so far: for example, I haven't specified any space; my symbolic calculus is ill posed, given that I haven't dealt with the convergence of the series; nor have I studied the norm of the operator $E$, which would require a normed space that I haven't defined.
Not exactly an answer, but a possible approach. It should be mentioned that the title is somewhat misleading, because operator theory in the usual sense is not related to the problem. I have also taken the liberty of slightly changing the setting.
Consider a function $f:\mathbb{R}\to\mathbb{R}$ which is (say) $10$ times continuously differentiable on a neighborhood $J$ of the interval $[0,1]$. Also let $n\in\mathbb{N}$ be a large parameter, set $h = \frac{1}{n}$, and let $x_i=\frac{i}{n} = ih$, $i \in \mathbb{Z}$. Now let $\beta_i$ be the solution of the following recurrence relation $$ \frac16 (\beta_{i+1} + 4 \beta_i + \beta_{i-1}) = f(x_i),\quad 0 \le i \le n, $$ with boundary conditions $$ \beta_{1} - \beta_{-1} = 2h f^\prime(0), \quad \beta_{n+1} - \beta_{n-1} = 2h f^\prime(1). $$
The form of the recurrence relation suggests that $\beta_i$ should be approximately equal to $f(x_i)$. I will argue that for $h$ sufficiently small $\beta_i = f(x_i) - \frac16 h^2 f^{\prime\prime} (x_i) + O(h^4)$, for $0 \le i \le n$.
To show this, consider the sequence $\gamma_i = \beta_i - f(x_i) + \frac16 h^2 f^{\prime\prime} (x_i)$. This sequence satisfies the following recurrence relation $$ \frac16 (\gamma_{i+1} + 4 \gamma_i + \gamma_{i-1}) = f(x_i) - \frac16 (f(x_{i+1}) + 4 f(x_i) + f(x_{i-1})) + \frac{h^2}{36} (f^{\prime\prime}(x_{i+1}) + 4 f^{\prime\prime}(x_i) + f^{\prime\prime}(x_{i-1})). $$ Plugging in the definition of $x_i$ and using Taylor's theorem, we find $$ \frac16 (\gamma_{i+1} + 4 \gamma_i + \gamma_{i-1}) = O(h^4) \cdot \max_{t\in J}\{|f^{(4)}(t)|\}, \quad 0 \le i \le n. $$ Similarly, for the boundary conditions we find $$ \gamma_{1} - \gamma_{-1} ,\, \gamma_{n+1} - \gamma_{n-1} = O(h^5) \cdot \max_{t\in J}\{|f^{(5)}(t)|\}. $$ One can be even more precise if needed. This implies $\gamma_i = O(h^4)$, $0 \le i \le n$. To see this, consider the index $j$ where $|\gamma_j|$ is largest, and treat separately the cases $\gamma_j \ge 0$ and $\gamma_j < 0$ (if $j$ is on the boundary, then by the boundary conditions there is a value close to $\gamma_j$ inside).
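The estimate $\gamma_i = O(h^4)$ lends itself to a numerical sanity check. The sketch below is mine, not part of the argument: it picks $f(x) = e^{-x}$ (so $f' = -f$ and $f'' = f$, and $f$ is positive, decreasing and convex) with $n = 32$, assembles the $(n+3)\times(n+3)$ linear system for the unknowns $\beta_{-1},\dots,\beta_{n+1}$, and measures $\gamma$:

```python
# Numerical sketch (my illustrative choices: f(x) = exp(-x), n = 32).
# Unknowns are beta_{-1}, ..., beta_{n+1}; matrix column j holds beta_{j-1}.
import numpy as np

def solve_beta(n):
    h = 1.0 / n
    x = np.arange(-1, n + 2) * h                # x_{-1}, ..., x_{n+1}
    f = np.exp(-x)
    m = n + 3
    A = np.zeros((m, m))
    b = np.zeros(m)
    for i in range(n + 1):                      # recurrence rows, i = 0..n
        A[i, i:i + 3] = [1 / 6, 4 / 6, 1 / 6]   # beta_{i-1}, beta_i, beta_{i+1}
        b[i] = f[i + 1]                         # f(x_i)
    A[n + 1, [2, 0]] = [1, -1]                  # beta_1 - beta_{-1} = 2h f'(0)
    b[n + 1] = -2 * h                           # f'(0) = -1
    A[n + 2, [m - 1, m - 3]] = [1, -1]          # beta_{n+1} - beta_{n-1} = 2h f'(1)
    b[n + 2] = -2 * h * np.exp(-1.0)            # f'(1) = -1/e
    return np.linalg.solve(A, b), x, h

beta, x, h = solve_beta(32)
gamma = beta - np.exp(-x) + h ** 2 / 6 * np.exp(-x)   # here f'' = exp(-x)
print(np.abs(gamma[1:-1]).max())  # O(h^4); the h^2/6 term alone is ~1.6e-4
```

Halving $h$ should shrink $\max|\gamma_i|$ by roughly a factor of $16$, consistent with the fourth-order claim.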
If $f$ is strictly decreasing on $J$ (that is, $f^\prime < a$ on this interval for some $a < 0$), then the $\beta_i$ will also be decreasing for $0 \le i \le n$ (provided $n$ is sufficiently large, depending on $a$ and on an upper bound for the derivatives of $f$).
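A quick numerical sketch of this monotonicity claim, again with the illustrative (and not the answer's) choice $f(x) = e^{-x}$, which is strictly decreasing; the linear system is the one from the recurrence and boundary conditions above:

```python
# Monotonicity check (my choices: f(x) = exp(-x), n = 64); unknowns are
# beta_{-1}, ..., beta_{n+1}, stored with beta_i in matrix column i + 1.
import numpy as np

n = 64
h = 1.0 / n
x = np.arange(-1, n + 2) * h
f = np.exp(-x)                                  # f > 0 and f' = -f < 0
m = n + 3
A = np.zeros((m, m))
b = np.zeros(m)
for i in range(n + 1):
    A[i, i:i + 3] = [1 / 6, 4 / 6, 1 / 6]
    b[i] = f[i + 1]
A[n + 1, [2, 0]] = [1, -1]
b[n + 1] = -2 * h                               # 2h f'(0)
A[n + 2, [m - 1, m - 3]] = [1, -1]
b[n + 2] = -2 * h * np.exp(-1.0)                # 2h f'(1)
beta = np.linalg.solve(A, b)
decreasing = bool(np.all(np.diff(beta[1:n + 2]) < 0))   # beta_0, ..., beta_n
print(decreasing)  # True
```

Here the consecutive gaps $\beta_{i+1} - \beta_i \approx -h\,e^{-x_i}$ dominate the $O(h^4)$ error by several orders of magnitude, so the sign is unambiguous.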