First of all, I am a computer science student, not a maths student, so this may be a trivial question; I would just like to understand it :)
Suppose I have the following (pointless) recursive function:
$$ f(x) = a + b \cdot f(x) $$
for some constants $a$ and $b$. I suppose it does not really compute any value, or perhaps maps every argument to $\pm\infty$ (except in the cases $b = 0$, or $a = 0, b = 1$).
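To make the computational reading concrete, here is a sketch in Python with illustrative constants (the values of `a` and `b` are made up, any $b \neq 0$ behaves the same): the call recurses forever, since there is no base case.

```python
a, b = 1, 2  # illustrative constants; not special in any way

def f(x):
    # Direct transcription of f(x) = a + b * f(x).
    # Each call immediately recurses, so this never returns a value;
    # in CPython it eventually raises RecursionError.
    return a + b * f(x)
```

So as a program, `f` is undefined (non-terminating) everywhere, regardless of the algebra.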
I would like to reason:
$$ f(x) - b \cdot f(x) = a \\ \vdots \\ f(x) = \frac{a}{1 - b} $$
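And indeed, if I treat the equation as a condition on a number $y = f(x)$ rather than as a program, the value the algebra produces does satisfy it (a quick numeric check with made-up constants):

```python
a, b = 3.0, 0.5  # illustrative constants with b != 1

# Candidate value from the algebraic manipulation: f(x) = a / (1 - b)
y = a / (1 - b)

# Check that y satisfies the original equation y = a + b * y
assert abs(y - (a + b * y)) < 1e-12
```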
In terms of computation, this seems like a completely different function. So what is wrong with this reasoning? Am I not allowed to treat the $=$ as an equality, or $f(x)$ as a constant?