I have to prove the algebraic formula for the variance without directly using the linearity of expectation. In particular, I can't use $\mathbb{E}(f\,\mathbb{E}(f)) = \mathbb{E}(f)\mathbb{E}(f)$. My given definition of the expected value is $\mathbb{E}(f) = \sum_{x}p(x)f(x)$.
I started with $$\begin{align} \text{Var}(f) &= \mathbb{E}((f - \mathbb{E}(f))^2)\\ &= \mathbb{E}(f^2 - 2f\mathbb{E}(f) + \mathbb{E}(f)^2)\\ &= \sum_{x}\left(p(x)(f(x)^2 - 2f(x)\mathbb{E}(f) + \mathbb{E}(f)^2)\right)\\ &= \sum_{x}\left(p(x)f(x)^2 - 2p(x)f(x)\mathbb{E}(f) + p(x)\mathbb{E}(f)^2\right)\\ &= \sum_xp(x)f(x)^2 - 2\mathbb{E}(f)\sum_xp(x)f(x) + \mathbb{E}(f)^2\sum_x p(x)\\ &= \mathbb{E}(f^2) - 2\mathbb{E}(f)\mathbb{E}(f) + \mathbb{E}(f)^2 \cdot 1\\ &= \mathbb{E}(f^2) - 2\mathbb{E}(f)^2 + \mathbb{E}(f)^2 = \mathbb{E}(f^2) - \mathbb{E}(f)^2 \end{align}$$
Is this proof correct? The idea for the solution came to me while writing this question.
The identity $$\mathbb{E}(f\,\mathbb{E}(f)) = \sum_x p(x)f(x)\mathbb{E}(f) = \mathbb{E}(f)\sum_x p(x)f(x) = \mathbb{E}(f)\mathbb{E}(f)$$ is the heart of your proof. Your derivation is correct, but I don't think it can be considered new in any way. The key point is that $\mathbb{E}(f)$ is a constant, independent of $x$, so it can be pulled out of the sum.
Note that, in general, many random variables have continuous distributions; the sum is then replaced by an integral, $\mathbb{E}(f) = \int p(x)f(x)\,dx$, and the same argument goes through.
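As a quick numerical sanity check (not part of the proof), the identity $\text{Var}(f) = \mathbb{E}(f^2) - \mathbb{E}(f)^2$ can be verified for a small discrete distribution; the probabilities and values below are made-up illustration data:

```python
# Check Var(f) = E(f^2) - E(f)^2 on a hypothetical discrete distribution.

p = [0.2, 0.5, 0.3]   # probabilities p(x), summing to 1
f = [1.0, 4.0, 10.0]  # values f(x)

# E(g) = sum_x p(x) g(x), the given definition of the expected value
E = lambda g: sum(px * gx for px, gx in zip(p, g))

mean = E(f)                                     # E(f)
var_def = E([(fx - mean) ** 2 for fx in f])     # E((f - E(f))^2)
var_alg = E([fx ** 2 for fx in f]) - mean ** 2  # E(f^2) - E(f)^2

print(abs(var_def - var_alg) < 1e-12)  # True: both forms agree
```

Both expressions agree up to floating-point rounding, as the derivation predicts.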