Let $\{f(n)\}_{n=1}^{\infty}$ denote the Fibonacci sequence defined by $f(1)=1, f(2)=1$, and $f(n)=f(n-1)+ f(n-2)$ for all $n\geq 3$.
Let $α=\dfrac{1+\sqrt{5}}{2}$ and $β=\dfrac{1-\sqrt{5}}{2}.$ Prove that $f(n)=\dfrac{α^n - β^n}{α-β}$ for all $n \in \mathbb{N}$.
So I'm using the Principle of Complete Induction for this problem, but I'm slightly confused about the induction step. For the base case I would show $f(1)=1$, which is true. For the induction step, would I fix $n$ at $3$? I'm not sure where to go from here. Any help is appreciated.
Since the recurrence reaches back two terms, you need two base cases, $n=1$ and $n=2$:
$$f_1=\frac{\alpha^{1}-\beta^{1}}{\alpha-\beta}=1,$$ $$f_2=\frac{\alpha^{2}-\beta^{2}}{\alpha-\beta}=\alpha+\beta=1,$$ since the sum of the roots of the characteristic equation $x^2-x-1=0$ is the negative of the coefficient of $x$.
Then, assuming the formula holds for both $n$ and $n+1$ (this is why both base cases are needed),
$$f_{n+1}+f_{n}=\frac{\alpha^{n+1}-\beta^{n+1}}{\alpha-\beta}+\frac{\alpha^{n}-\beta^{n}}{\alpha-\beta}=\frac{(\alpha+1)\alpha^{n}-(\beta+1)\beta^{n}}{\alpha-\beta}=\frac{\alpha^{n+2}-\beta^{n+2}}{\alpha-\beta}=f_{n+2},$$ since both roots satisfy $x+1=x^{2}$, i.e. $(\alpha+1)\alpha^{n}=\alpha^{2}\alpha^{n}=\alpha^{n+2}$ and likewise for $\beta$.
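As a quick sanity check (not a substitute for the proof), you can compare the closed form against the recurrence numerically. This is a sketch in Python; it uses floating-point arithmetic, so rounding to the nearest integer is only reliable for moderate $n$:

```python
# Compare Binet's closed form against the Fibonacci recurrence for small n.
alpha = (1 + 5 ** 0.5) / 2   # golden ratio, root of x^2 = x + 1
beta = (1 - 5 ** 0.5) / 2    # conjugate root

def fib(n):
    """f(n) via the recurrence f(1) = f(2) = 1, f(n) = f(n-1) + f(n-2)."""
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return a

def binet(n):
    """Closed form (alpha^n - beta^n) / (alpha - beta), rounded to an integer."""
    return round((alpha ** n - beta ** n) / (alpha - beta))

# The two agree on every n in this range.
assert all(fib(n) == binet(n) for n in range(1, 40))
```

Since $|\beta|<1$, the term $\beta^n/(\alpha-\beta)$ is always smaller than $1/2$ in absolute value, which is exactly why rounding $\alpha^n/(\alpha-\beta)$ recovers the integer $f(n)$.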