The answer for this question states (without giving too much else away):
Since $F$ and $L$ are linearly independent ...
Using the definition of linear dependence for infinite-dimensional spaces, I presume that if I can find finitely many elements of $L$ and $F$ together with coefficients, not all zero, whose linear combination is $0$, then the two sets are not linearly independent. I chose $2_{(L)} + (-1)\cdot 2_{(F)} = 0$.
What is wrong with my reasoning?
A sequence with characteristic polynomial $x^2-x-1$, i.e. a sequence $\{A_n\}_{n\geq 0}$ fulfilling $A_{n+2}=A_{n+1}+A_n$, is determined by its initial values $A_0$ and $A_1$. We may notice that $$ \det\begin{pmatrix}F_0 & L_0 \\ F_1 & L_1 \end{pmatrix}=\det\begin{pmatrix}0 & 2 \\ 1 & 1 \end{pmatrix}=-2\neq 0 \tag{1}$$ and by Gaussian elimination $$ \det\begin{pmatrix}F_n & L_n \\ F_{n+1} & L_{n+1} \end{pmatrix}=\pm \det\begin{pmatrix}F_0 & L_0 \\ F_1 & L_1 \end{pmatrix} \tag{2}$$ hence the Fibonacci and Lucas sequences are linearly independent as elements of the vector space of sequences: if $aF+bL=0$ as sequences, then in particular $aF_0+bL_0=0$ and $aF_1+bL_1=0$, and since the matrix in $(1)$ is nonsingular, this forces $a=b=0$.
We have just used the discrete analogue of the Wronskian for differential equations (sometimes called the Casoratian).
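As a quick numerical sanity check of $(2)$, here is a minimal Python sketch (the helper name `recurrence_sequence` is my own, not from the answer) that generates both sequences and confirms the Casoratian-style determinant stays at $\pm 2$, never vanishing:

```python
def recurrence_sequence(a0, a1, n_terms):
    """Terms of A_{n+2} = A_{n+1} + A_n with the given initial values."""
    seq = [a0, a1]
    while len(seq) < n_terms:
        seq.append(seq[-1] + seq[-2])
    return seq

N = 20
F = recurrence_sequence(0, 1, N)  # Fibonacci: 0, 1, 1, 2, 3, ...
L = recurrence_sequence(2, 1, N)  # Lucas:     2, 1, 3, 4, 7, ...

for n in range(N - 1):
    # det [[F_n, L_n], [F_{n+1}, L_{n+1}]]
    det = F[n] * L[n + 1] - L[n] * F[n + 1]
    assert det == 2 * (-1) ** (n + 1)  # alternates -2, 2, -2, ... never 0
```

Since the determinant is never zero, no nontrivial linear combination of $F$ and $L$ vanishes as a sequence.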