This question is motivated by the idea of the Wronskian and linear independence in a Differential Equations course. Let $y_1$ and $y_2$ be two functions; I thought this matrix equation perfectly sums up the idea of the Wronskian. (In other words, $y_1$ and $y_2$ are linearly dependent if there exist $c_1$ and $c_2$, not both zero, such that the equation below holds.)
$\left[ \begin{array}{c} 0 \\ 0 \end{array} \right] = \begin{bmatrix} y_1 & y_2 \\ y_1' & y_2' \end{bmatrix} \times \left[ \begin{array}{c} c_1 \\ c_2 \end{array} \right]$
My course had the proposition that if $y_1(x)$ and $y_2(x)$ are linearly dependent, then the Wronskian must be $0$. However, I was wondering why the converse does not hold.
I know that the converse partially holds under some assumptions, and that there are also counterexamples. However, I do not see where my reasoning below goes wrong.
My thought:
For clarification, the Wronskian matrix, further denoted by $W$, is $W=\begin{bmatrix} y_1 & y_2 \\ y_1' & y_2' \end{bmatrix}$.
I thought about this in a more linear-algebraic way, viewing $W:\mathbb{F}^2\to \mathbb{F}^2$. Assuming $\det(W) = 0$, then $W$ is not invertible, hence not injective, so its kernel is non-trivial and must admit non-trivial $c_1$, $c_2$ such that the above matrix equation holds; therefore $y_1$ and $y_2$ must be linearly dependent. Where have I gone wrong in this argument?
Thank you so much!
The problem is that for general functions your coefficients $c_k$ can depend on $x$: for each $x$ you get a non-trivial kernel vector, but its direction may vary with $x$. That is not what is generally understood as linear dependence, which requires a single constant pair $(c_1, c_2)$ that works for every $x$.
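To make this concrete, here is a quick numerical check of the classic counterexample $y_1 = x^2$, $y_2 = x\lvert x\rvert$ (my choice of example, not from the question): the Wronskian vanishes identically, yet the kernel vector of the matrix flips direction as $x$ crosses $0$, so no single constant pair $(c_1, c_2)$ works on all of $\mathbb{R}$.

```python
# Counterexample: y1 = x^2, y2 = x|x| on the real line.
# Their Wronskian is identically zero, but they are linearly independent.

def y1(x):  return x * x
def dy1(x): return 2 * x
def y2(x):  return x * abs(x)
def dy2(x): return 2 * abs(x)   # derivative of x|x|

def wronskian(x):
    """det of [[y1, y2], [y1', y2']] at x."""
    return y1(x) * dy2(x) - dy1(x) * y2(x)

# The Wronskian vanishes at every sample point...
assert all(wronskian(x) == 0 for x in [-2.0, -0.5, 0.0, 0.5, 2.0])

# ...so at each x the matrix has a non-trivial kernel, but its direction
# varies: for x > 0 the kernel is spanned by (1, -1), for x < 0 by (1, 1).
c1, c2 = 1, -1                           # kills c1*y1 + c2*y2 for x > 0
assert c1 * y1(1) + c2 * y2(1) == 0
assert c1 * y1(-1) + c2 * y2(-1) != 0    # but fails for x < 0
```

The kernel vector $(1, -1)$ at $x = 1$ and $(1, 1)$ at $x = -1$ cannot be reconciled into one constant pair, which is exactly the gap in the linear-algebra argument above.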
It is only when both functions are solutions of the same second-order linear differential equation that you can propagate the dependence coefficients from one initial $x$ to all $x$ in the domain.
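A sketch of that propagation argument, assuming the standard existence–uniqueness theorem for linear ODEs: suppose $y_1$ and $y_2$ both solve
$$y'' + p(x)\,y' + q(x)\,y = 0$$
on an interval where $p$ and $q$ are continuous, and suppose $\det W(x_0) = 0$ at some point $x_0$. Pick a non-trivial pair $(c_1, c_2)$ in the kernel of $W(x_0)$ and set $y = c_1 y_1 + c_2 y_2$. Then $y$ solves the same equation with initial data $y(x_0) = y'(x_0) = 0$. The zero function solves that initial value problem too, so by uniqueness $y \equiv 0$ on the whole interval; the same constants $(c_1, c_2)$ therefore witness dependence at every $x$, not just at $x_0$.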