There is a theorem in linear algebra that I have forgotten, but I remember something from it. Can you please give me a reference?
It is related to something like this: suppose I have two polynomials $P(X)=\alpha_pX^p+\alpha_{p-1}X^{p-1}+\dotsc+\alpha_0$ and $Q(X)=\beta_qX^q+\beta_{q-1}X^{q-1}+\dotsc+\beta_0$. If $P(X)=Q(X)$ and $p=q$, then $\alpha_p=\beta_q$, $\alpha_{p-1}=\beta_{q-1}$, $\dotsc$, $\alpha_0=\beta_0$. I remember something like this. Can you please give me a name or a reference?
Also with $\cos x$ and $\sin x$: if $a\cos x+b\sin x=c\cos x+d\sin x$ for all $x$, then $a=c$ and $b=d$.
It is a theorem that is (I think) related to bases in a vector space, or something like this.
Yes, this can be expressed in the language of linear algebra, as you seem to desire. Namely, over any infinite field $\,F\,$ the monomials $\,\{X^i\}\,$ form a basis of the ring of polynomial functions on $\,F,\,$ hence $\ c+ aX = \langle c,a,0,0\ldots\rangle = \langle d,b,0,0\ldots\rangle = d+b X\,$ iff they have the same coefficients, i.e. $\, c=d,\, a=b.\,$
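Here is a quick sanity check of why this works over an infinite field: a nonzero polynomial of degree $\le n$ has at most $n$ roots, so agreement at $n+1$ distinct points already forces coefficient-by-coefficient equality. A minimal sketch in Python (using exact rational arithmetic; the function and polynomial names are just illustrative):

```python
from fractions import Fraction

def horner(coeffs, x):
    """Evaluate a0 + a1*x + ... + an*x^n at x by Horner's rule."""
    acc = Fraction(0)
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

def agree_as_functions(P, Q, n):
    """Check agreement at n+1 distinct rational points; over an
    infinite field this forces P = Q coefficient-by-coefficient,
    since a nonzero polynomial of degree <= n has at most n roots."""
    return all(horner(P, Fraction(t)) == horner(Q, Fraction(t))
               for t in range(n + 1))

P = [1, 2, 3]   # 1 + 2x + 3x^2
Q = [1, 2, 3]
R = [1, 2, 4]   # differs in the x^2 coefficient
print(agree_as_functions(P, Q, 2))  # True
print(agree_as_functions(R, Q, 2))  # False: they differ at x = 1
```

Note that the argument breaks down once the field has fewer than $n+1$ elements, which is exactly the failure described in the remark below for finite fields.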
Similarly $\,\sin(x),\, \cos(x)\,$ are a basis for the solution space of $\,y'' + y = 0\,$ (they are linearly independent by the Wronskian criterion), so an analogous statement holds.
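For the Wronskian criterion mentioned above: $W(\sin,\cos)(x) = \sin(x)\cos'(x) - \cos(x)\sin'(x) = -\sin^2 x - \cos^2 x = -1 \ne 0$, which is what certifies linear independence. A small numerical check in Python (the hand-computed derivatives are plugged in directly):

```python
import math

def wronskian_sin_cos(x):
    # W(sin, cos)(x) = sin(x) * cos'(x) - cos(x) * sin'(x)
    #               = -sin(x)**2 - cos(x)**2 = -1 identically
    return math.sin(x) * (-math.sin(x)) - math.cos(x) * math.cos(x)

for x in (0.0, 1.0, 2.5):
    assert abs(wronskian_sin_cos(x) + 1.0) < 1e-12
```

Since the Wronskian is nonzero at every point, $a\cos x + b\sin x = c\cos x + d\sin x$ for all $x$ forces $a=c,\ b=d$, exactly as in the question.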
Remark $\ $ This can fail over finite fields, e.g. by Fermat's little theorem $\,X^p = X\,$ as functions on $\,\Bbb F_p = \Bbb Z/p,\,$ even though they are distinct formal polynomials. However, it remains true for the ring of formal polynomials $\,R[x]\,$ (vs. polynomial functions), since, by definition, two formal polynomials are equal iff their corresponding coefficients are equal.
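The finite-field failure is easy to verify directly: $X^p$ and $X$ have different coefficient lists, yet take the same value at every element of $\Bbb F_p$. A one-line check in Python (with the hypothetical choice $p = 7$; any prime works):

```python
p = 7  # any prime; Fermat's little theorem gives a**p ≡ a (mod p)
# X**p and X are distinct formal polynomials, but they agree
# at every point of F_p, i.e. they are equal as functions.
assert all(pow(a, p, p) == a % p for a in range(p))
```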