Why if $aX+c=bX+d$ then $a=b$ and $c=d$?


There is a theorem in linear algebra. I forgot it!! But I remember something from it. Can you please give me a reference?

It is related to something like this: if I have two polynomials $P(X)=\alpha_pX^p+\alpha_{p-1}X^{p-1}+\dotsc+\alpha_0$ and $Q(X)=\beta_qX^q+\beta_{q-1}X^{q-1}+\dotsc+\beta_0$, and if $P(X)=Q(X)$ and $p=q$, then $\alpha_p=\beta_q$, $\alpha_{p-1}=\beta_{q-1}$, $\dotsc$, $\alpha_0=\beta_0$. I remember something like this. Can you please give a name or reference?

Also with $\cos x$ and $\sin x$: if $a\cos x+b\sin x=c\cos x+d\sin x$ for all $x$, then $a=c$ and $b=d$.

It is a theorem that is (I think) related to bases in a vector space, something like this.


There are 5 answers below.

BEST ANSWER

Yes, this can be expressed in the language of linear algebra, as you seem to desire. Namely, over any infinite field $\,F\,$ the monomials $\,\{X^i\}\,$ form a basis of the ring $\,F[X]\,$ of polynomial functions, hence $\ c+ aX = \langle c,a,0,0\ldots\rangle = \langle d,b,0,0\ldots\rangle = d+b X\,$ iff they have the same coefficients, i.e. $\, c=d,\, a=b.\,$

Similarly $\,\sin(x),\, \cos(x)\,$ are a basis for the solution space of $\,y'' + y = 0\,$ (they are linearly independent by the Wronskian criterion), so an analogous statement holds.

Remark $\ $ This can fail over finite fields, e.g. by little Fermat $\,X^p = X\,$ as functions on $\,\Bbb F_p = \Bbb Z/p.\,$ However, it remains true for the ring of formal polynomials $\,R[x]\,$ (vs. polynomial functions), since, by definition, two formal polynomials are equal iff their corresponding coefficients are equal.
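The finite-field failure in the remark is easy to check numerically; here is a minimal sketch in Python (taking $p = 5$ as an example prime):

```python
# Check that X^p and X agree as functions on F_p (little Fermat),
# even though they are distinct formal polynomials.
p = 5  # example prime; any prime works

# As functions on F_p = {0, 1, ..., p-1}: x^p == x for every x.
same_as_functions = all(pow(x, p, p) == x % p for x in range(p))
print(same_as_functions)  # True

# As formal polynomials (coefficient lists, low degree first),
# X^p and X have different coefficients, so they are not equal.
coeffs_Xp = [0] * p + [1]   # X^p: coefficient 1 in degree p
coeffs_X  = [0, 1]          # X:   coefficient 1 in degree 1
print(coeffs_Xp == coeffs_X)  # False
```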

ANSWER

Because the equality must hold for all values of $x$: if we let $x=0$ then we get $c=d$, and if we then let $x=1$ we get $a+c=b+d$, which together with $c=d$ gives $a=b$. This generalises to polynomials of higher degree: you just compare coefficients.
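The evaluation argument above can be sketched in Python (the coefficients $3$ and $7$ below are arbitrary illustrative choices):

```python
# Recover the coefficients of an (unknown) linear polynomial f(X) = a*X + c
# from two evaluations: c = f(0), and a = f(1) - f(0).
def coefficients(f):
    c = f(0)        # X = 0 isolates the constant term
    a = f(1) - c    # X = 1 gives a + c; subtract c
    return a, c

f = lambda X: 3 * X + 7   # example polynomial
g = lambda X: 3 * X + 7   # equal to f as a function

# Equal functions yield equal coefficients.
assert coefficients(f) == coefficients(g) == (3, 7)
```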

ANSWER

The fact that a nonzero polynomial has only finitely many roots (at most its degree) will help you with the first part: you have two polynomials that are identical on an infinite set (in particular, on $\Bbb R$): $$\forall X\in \Bbb R \quad P(X)=Q(X),$$ hence the difference of these polynomials must be zero (the difference has infinitely many zeroes), and therefore each coefficient in $P$ and $Q$ must coincide.

As for the second part, you can reformulate it as saying that $\cos x$ and $\sin x$ are linearly independent on $\Bbb R$. Indeed, let your equality hold for every $x\in \Bbb R$. Then you can choose $x=0$ to obtain $a=c$ and $x=\pi/2$ to obtain $b=d$.
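The two evaluations above can be checked numerically; here is a small Python sketch (the values of $a,b,c,d$ are arbitrary examples, equal by hypothesis):

```python
import math

# If r*sin(x) + s*cos(x) = 0 for all x, then x = 0 forces s = 0
# and x = pi/2 forces r = 0: sin and cos are linearly independent.
def solve_rs(h):
    """Recover r, s from h(x) = r*sin(x) + s*cos(x) by evaluation."""
    s = h(0.0)            # sin(0) = 0, cos(0) = 1
    r = h(math.pi / 2)    # sin(pi/2) = 1, cos(pi/2) ~ 0
    return r, s

# a*cos x + b*sin x = c*cos x + d*sin x  =>  (b-d)*sin + (a-c)*cos = 0
a, b, c, d = 2.0, 5.0, 2.0, 5.0   # example values, equal by hypothesis
diff = lambda x: (b - d) * math.sin(x) + (a - c) * math.cos(x)

r, s = solve_rs(diff)
assert r == 0.0 and s == 0.0      # hence a = c and b = d
```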

ANSWER

Write $a=b+D$ and $c=d+E$. Then $$bX+DX+d+E=bX+d\\ \implies DX+E=0X+0.$$ Setting $X=0$ gives $E=0$, and the remaining identity $DX=0$ for all $X$ (take $X=1$) gives $D=0$, showing that $a=b$ and $c=d$.

ANSWER

This is by definition: the polynomials $P(X)=\alpha_pX^p+\alpha_{p-1}X^{p-1}+\dots+\alpha_0$ and $Q(X)=\beta_qX^q+\beta_{q-1}X^{q-1}+\dots+\beta_0$ are equal if and only if $$ p=q, \alpha_0=\beta_0, \alpha_1=\beta_1, \dots, \alpha_p=\beta_p. $$ In the case of polynomials over an infinite field $F$, such as $\mathbb{R}$ or $\mathbb{C}$, this is equivalent to $$ \text{for all $a\in F$},\quad P(a)=Q(a) $$ because in this case $P(X)-Q(X)$ would have infinitely many roots, which cannot happen for a nonzero polynomial (the converse is obvious). However, since polynomials over finite fields are really useful, for instance in studying error-correcting codes or cryptography on elliptic curves, the formal definition is preferred.

To see why one shouldn't rely on the identification with polynomial functions, consider a finite field $F=\{a_1,a_2,\dots,a_n\}$ and the polynomial $$ P(X)=(X-a_1)(X-a_2)\dots(X-a_n) $$ which assumes the zero value for every element in $F$, but should definitely be considered nonzero.
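The construction above can be verified computationally; here is a minimal Python sketch over $\mathbb{F}_5$ (the helpers `poly_mul` and `evaluate` are ad hoc, written for this example rather than taken from any library):

```python
# Over F = F_p (p = 5 here), P(X) = (X - a_1)...(X - a_n) vanishes at every
# element of F, yet its coefficient list is visibly nonzero.
p = 5

def poly_mul(A, B, p):
    """Multiply coefficient lists (low degree first), reducing mod p."""
    out = [0] * (len(A) + len(B) - 1)
    for i, ai in enumerate(A):
        for j, bj in enumerate(B):
            out[i + j] = (out[i + j] + ai * bj) % p
    return out

# Build P(X) = prod_{a in F} (X - a), coefficients mod p.
P = [1]
for a in range(p):
    P = poly_mul(P, [(-a) % p, 1], p)

def evaluate(coeffs, x, p):
    return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p

print(P)  # [0, 4, 0, 0, 0, 1], i.e. X^5 + 4X = X^5 - X (mod 5)
print(all(evaluate(P, x, p) == 0 for x in range(p)))  # True: zero as a function
```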

It's a different case for functions, where the definition is rightfully $$ f=g\text{ if and only if }f(a)=g(a)\text{ for all $a$ in the common domain}. $$ For the case of $$ a\sin x+b\cos x=c\sin x+d\cos x $$ just set $r=a-c$ and $s=b-d$, so you have to prove that, if $$ r\sin x+s\cos x=0, \text{ for all $x\in\mathbb{R}$} $$ then $r=0$ and $s=0$. Simply choose $x=0$ and $x=\pi/2$, which gives $$ \begin{cases} 0=r\sin0+s\cos0=s,\\ 0=r\sin(\pi/2)+s\cos(\pi/2)=r. \end{cases} $$