I have $n$ univariate polynomials $f_1(X)$, $f_2(X)$, ..., $f_n(X)$. I do not know if they are linearly dependent or independent.
I need to combine them into a new polynomial $g$ such that whenever $g$ vanishes at some value of $X$, all of the $f_i$'s are guaranteed to vanish at that value of $X$ as well.
The only way I can think of is to turn it into a bivariate polynomial.
i.e. $g(X,Y) = f_1(X) + Yf_2(X) + \dots + Y^{n-1}f_n(X)$
- However, I am still not 100% convinced that this combination guarantees that if $g$ is identically $0$, then all the $f_i$'s are $0$.
I think so because of this Q&A: Linear dependence of $\left\{x^{n}\,\colon\, n\in\mathbb{N}\right\}$
In my case, for any constant $c$, $g(X=c,Y)$ becomes a polynomial of the form $c_1 + c_2Y + c_3Y^2 + \dots + c_nY^{n-1}$ (where $c_i = f_i(c)$), which, as per that Q&A, can only be identically $0$ if all the $c_i$'s are $0$. So my goal would be achieved.
Am I correct?
If (1) is true, is there an easier way to combine them with the same guarantee, i.e. one where $g$ is still univariate?
If (1) is not true, then what is a way to combine them that achieves this?
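A minimal Python sketch of the reasoning above (an assumption here: polynomials are represented as coefficient lists, constant term first). The point is that the coefficients of $g(c, Y)$, viewed as a polynomial in $Y$, are exactly the values $f_i(c)$, so $g(c, Y)$ is the zero polynomial in $Y$ precisely when every $f_i$ vanishes at $c$:

```python
def poly_eval(coeffs, x):
    """Evaluate a polynomial given as [a0, a1, ...] at x (Horner's rule)."""
    result = 0
    for a in reversed(coeffs):
        result = result * x + a
    return result

def g_in_Y(fs, c):
    """Coefficients (in Y) of g(c, Y) = f1(c) + Y*f2(c) + ... + Y^(n-1)*fn(c)."""
    return [poly_eval(f, c) for f in fs]

# f1 = x^2 - 1, f2 = x - 1, f3 = 3x^2 - 3  (note f3 = 3*f1: linearly dependent)
fs = [[-1, 0, 1], [-1, 1], [-3, 0, 3]]

# At c = 1 every f_i vanishes, so g(1, Y) is identically zero in Y:
print(g_in_Y(fs, 1))   # [0, 0, 0]
# At c = 2 not all f_i vanish, so g(2, Y) is a nonzero polynomial in Y:
print(g_in_Y(fs, 2))   # [3, 1, 9]
```

Note that linear dependence among the $f_i$'s (as with $f_3 = 3f_1$ above) does not matter here: the argument only uses that $1, Y, Y^2, \dots$ are linearly independent.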
Given a polynomial $f(x)$, $f(c)=0 \iff f(x)=(x-c)q(x)$ for some polynomial $q(x)$ of degree one lower than $f$. So if $f_i(c)=f_j(c)=0$, then $(x-c)$ divides both $f_i(x)$ and $f_j(x)$.
You can go in the reverse direction: if you have linearly independent polynomials $h_i(x)$, set $f_i(x)=(x-c)h_i(x)$. Now your $f_i$'s all vanish at the same point and are linearly independent. Can you show that?
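The reverse construction can be sketched in Python (same assumption as before: coefficient lists with constant term first; the independent polynomials are called $h_i$ here to avoid clashing with the question's $g$). Multiplying each $h_i$ by $(x-c)$ forces a common root at $c$ while preserving linear independence, since $p \mapsto (x-c)p$ is an injective linear map:

```python
def mul_linear(coeffs, c):
    """Multiply the polynomial [a0, a1, ...] by (x - c)."""
    out = [0] * (len(coeffs) + 1)
    for k, a in enumerate(coeffs):
        out[k + 1] += a       # shift: a_k * x^(k+1)
        out[k] -= c * a       # scale: -c * a_k * x^k
    return out

def poly_eval(coeffs, x):
    """Evaluate a polynomial at x (Horner's rule)."""
    result = 0
    for a in reversed(coeffs):
        result = result * x + a
    return result

# Independent h_i: 1, x, x^2; then f_i = (x - 2) * h_i
c = 2
hs = [[1], [0, 1], [0, 0, 1]]
fs = [mul_linear(h, c) for h in hs]
print(fs)                               # [[-2, 1], [0, -2, 1], [0, 0, -2, 1]]
print([poly_eval(f, c) for f in fs])    # [0, 0, 0] -- all vanish at x = c
```

The resulting $f_i$'s have strictly increasing degrees (the leading coefficients form an echelon pattern), which is one quick way to see that they remain linearly independent.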