Can a polynomial also be a vector?


To the best of my knowledge, a vector is defined simply as a member of a particular linear space. Since we can construct a linear space of polynomials over the real field, I conclude that a polynomial from that space is a vector, simply because that is how we defined it.

What I struggle to understand is what happens when we start working with linear transformations. Suppose I have a linear operator $T: \mathbb{R}_n[x]\rightarrow \mathbb{R}_n[x]$; its representing matrix $A$ should satisfy $T(v) = Av$ for $v \in \mathbb{R}_n[x]$. But how do we multiply a matrix by a polynomial, unless we treat the polynomial as a vector?

To help clarify what I'm trying to ask: I've been told that when finding the eigenvectors of $A$ for the above transformation, I must translate the coordinate vectors back into polynomials. But if a polynomial isn't a vector, doing so sounds wrong to me.


There are 3 answers below.

BEST ANSWER

You say $T(v)=Av$. This is wrong. You should say $$[T(v)]_{B'}=A[v]_{B}$$

where $A$ is the matrix of $T$ with respect to the basis $B$ of the domain and the basis $B'$ of the co-domain. Here $[T(v)]_{B'}$ is the coordinate vector with respect to $B'$ and $[v]_{B}$ is the coordinate vector with respect to $B$.

So instead of multiplying the matrix with a polynomial, the effect of the linear transformation is captured by multiplying the matrix with the coordinate vector (i.e. the column matrix) of the polynomial.
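A minimal sketch of this in Python with NumPy, where the operator, basis, and numbers are all hypothetical choices for illustration: take $T = d/dx$ on $\mathbb{R}_2[x]$ with $B = B' = \{1, x, x^2\}$, so that column $j$ of $A$ holds the $B'$-coordinates of $T$ applied to the $j$-th basis polynomial.

```python
import numpy as np

# T = d/dx on R_2[x], basis B = B' = {1, x, x^2} (illustrative choices).
# T(1) = 0, T(x) = 1, T(x^2) = 2x, so the columns of A are their coordinates:
A = np.array([[0, 1, 0],
              [0, 0, 2],
              [0, 0, 0]])

p = np.array([3, 5, 7])   # [p]_B for p(x) = 3 + 5x + 7x^2
Tp = A @ p                # [T(p)]_B' -- a coordinate vector, not a polynomial

print(Tp)                 # [ 5 14  0], i.e. T(p)(x) = 5 + 14x = p'(x)
```

The matrix never touches a polynomial directly; it only ever multiplies the column of coordinates, which is then read back as a polynomial in the basis $B'$.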

0
On

You don't need a matrix to do that. If, say, $T(P)(x)=P(x+1)$, then $T$ is linear, but you can compute $T(P)$ for any polynomial without using a matrix. You may use a matrix, but it is not essential.
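A matrix-free sketch of this particular $T$ with NumPy's polynomial class (the specific $P$ is an arbitrary choice): composition with $x+1$ computes $T(P)$ directly.

```python
import numpy as np
from numpy.polynomial import Polynomial

# T(P)(x) = P(x+1), computed by substitution rather than a matrix.
P = Polynomial([1, 0, 1])        # P(x) = 1 + x^2
TP = P(Polynomial([0, 1]) + 1)   # substitute x + 1 for x

print(TP.coef)                   # coefficients of 2 + 2x + x^2
```

The linearity of $T$ is a property of the map itself; a matrix only enters once you fix bases and pass to coordinates.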

0
On

You have touched on a very interesting subject, which can only be briefly surveyed here.

Well, if we define the vector $$ {\bf x}_{\,h} (x) = \left( {\matrix{ {x^{\,0} } \cr {x^{\,1} } \cr \vdots \cr {x^{\,h} } \cr } } \right) $$ then any polynomial of degree not greater than $h$ can be expressed as $$ p_{\,h} (x) = {\bf a}_{\,h} \cdot {\bf x}_{\,h} (x) = \overline {{\bf a}_{\,h} } \,{\bf x}_{\,h} (x) $$ thereby establishing, in the basis ${\bf x}_{\,h}$, an isomorphism between such polynomials and their coefficient vectors ${\bf a}_{\,h}$.
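A small numeric sketch of this correspondence (the coefficient vector and the evaluation point are arbitrary choices):

```python
import numpy as np

# p(x) = a . (x^0, x^1, ..., x^h) for h = 3, with illustrative numbers.
a = np.array([2.0, -1.0, 0.0, 4.0])       # coefficient vector a_h
x = 1.5
x_h = np.array([x**k for k in range(4)])  # the vector x_h(x)

val = a @ x_h                             # evaluates p(x) = 2 - x + 4x^3
print(val)
```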

This correspondence is useful in many areas of polynomial analysis.

One is the change of basis. For instance, if we take as a basis the rising factorials of $x$ $$ {\bf y}_{\,h} (x) = \left( {\matrix{ {x^{\,\overline {\,0\,} } = 1} \cr {x^{\,\overline {\,1\,} } = x} \cr \vdots \cr {x^{\,\overline {\,h\,} } = \prod\limits_{k = 0}^{h - 1} {\left( {x + k} \right)} } \cr } } \right) $$ then we know that the relation between the bases is given by the Stirling numbers of the first kind $$ {\bf y}_{\,h} (x) = \left( {\matrix{ {x^{\,\overline {\,0\,} } } \cr {x^{\,\overline {\,1\,} } } \cr \vdots \cr {x^{\,\overline {\,h\,} } } \cr } } \right) = \left( {\matrix{ 1 & 0 & \cdots & 0 \cr 0 & 1 & \cdots & 0 \cr \vdots & \vdots & \ddots & \vdots \cr {\left[ \matrix{ h \cr 0 \cr} \right]} & {\left[ \matrix{ h \cr 1 \cr} \right]} & \cdots & {\left[ \matrix{ h \cr h \cr} \right]} \cr } } \right)\left( {\matrix{ {x^{\,0} } \cr {x^{\,1} } \cr \vdots \cr {x^{\,h} } \cr } } \right) = {{\bf S}_{\,{\bf t}\,1}} _{\,h} \,{\bf x}_{\,h} (x) $$

and we can easily get the conversion of the coefficients as $$ p_{\,h} (x) = \overline {{\bf a}_{\,h} } \,{\bf x}_{\,h} (x) = \left( {\overline {{\bf a}_{\,h} } \,{{\bf S}_{\,{\bf t}\,1}} _{\,h} ^{\, - \,1} } \right)\;{\bf y}_{\,h} (x) $$
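This conversion can be sketched numerically. The Stirling matrix below is built from the standard recurrence for unsigned Stirling numbers of the first kind, $\left[{n+1 \atop k}\right] = n\left[{n \atop k}\right] + \left[{n \atop k-1}\right]$; the degree $h=3$ and the test polynomial are arbitrary choices.

```python
import numpy as np

# Build S with S[n, k] = [n, k], so that y_h(x) = S x_h(x).
h = 3
S = np.zeros((h + 1, h + 1))
S[0, 0] = 1
for n in range(h):
    for k in range(h + 1):
        S[n + 1, k] = n * S[n, k] + (S[n, k - 1] if k > 0 else 0)

# Monomial coefficients a -> rising-factorial coefficients b = a S^{-1}:
a = np.array([0.0, 0.0, 1.0, 0.0])   # p(x) = x^2
b = a @ np.linalg.inv(S)

print(b)   # x^2 = x^(rising 2) - x^(rising 1), since x^(rising 2) = x(x+1)
```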

Also we have $$ {\bf x}_{\,h} (x + 1) = \left( {\matrix{ 1 & 0 & \cdots & 0 \cr 1 & 1 & \cdots & 0 \cr \vdots & \vdots & \ddots & \vdots \cr {\left( \matrix{ h \cr 0 \cr} \right)} & {\left( \matrix{ h \cr 1 \cr} \right)} & \cdots & {\left( \matrix{ h \cr h \cr} \right)} \cr } } \right)\,\;{\bf x}_{\,h} (x) = {\bf B}_{\,h} \;{\bf x}_{\,h} (x) $$
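That ${\bf B}_h$ is the lower-triangular Pascal matrix, since $(x+1)^i = \sum_j \binom{i}{j} x^j$; a numerical check (degree and evaluation point are arbitrary):

```python
import numpy as np
from math import comb

# Lower-triangular Pascal matrix B with entries C(i, j), mapping
# x_h(x) to x_h(x + 1).
h = 4
B = np.array([[comb(i, j) for j in range(h + 1)] for i in range(h + 1)], float)

x = 2.0
x_vec = np.array([x**k for k in range(h + 1)])

print(B @ x_vec)   # equals x_h(3) = (1, 3, 9, 27, 81)
```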

Another important field is that of polynomial interpolation, since we can construct the vector equation involving a Vandermonde matrix as follows $$ \eqalign{ & p_{\,h} (x) = \overline {{\bf a}_{\,h} } \,{\bf x}_{\,h} (x)\quad \Rightarrow \cr & \Rightarrow \quad \left( {p_{\,h} (x_{\,0} ),p_{\,h} (x_{\,1} ), \cdots ,p_{\,h} (x_{\,h} )} \right) = \overline {{\bf a}_{\,h} } \left( {{\bf x}_{\,h} (x_{\,0} ),{\bf x}_{\,h} (x_{\,1} ), \cdots ,{\bf x}_{\,h} (x_{\,h} )} \right)\quad \Rightarrow \cr & \Rightarrow \quad \cdots \cr} $$ and you can imagine the developments that follow, e.g. recovering the coefficients from the sampled values by inverting the Vandermonde matrix.
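A sketch of that interpolation step in the monomial basis, with arbitrary nodes and values (samples of $p(x) = 1 + x^2$):

```python
import numpy as np

# Row j of V is x_h(x_j)^T, so V a = (p(x_0), ..., p(x_h)) determines a.
nodes = np.array([0.0, 1.0, 2.0])
values = np.array([1.0, 2.0, 5.0])        # samples of p(x) = 1 + x^2

V = np.vander(nodes, increasing=True)     # Vandermonde matrix
a = np.linalg.solve(V, values)            # coefficient vector a_h

print(a)   # ~ [1, 0, 1]
```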

However, the correspondence between polynomials and vectors is lost in the fundamental areas of multiplication and factorization.
We would need to introduce a vector "product" that yields the convolution of the coefficient vectors.
It would be interesting to ask here whether anybody knows of attempts to reconcile the polynomial product with vector operations.
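At the level of coefficient vectors, that "product" is exactly discrete convolution, e.g.:

```python
import numpy as np

# Polynomial product as convolution of coefficient vectors:
# (1 + x) * (1 + 2x + x^2) = 1 + 3x + 3x^2 + x^3.
p = np.array([1, 1])       # 1 + x
q = np.array([1, 2, 1])    # (1 + x)^2

print(np.convolve(p, q))   # [1 3 3 1]
```

The point stands, though: convolution is not one of the operations that come with the vector-space structure itself, which is why products and factorization fall outside the isomorphism.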