Natural isomorphism between $S^k(V^*)$ and polynomial ring explanation


I am reading page 74, which explains:


Let $S^k(V^*)$ be the space of symmetric $k$-linear functions on a finite dimensional vector space $V$. Then there is an isomorphism to $\Bbb R[x_1,\ldots, x_n]^k$, the space of polynomials homogeneous of degree $k$ given by the map $$P \mapsto \tilde{P}$$ where if we fix a basis $\{e_i \}$ of $V$ then $$\tilde{P}(x_1,\ldots, x_n) := P(v,\ldots, v)$$ with $v = \sum x_i e_i$.


I am having a hard time even making sense of this operation.

How is $\tilde{P}$ even an element of a polynomial ring?

  1. $P$ is regarded as a map $V^k \rightarrow \Bbb R$. So the definition gives an element of $\Bbb R$.

  2. How does $v = \sum x_i e_i$ even live in $V$? The $x_i$ are indeterminates, not real numbers.

There are 2 answers below.

BEST ANSWER

Note that evaluation of polynomials over an infinite field (like $\mathbb R$) gives you a bijection \begin{align*} \Phi \colon \mathbb R[x_1,\dots,x_n] &\longrightarrow P_n\subset \operatorname{Maps}(\mathbb R\times\cdots\times\mathbb R, \mathbb R), \\ p &\longmapsto \bigg( (x_1,\dots,x_n) \mapsto p(x_1,\dots,x_n)\bigg), \end{align*} where $P_n$ denotes the set of all polynomial functions $\mathbb R\times\cdots\times\mathbb R\to\mathbb R$.
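As a side note (my own illustration, not part of the answer): the infinite-field hypothesis is essential here. Over a finite field, distinct polynomials can define the same function, so the evaluation map fails to be injective. A minimal sketch over $\mathbb F_2$:

```python
# Over the finite field F_2 = {0, 1}, the nonzero polynomial x^2 + x
# evaluates to 0 at every point, so it defines the same *function* as
# the zero polynomial even though it is a different *polynomial*.
def eval_mod2(x):
    return (x * x + x) % 2  # x^2 + x, reduced mod 2

values = [eval_mod2(x) for x in (0, 1)]
assert values == [0, 0]  # identically zero as a function on F_2
```

Over $\mathbb R$ (or any infinite field) this cannot happen, which is exactly why $\Phi$ is a bijection onto $P_n$.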

Hence, to specify an element of $\mathbb R[x_1,\dots,x_n]$ it is enough to specify the corresponding map $\mathbb R\times\cdots\times\mathbb R\to \mathbb R$.

This is how you should understand the definition of $\widetilde P$ in the text. They define a map $\widetilde P\colon \mathbb R\times\cdots\times\mathbb R\to\mathbb R$ given by $$ \widetilde P(x_1,\dots,x_n) = P\bigg(\sum_i x_i e_i, \dots, \sum_i x_i e_i\bigg) $$ and identify it with the corresponding polynomial in $\mathbb R[x_1,\dots,x_n]^k$. You can check that the function $\widetilde P$ is indeed given by a homogeneous polynomial in the $x_i$:

\begin{align*} P\bigg(\sum_i x_i e_i, \dots, \sum_i x_i e_i\bigg) &= P\bigg(\sum_{i_1} x_{i_1} e_{i_1}, \dots, \sum_{i_k} x_{i_k} e_{i_k} \bigg) \\ &= \sum_{i_1,i_2,\dots,i_k} P\bigg(e_{i_1}, \dots, e_{i_k} \bigg) \, x_{i_1}\cdots x_{i_k}. \end{align*}

Now you can simply take the coefficients in the last expression as the definition of $\widetilde P$ as an element of $\mathbb R[x_1,\dots,x_n]^k$ in the first place; that definition then works over arbitrary fields.
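To make the expansion concrete, here is a small numerical sketch (my own example, with $k=2$, $n=3$, and $P$ an assumed symmetric bilinear form given by a symmetric matrix $A$, so $P(v,w)=v^{\mathsf T}Aw$):

```python
import itertools
import numpy as np

# Hypothetical example: a symmetric bilinear form P(v, w) = v^T A w
# given by a symmetric matrix A (chosen for illustration; any symmetric
# k-linear P works the same way).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 4.0],
              [0.0, 4.0, 5.0]])
n = A.shape[0]
e = np.eye(n)  # standard basis e_1, ..., e_n

def P(v, w):
    return v @ A @ w

# P~(x_1,...,x_n) := P(v, v) with v = sum_i x_i e_i; here the x_i are
# evaluated at concrete real numbers.
def P_tilde(x):
    v = sum(x[i] * e[i] for i in range(n))
    return P(v, v)

# Expansion from the answer: P~ = sum_{i,j} P(e_i, e_j) x_i x_j,
# a homogeneous polynomial of degree 2 in the x_i.
def P_tilde_expanded(x):
    return sum(P(e[i], e[j]) * x[i] * x[j]
               for i, j in itertools.product(range(n), repeat=2))

x = np.array([1.0, -2.0, 0.5])
assert np.isclose(P_tilde(x), P_tilde_expanded(x))
# Homogeneity of degree 2: P~(t x) = t^2 P~(x).
assert np.isclose(P_tilde(2 * x), 4 * P_tilde(x))
```

The same index-by-index expansion works for any $k$; for $k=2$ it just recovers the familiar correspondence between symmetric matrices and quadratic forms.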

SECOND ANSWER

Fix a basis $(e_1,e_2,\dots,e_n)$ of $V$. The coordinates $x_1$, $x_2,\dots,x_n$ in the expression $$ v=\sum_{i=1}^n x_i(v)\,e_i $$ thought of as functions on $V$ are precisely the dual basis elements, i.e. $x_i=e_i^\ast$, and they form a basis of the dual space $V^\ast$.

Thus, the case $k=1$ of the question is settled.

In general, $$ S^k(V^\ast)=S^k({\Bbb R}e_1^\ast\oplus\cdots\oplus{\Bbb R}e_n^\ast). $$ The latter space is clearly the space of homogeneous polynomials of degree $k$ in the $e_i^\ast=x_i$.
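For a concrete instance (my own small example, not from the answer): with $n = k = 2$, a symmetric bilinear $P \in S^2(V^\ast)$ corresponds to $$ \tilde P(x_1,x_2) = P(e_1,e_1)\,x_1^2 + 2\,P(e_1,e_2)\,x_1x_2 + P(e_2,e_2)\,x_2^2, $$ which is a general homogeneous polynomial of degree $2$ in $x_1 = e_1^\ast$ and $x_2 = e_2^\ast$, as claimed.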