Determining polynomials in $n$ variables


Here is a funny problem arising from harmonic analysis: Let $E$ be a measurable subset of $\mathbb R^n$ with $m(E)>0$, where $m$ is the usual Lebesgue measure on $\mathbb R^n$. In practice, $E$ is usually a ball or a torus. Let $s$ be a positive integer, and let $\alpha=(\alpha_1,\ldots,\alpha_n)\in\mathbb N^n$ be a multi-index such that $|\alpha|=\alpha_1+\cdots+\alpha_n\leq s$. Here $\mathbb N$ denotes the set of all non-negative integers.

Proposition. There exists a unique polynomial $\phi(x)$ on $\mathbb R^n$ of degree at most $s$ such that $\frac1{m(E)}\int\limits_E \phi(x)\, x^\beta\, dx=\delta_{\alpha,\beta}$ for every multi-index $\beta\in\mathbb N^n$ with $|\beta|\leq s$.

Let me explain some notation: $\delta_{\alpha,\beta}=\begin{cases} 1\quad\text{if}\;\beta=\alpha&\\ 0\quad\text{if}\;\beta\neq\alpha.\end{cases}$ A polynomial $P$ on $\mathbb R^n$ of degree $d$ is a function of the form $P(x)=\sum\limits_{|\gamma|=0}^d c_{\gamma}x^\gamma$. Here $\gamma=(\gamma_1,\ldots,\gamma_n)\in\mathbb N^n$, $x^\gamma=x_1^{\gamma_1}\cdots x_n^{\gamma_n}$, and the $c_{\gamma}=c_{\gamma_1,\ldots,\gamma_n}$ are real numbers.

I tried to prove this proposition using linear algebra, but I failed because I could not prove that the relevant vectors are linearly independent. Can anyone help prove this? Thanks in advance!



BEST ANSWER

Let $\varphi(x)=\sum_{|\beta|\le s}r_\beta x^\beta$ and $$ c_{\alpha}=\frac{1}{m(E)}\int_E\varphi(x)\,x^\alpha\,dx =\frac{1}{m(E)}\sum_{|\beta|\le s}r_\beta\int_E x^{\alpha+\beta}\,dx, \quad |\alpha|\le s. $$ This is a linear system of $\binom{n+s}{s}$ equations in $\binom{n+s}{s}$ unknowns (the $r_\beta$'s), with system matrix $$ A=\left(\begin{matrix}\frac{1}{m(E)}\int_E x^{\alpha+\beta}\,dx\end{matrix}\right)_{|\alpha|,|\beta|\le s}. $$ Our system has a unique solution iff $A$ is nonsingular. If $A$ were singular, there would be a nonzero vector $w=(w_\alpha)_{|\alpha|\le s}$ with $Aw=0$, and hence $w^tAw=0$. Set $q(x)=\sum_{|\beta|\le s}w_\beta x^\beta$. Then $$ 0=w^tAw=\sum_{|\alpha|,|\beta|\le s}w_\alpha w_\beta\,\frac{1}{m(E)}\int_E x^{\alpha+\beta}\,dx=\frac{1}{m(E)}\int_E q^2(x)\,dx. $$ But this would imply that $q(x)=0$ a.e. on $E$, which can happen only if $w_\beta=0$ for all $\beta$, contradicting $w\neq 0$.
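As a sanity check, here is a minimal numerical sketch of this linear system (my own illustration, not part of the original answer): it assumes $E=[0,1]^n$ with $n=2$ and $s=1$, where the normalized moments $\frac1{m(E)}\int_E x^\gamma\,dx$ have the closed form $\prod_i \frac{1}{\gamma_i+1}$.

```python
import itertools
import numpy as np

# Assumption for this sketch: E = [0,1]^n, so moments are explicit.
n, s = 2, 1

# All multi-indices beta with |beta| <= s, in a fixed order.
multis = [b for b in itertools.product(range(s + 1), repeat=n) if sum(b) <= s]

def moment(gamma):
    # On E = [0,1]^n: (1/m(E)) * integral of x^gamma dx = prod_i 1/(gamma_i + 1).
    return float(np.prod([1.0 / (g + 1) for g in gamma]))

# System matrix A = ( (1/m(E)) * integral_E x^(alpha+beta) dx )_{alpha,beta}.
A = np.array([[moment(tuple(a + b for a, b in zip(al, be))) for be in multis]
              for al in multis])

# For a target alpha, solve A r = e_alpha; then phi(x) = sum_beta r_beta x^beta.
alpha = (1, 0)
e = np.array([1.0 if m == alpha else 0.0 for m in multis])
r = np.linalg.solve(A, e)

# Verify (1/m(E)) * integral_E phi(x) x^beta dx = delta_{alpha,beta}.
assert np.allclose(A @ r, e)
```

Here $A$ is exactly the Gram matrix of the monomials in the pairing $\frac1{m(E)}\int_E\phi\psi\,dx$, so its nonsingularity is the positive-definiteness argument above in matrix form.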

ANSWER

The collection of polynomials of degree at most $s$ is a finite-dimensional vector space, and the pairing $(\phi, \psi)=\frac{1}{m(E)}\int_E \phi(x)\psi(x)dx$ is a positive definite (hence non-degenerate) bilinear form on this vector space.

The result follows immediately from the fact that every subspace of dimension $k$ in a finite-dimensional inner product space has an orthogonal complement of codimension $k$.

More explicitly, let $V$ be the subspace spanned by $x^{\beta}$ with $|\beta|\leq s, \beta\neq \alpha$. This is a codimension $1$ subspace, and so has a one dimensional orthogonal complement $V^{\perp}=\{w\mid (w,v)=0 \;\forall v\in V\}$. Let $\psi$ be a nonzero element of this complement. Let $(\psi,x^{\alpha})=c$. If $c=0$, then $\psi$ would be orthogonal to every element in a basis of our vector space, and by non-degeneracy would be zero. Hence, $c\neq 0$ and we can take $\phi=(1/c)\psi$.
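The explicit construction above can be sketched numerically (again my own illustration, assuming $E=[0,1]^n$ so the moments are explicit, with $n=2$, $s=1$): run Gram-Schmidt on the monomials spanning $V$, project $x^\alpha$ onto the complement to get $\psi$, and normalize by $c=(\psi,x^\alpha)$.

```python
import itertools
import numpy as np

# Assumption for this sketch: E = [0,1]^n, n = 2, s = 1, alpha = (1,0).
n, s = 2, 1
alpha = (1, 0)

multis = [b for b in itertools.product(range(s + 1), repeat=n) if sum(b) <= s]

def moment(gamma):
    # (1/m(E)) * integral over [0,1]^n of x^gamma dx.
    return float(np.prod([1.0 / (g + 1) for g in gamma]))

def inner(u, v):
    # The pairing (phi, psi) on coefficient vectors u, v indexed by `multis`.
    return sum(u[i] * v[j] * moment(tuple(a + b for a, b in zip(multis[i], multis[j])))
               for i in range(len(multis)) for j in range(len(multis)))

# Orthonormal basis of V = span{x^beta : |beta| <= s, beta != alpha} (Gram-Schmidt).
V_basis = []
for i, m in enumerate(multis):
    if m == alpha:
        continue
    v = np.eye(len(multis))[i]
    for u in V_basis:
        v = v - inner(v, u) * u
    V_basis.append(v / np.sqrt(inner(v, v)))

# psi = x^alpha minus its projection onto V lies in the orthogonal complement of V.
e_alpha = np.eye(len(multis))[multis.index(alpha)]
psi = e_alpha.copy()
for u in V_basis:
    psi = psi - inner(psi, u) * u

# Normalize: c = (psi, x^alpha) is nonzero, and phi = psi / c does the job.
c = inner(psi, e_alpha)
phi = psi / c  # for this E and alpha: phi(x1, x2) = 12*x1 - 6

# Check (phi, x^beta) = delta_{alpha,beta} for all |beta| <= s.
for i, m in enumerate(multis):
    target = 1.0 if m == alpha else 0.0
    assert abs(inner(phi, np.eye(len(multis))[i]) - target) < 1e-9
```

The same $\phi$ comes out of the linear-system approach, as it must: both answers invert the same Gram matrix, one directly and one via orthogonalization.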