Does a pointwise convergent sequence of quadratic functions converge to a quadratic one?


Let $\{f_j\}_{j=1}^\infty$ be a sequence of functions $f_j:\mathbb{R}^n \to \mathbb{R}$ of quadratic form $f_j(x) = x^T P_j x$ for symmetric matrices $P_j \in \mathbb{R}^{n \times n}$. Suppose $f_j$ converges pointwise to a function $f$.

Then, can we say that $f$ is also quadratic, so that $f(x) = x^T P x$ for some symmetric $P \in \mathbb{R}^{n \times n}$?


I was particularly interested in the following case: $f_j$ is monotonically nondecreasing in $j$ and bounded above by a quadratic function $g(x) = x^T Q x$: \begin{equation} f_1(x) \leq \cdots \leq f_j(x) \leq f_{j+1}(x) \leq \cdots \leq g(x) < \infty \quad \forall x \in \mathbb{R}^n. \end{equation} In that case, by the monotone convergence theorem (MCT), $f_j \to f$ pointwise for some $f$, and we also have $P_j \to P$ for some symmetric $P \in \mathbb{R}^{n \times n}$, by MCT again, since in the Loewner order \begin{equation} P_1 \leq \cdots \leq P_j \leq P_{j+1} \leq \cdots \leq Q. \end{equation} In this case $f(x) = x^T P x$ does hold, since $x^T P_j x \to x^T P x$ for all $x \in \mathbb{R}^n$. Moreover, if $f(x) = x^T P x$, then $f$ is continuous, so the convergence is uniform on any compact $\Omega \subset \mathbb{R}^n$ by Dini's theorem.
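The monotone case can be illustrated numerically. This is a minimal sketch, assuming one specific monotone family $P_j = (1 - 1/j)\,Q$ of my own choosing (not from the question itself), which is nondecreasing in the Loewner order and bounded above by $Q$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# A symmetric positive definite upper bound Q (illustrative choice).
A = rng.standard_normal((n, n))
Q = A @ A.T + n * np.eye(n)

# P_j = (1 - 1/j) Q satisfies P_1 <= P_2 <= ... <= Q in the Loewner order.
def P(j):
    return (1.0 - 1.0 / j) * Q

x = rng.standard_normal(n)
values = [x @ P(j) @ x for j in range(1, 200)]
fQ = x @ Q @ x

# f_j(x) is nondecreasing in j ...
monotone = all(a <= b + 1e-12 for a, b in zip(values, values[1:]))
# ... and f_j(x) approaches x^T Q x = f(x) for large j.
close = abs(x @ P(10**6) @ x - fQ) <= 1e-5 * fQ
```

Here the pointwise limit is $f(x) = x^T Q x$, matching the claim that the limit is again a quadratic form.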

However, I cannot see whether $f(x) = x^T P x$ holds in the general case, without monotonicity.


BEST ANSWER

You may use a polarization identity, which gives $$B_j(x,y)=x^T P_j y = \frac{1}{4}\left(f_j(x+y)-f_j(x-y)\right).$$ This determines the unique symmetric bilinear form associated with $f_j$. The hypothesis is that $f_j$ converges pointwise, so from the above identity $$ B(x,y) = \lim_{j\rightarrow \infty} B_j(x,y)$$ exists for all $x,y$. One checks that $B$ is bilinear in $x$ and $y$ (pointwise limits preserve linearity in each argument), so it has the form $x^T P y$ for some symmetric $P$. One then recovers $f(x)=\lim_j f_j(x)=B(x,x)=x^T P x$.
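The polarization identity with the factor $1/4$ can be sanity-checked numerically; this is a sketch with a randomly chosen symmetric $P$ and random vectors $x, y$ (all illustrative, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Random symmetric matrix P and the quadratic form f(x) = x^T P x.
S = rng.standard_normal((n, n))
P = (S + S.T) / 2
f = lambda x: x @ P @ x

x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Polarization: x^T P y = (f(x+y) - f(x-y)) / 4,
# since (x+y)^T P (x+y) - (x-y)^T P (x-y) = 4 x^T P y.
lhs = x @ P @ y
rhs = (f(x + y) - f(x - y)) / 4.0
```

Note that taking $y = x$ gives $B(x,x) = \frac14(f(2x) - f(0)) = f(x)$, consistent with the last step of the argument.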

ANSWER

The result for $n=1$ is straightforward: $c_j y^2$ converges pointwise if and only if the sequence $c_j$ converges (evaluate at $y=1$). Suppose we know the result for $\mathbb R^n,$ and that $Q_j(x,y)$ is a sequence of quadratics on $\mathbb R^{n+1}$ that converges pointwise on $\mathbb R^{n+1}.$ Here $x= (x_1,\dots, x_n).$ We can write

$$Q_j(x,y) = R_j(x) + a_jy + y(b_{j1}x_1 + \cdots +b_{jn}x_n) + c_jy^2.$$

Here $R_j$ is a quadratic on $\mathbb R^n,$ and the coefficients of $Q_j$ are those of $R_j$ along with the coefficients involving $y$ indicated above.

Looking at $(x,0),$ we see $Q_j(x,0) = R_j(x)$ converges pointwise on $\mathbb R^n,$ so the induction hypothesis shows that all sequences of coefficients of $R_j$ converge. Thus $Q_j-R_j$ converges pointwise on $\mathbb R^{n+1}.$ If we now set $x_1 = 0$ and look at the variables $(0, x_2,\dots, x_n, y),$ the induction hypothesis tells us all remaining coefficient sequences converge, except possibly $b_{j1}.$ But then, evaluating $Q_j - R_j$ at any point with $x_1 y \neq 0,$ this last coefficient sequence is forced to come along for the ride, and we have the result.
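The fact driving both answers can be made concrete: the values of a quadratic form at $m = n(n+1)/2$ generic points determine its coefficients through an invertible linear map, so pointwise convergence of values forces convergence of coefficients. A numerical sketch of this principle, with illustrative names and random data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
m = n * (n + 1) // 2          # independent coefficients of a quadratic form

# Random symmetric P defining f(x) = x^T P x (illustrative).
S = rng.standard_normal((n, n))
P = (S + S.T) / 2

def monomials(x):
    # The m monomials x_i x_j (i <= j) spanning quadratic forms on R^n.
    return np.array([x[i] * x[j] for i in range(n) for j in range(i, n)])

# Coefficient of x_i x_j in x^T P x: P_ii on the diagonal, 2 P_ij off it.
coeffs = np.array([(2 - (i == j)) * P[i, j]
                   for i in range(n) for j in range(i, n)])

pts = [rng.standard_normal(n) for _ in range(m)]
M = np.array([monomials(p) for p in pts])   # m x m, generically invertible

vals = np.array([p @ P @ p for p in pts])   # pointwise data f(p)
recovered = np.linalg.solve(M, vals)        # coefficients from values
```

Since `recovered` depends linearly (and continuously) on `vals`, convergence of the evaluations at these $m$ points already yields convergence of all coefficients.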