If a sequence of quadratic forms converges pointwise, then does the associated sequence of matrices also converge?


Consider the sequence of functions $f_k : \mathbb R^n \to [0,\infty)$ defined as $f_k(x) = x^TP_kx$, where $(P_k)_{k \in \mathbb N}$ is a sequence of symmetric positive semi-definite $n \times n$ matrices, i.e. $P_k \in \mathbb S_+^n$ for every $k$. If $f_k$ converges pointwise to the function $f : \mathbb R^n \to [0,\infty)$ defined as $f(x) = x^TPx$, where $P \in \mathbb S_+^n$, then is it true that $P_k$ converges to $P$? I think that the answer is "yes", but I'm not sure how to complete the following proof:

To show that $P_k \to P$, we want to show that $\forall \varepsilon_2 > 0, \exists N_2 \in \mathbb N,k \geq N_2 \implies \|P_k - P\|_F < \varepsilon_2$, where $\|\cdot\|_F$ is the Frobenius norm, i.e. $\|P_k - P\|_F = \sqrt{\sum_{i=1}^n\sum_{j=1}^n \left|p_{ij}^{[k]} - p_{ij}\right|^2}$, where $p_{ij}^{[k]}$ is the $(i,j)$ element of $P_k$ and $p_{ij}$ is the $(i,j)$ element of $P$.

Because $f_k(x) = x^TP_kx$ converges pointwise to $f(x) = x^TPx$, we have $\forall x \in \mathbb R^n, \forall \varepsilon_1 > 0, \exists N_1 \in \mathbb N, k \geq N_1 \implies |x^T(P_k - P)x| < \varepsilon_1$. So, pick $x$ to be the vector of all $1$'s, so that $$ \begin{align} |x^T(P_k - P)x| &= \left|\sum_{i=1}^n \sum_{j=1}^n (p_{ij}^{[k]} - p_{ij})\right| \\ &\leq \sum_{i=1}^n \sum_{j=1}^n \left|p_{ij}^{[k]} - p_{ij}\right| \end{align} $$ However, I'm not sure how to then pick $\varepsilon_1$ (to get the corresponding $N_1$) and then relate $|x^T(P_k - P)x|$ to $\|P_k - P\|_F$ above via $\varepsilon_2$ to get the corresponding $N_2$, so I would appreciate any hints.
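As a numeric sketch (using an arbitrary made-up $P$ and perturbation, not anything from the question), the identity above — that $x = (1,\dots,1)^T$ turns $x^T(P_k - P)x$ into the plain sum of all entry differences — can be checked with NumPy:

```python
import numpy as np

# Hypothetical example: verify that for x = (1, ..., 1),
# x^T (P_k - P) x equals the sum of all entries of P_k - P.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
P = A @ A.T                      # symmetric PSD "limit" matrix
E = rng.standard_normal((n, n))
E = (E + E.T) / 2                # symmetric perturbation
P_k = P + 0.1 * E                # one term of the sequence

ones = np.ones(n)
lhs = ones @ (P_k - P) @ ones    # x^T (P_k - P) x with x all ones
rhs = (P_k - P).sum()            # sum of all entry differences
print(np.isclose(lhs, rhs))     # the two quantities agree
```

Note that opposite-signed entry differences can cancel in this sum, which is one way to see why this single choice of $x$ cannot, by itself, control the Frobenius norm.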


2 Answers

Best answer

Consider $f_k(e_i)$, where $e_i$ is the $i$th unit vector; this turns out to be equal to $p_{i,i}^{(k)}$. Thus by assumption $\lim_k p_{i,i}^{(k)}=\lim_kf_k(e_i)=f(e_i)=p_{i,i}$. Next, for $i\not=j$, consider $f_k(e_i+e_j)$, which equals $p_{i,i}^{(k)}+2p_{i,j}^{(k)}+p_{j,j}^{(k)}$, and argue as above.
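The polarization idea behind this answer — that every entry of $P$ can be read off from values of $f(x) = x^TPx$ at the vectors $e_i$ and $e_i + e_j$ — can be sketched numerically (with an arbitrary made-up symmetric $P$ as the assumption):

```python
import numpy as np

# Sketch: recover every entry of a symmetric matrix P from values of
# f(x) = x^T P x at standard basis vectors and their pairwise sums.
rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
P = A @ A.T                              # arbitrary symmetric PSD example

def f(x, M=P):
    return x @ M @ x

I = np.eye(n)
Q = np.empty((n, n))
for i in range(n):
    Q[i, i] = f(I[i])                    # f(e_i) = p_ii
for i in range(n):
    for j in range(i + 1, n):
        # f(e_i + e_j) = p_ii + 2 p_ij + p_jj, so solve for p_ij
        Q[i, j] = Q[j, i] = (f(I[i] + I[j]) - Q[i, i] - Q[j, j]) / 2

print(np.allclose(Q, P))                 # entries fully recovered
```

Since each entry is a fixed finite combination of values of $f$, pointwise convergence of $f_k$ at these finitely many vectors forces entrywise convergence of $P_k$.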


For the sake of clarity, here is a complete answer based on @JensSchwaiger's answer.

Because $f_k \to f$ pointwise, $f_k(e_i) \to f(e_i)$, where $e_i$ is the $i$th standard basis vector (all $0$'s except a $1$ in the $i$th position). That is, $$ \begin{align} \lim_{k \to \infty} f_k(e_i) = f(e_i) &\implies \lim_{k \to \infty} e_i^TP_ke_i = e_i^TPe_i = p_{ii} \\ \end{align} $$ Similarly, because $f_k \to f$ pointwise, $f_k(e_i + e_j) \to f(e_i + e_j)$, so that $$ \begin{align} \lim_{k \to \infty} f_k(e_i + e_j) = f(e_i + e_j) &\implies \lim_{k \to \infty} e_i^TP_ke_i + 2e_i^TP_ke_j + e_j^TP_ke_j = e_i^TPe_i + 2e_i^TPe_j + e_j^TPe_j \\ &\implies \lim_{k \to \infty} p_{ii}^{[k]} + 2p_{ij}^{[k]} + p_{jj}^{[k]} = p_{ii} + 2p_{ij} + p_{jj} \\ &\implies \lim_{k \to \infty} p_{ii}^{[k]} + \lim_{k \to \infty} 2p_{ij}^{[k]} + \lim_{k \to \infty} p_{jj}^{[k]} = p_{ii} + 2p_{ij} + p_{jj} \\ &\implies p_{ii} + \lim_{k \to \infty} 2p_{ij}^{[k]} + p_{jj} = p_{ii} + 2p_{ij} + p_{jj} \\ &\implies \lim_{k \to \infty} 2p_{ij}^{[k]} = 2p_{ij} \\ &\implies \lim_{k \to \infty} p_{ij}^{[k]} = p_{ij} \end{align} $$ where splitting the limit in the third line is justified because $\lim_{k \to \infty} p_{ii}^{[k]}$ and $\lim_{k \to \infty} p_{jj}^{[k]}$ were already shown to exist. Therefore, $\forall (i,j) \in \{1,\dots,n\}^2, \forall \varepsilon_3 > 0,\exists N_3^{[ij]} \in \mathbb N, k \geq N_3^{[ij]} \implies \left|p_{ij}^{[k]} - p_{ij}\right| < \varepsilon_3$. For convenience, for every $\varepsilon_3 > 0$, let $N_3 = \max\left\{N_3^{[ij]} \mid (i,j) \in \{1,\dots,n\}^2\right\}$ (note that $N_3^{[ij]}$ is implicitly a function of $\varepsilon_3$ for each $(i,j)$), so that $\forall \varepsilon_3 > 0, \exists N_3 \in \mathbb N, \forall (i,j) \in \{1,\dots,n\}^2, k \geq N_3 \implies \left|p_{ij}^{[k]} - p_{ij}\right| < \varepsilon_3$.

Finally, the proof proceeds as follows. Fix any $\varepsilon_2 > 0$, choose $\varepsilon_3 = \varepsilon_2 / n$, and get the corresponding $N_3 \in \mathbb N$. Then, let $N_2 = N_3$ and suppose that $k \geq N_2$, so that $$ \begin{align} \|P_k - P\|_F &= \sqrt{\sum_{i=1}^n\sum_{j=1}^n \left|p_{ij}^{[k]} - p_{ij}\right|^2} \\ &< \sqrt{\sum_{i=1}^n\sum_{j=1}^n \left(\frac{\varepsilon_2}{n}\right)^2} \\ &= \sqrt{n^2 \cdot \frac{\varepsilon_2^2}{n^2}} \\ &= \varepsilon_2 \end{align} $$ as desired.
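The final bound — entrywise control below $\varepsilon_2 / n$ implies Frobenius-norm control below $\varepsilon_2$ — can also be spot-checked numerically (with an arbitrary made-up difference matrix as the assumption):

```python
import numpy as np

# Numeric check of the final bound: if every entry of the difference
# matrix D = P_k - P has magnitude below eps2 / n, then ||D||_F < eps2.
rng = np.random.default_rng(2)
n = 5
eps2 = 0.3
D = rng.uniform(-1, 1, size=(n, n))
D = (eps2 / n) * 0.99 * D / np.abs(D).max()   # scale so |d_ij| < eps2 / n
fro = np.sqrt((np.abs(D) ** 2).sum())         # Frobenius norm of D
print(fro < eps2)
```

This reflects the general inequality $\|D\|_F \le n \max_{i,j} |d_{ij}|$ for an $n \times n$ matrix, which is exactly what the displayed chain of inequalities uses.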