Limiting distribution of $\sum_{i,j} c_{ij} v_{j}u_{ij}$ where $c_{ij}$ are iid Bernoulli, $v_j$'s are iid $N(0,1/n)$, $u_{ij}$'s are iid $N(0,1/m)$


Let $m$ and $n$ be large positive integers and let $p \in (0, 1)$. Let $U = (u_{ij})_{i,j}$ be an $m \times n$ random matrix with iid entries from $N(0,1/m)$ and let $v = (v_j)_j \in \mathbb R^{n}$ be a random vector with iid entries from $N(0,1/n)$, independent of $U$. Let $I := \{(i,j) \mid 1 \le i \le m,\;1 \le j \le n\}$ and consider the sum $$ S := \sum_{(i,j) \in I} c_{ij} v_{j}u_{ij}, $$ where $(c_{ij})_{i,j}$ is an $m \times n$ random matrix with iid Bernoulli entries with parameter $p$, independent of $U$ and $v$.

Question. In the limit $m, n\to \infty$, what is the asymptotic distribution of $S$?

Solution in the case where $m=1$ and $u_{i,j} \equiv 1$

In this case, we have $S = \sum_{j}c_{j}v_j = \sum_{j=1}^{N_c}\widetilde{v}_j$, where $N_c := \#\{ j \mid c_j = 1\}$ and $\widetilde{v}$ collects the coordinates of $v$ with $c_j = 1$ (by exchangeability of $v$, these have the same joint law as $v_1, \dots, v_{N_c}$). Since $v_j \sim N(0,1/n)$, the partial sums are already on the Brownian scale: by Donsker's theorem, as $n \to \infty$ the process $t \mapsto \sum_{j \le \lfloor nt\rfloor} \widetilde{v}_j$, $0 \le t \le 1$, converges in distribution to a standard Brownian motion $W$. Conditioned on the event $N_c = k$, $S$ therefore behaves like $W(k/n)$; since $N_c/n \to p$, this suggests $S \Rightarrow W(p) \sim N(0,p)$.
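This heuristic is easy to probe numerically. A Monte Carlo sketch (my addition, not part of the original post): for $m=1$ and $u_{ij} \equiv 1$ we have $S = \sum_j c_j v_j$ with $c_j \sim \mathrm{Bernoulli}(p)$ and $v_j \sim N(0,1/n)$, and the prediction is $S \approx W(p) \sim N(0,p)$.

```python
import numpy as np

# Simulate S = sum_j c_j v_j many times and compare its empirical mean and
# variance with the predicted N(0, p) limit.  Parameters are illustrative.
rng = np.random.default_rng(0)
n, p, reps = 500, 0.3, 10000
c = rng.random((reps, n)) < p                          # Bernoulli(p) mask
v = rng.normal(0.0, np.sqrt(1.0 / n), size=(reps, n))  # v_j ~ N(0, 1/n)
S = (c * v).sum(axis=1)
print(S.mean(), S.var())                               # expect roughly 0 and p
```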

On BEST ANSWER

I've left some integrals for you to compute. These are standard: the characteristic function of a Gaussian, and the scaled Gaussian integral $\mathbb E[\exp(-a^2G^2)]$ where $G$ is Gaussian.
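For reference, the second integral evaluates to $\mathbb E[\exp(-a^2 G^2)] = (1 + 2a^2\sigma^2)^{-1/2}$ for $G \sim N(0,\sigma^2)$. A quick Monte Carlo check (my addition, with arbitrary illustrative parameters):

```python
import numpy as np

# Check E[exp(-a^2 G^2)] = 1 / sqrt(1 + 2 a^2 sigma^2) for G ~ N(0, sigma^2).
rng = np.random.default_rng(1)
a2, sigma2, reps = 1.7, 0.5, 1_000_000
G = rng.normal(0.0, np.sqrt(sigma2), size=reps)
mc = np.exp(-a2 * G**2).mean()                 # Monte Carlo estimate
exact = 1.0 / np.sqrt(1.0 + 2.0 * a2 * sigma2) # closed form
print(mc, exact)
```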

Case $m=1, u_{i,j}=1$

We get $S_n = \sum_{k=1}^n c_k v_k$. Looking at the characteristic function of $S_n$, by independence we have:

$$\varphi_n(t) = (\mathbb E[\exp(itc_1v_1)])^n $$

And again by independence $$ \mathbb E[\exp(itc_1v_1)] = \mathbb E[ \mathbb E[\exp(itv_1c_1) | c_1]] = \mathbb E[ \exp(-\frac{t^2c_1^2}{2n})] = (1-p) + p\exp(-\frac{t^2}{2n}) $$

Hence $$ \varphi_n(t) = \Big( 1 - p\big(1 - \exp(-\tfrac{t^2}{2n})\big)\Big)^n = \Big(1-\frac{pt^2}{2n} + o\big(\tfrac{1}{n}\big)\Big)^n \to \exp\Big(-\frac{pt^2}{2}\Big),$$ which by Lévy's continuity theorem means that $S_n \Rightarrow \mathcal N(0,p)$.
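Since $\varphi_n$ is available in closed form here, the convergence can be checked directly (my addition; the sup-norm gap over a grid of $t$ values should shrink as $n$ grows):

```python
import numpy as np

# Exact characteristic function phi_n(t) = (1 - p*(1 - exp(-t^2/(2n))))^n
# versus its limit exp(-p t^2 / 2), on a grid of t values.
p = 0.3
t = np.linspace(-5.0, 5.0, 101)
limit = np.exp(-p * t**2 / 2)
for n in (10, 100, 10000):
    phi_n = (1.0 - p * (1.0 - np.exp(-t**2 / (2 * n)))) ** n
    print(n, np.abs(phi_n - limit).max())   # sup-norm gap shrinks with n
```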

Case for general $n,m$, but assuming that instead of the vector $v$ we have a matrix $V=(v_{i,j})$ of independent entries (independent of $U$ and $c$).

Let $S_{n,m}$ be that sum and $\psi_{n,m}$ its characteristic function.

We proceed similarly, computing it in stages.

$$ \psi_{n,m}(t) = (\mathbb E[ \exp(itc_{1,1}v_{1,1}u_{1,1})])^{nm} $$

And again $$ \mathbb E[\exp(itc_{1,1}v_{1,1}u_{1,1})] = \mathbb E[ \mathbb E[\exp(itc_{1,1}u_{1,1}v_{1,1}) \mid c_{1,1},u_{1,1}]] = \mathbb E\Big[\exp\Big(-\frac{t^2u_{1,1}^2c_{1,1}^2}{2n}\Big)\Big] $$

And again conditioning:

$$ \mathbb E[\exp(-\frac{t^2u_{1,1}^2c_{1,1}^2}{2n})] = \mathbb E[\mathbb E[\exp(-\frac{t^2u_{1,1}^2c_{1,1}^2}{2n}) | c_{1,1} ]] = \mathbb E [ \frac{1}{\sqrt{\frac{t^2c_{1,1}^2}{nm} + 1}}] = 1 - p + p \frac{1}{\sqrt{1+\frac{t^2}{mn}}} $$

so that:

$$\psi_{n,m}(t) = \Big (1 - p\Big(1 - \Big(1 + \frac{t^2}{nm}\Big)^{-\frac{1}{2}}\Big) \Big)^{nm} = \Big(1 - \frac{pt^2}{2nm} + O\Big(\frac{1}{(nm)^2}\Big) \Big)^{nm} \to \exp\Big(-\frac{pt^2}{2}\Big)$$ (the limit holds since $nm \to \infty$ as $n,m \to \infty$; substitute $k=nm$ and let $k \to \infty$). So again $S_{n,m} \Rightarrow \mathcal N(0,p)$.
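A direct simulation of this independent-matrix variant (my addition; the matrix sizes and $p$ are illustrative) agrees with the $N(0,p)$ limit:

```python
import numpy as np

# S_{n,m} = sum_{i,j} c_ij v_ij u_ij with independent v_ij ~ N(0, 1/n),
# u_ij ~ N(0, 1/m), c_ij ~ Bernoulli(p); prediction: S_{n,m} ~ N(0, p).
rng = np.random.default_rng(2)
m, n, p, reps = 20, 25, 0.3, 10000
c = rng.random((reps, m, n)) < p
v = rng.normal(0.0, np.sqrt(1.0 / n), size=(reps, m, n))
u = rng.normal(0.0, np.sqrt(1.0 / m), size=(reps, m, n))
S = (c * v * u).sum(axis=(1, 2))
print(S.mean(), S.var())   # expect roughly 0 and p
```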

Last case, with only the vector $v=(v_j)$

This one is harder, since not all the summands are independent. Consider $T_m(j) = \sum_{i=1}^m c_{i,j}v_j u_{i,j}$; then $S_{n,m} = \sum_{j=1}^n T_m(j)$, and $T_m(1),...,T_m(n)$ are independent, so that $$ \varphi_{n,m}(t) = (\mathbb E[\exp(it T_m(1))])^n $$ Now conditioning on $v_1$ (given $v_1$, the $m$ summands of $T_m(1)$ are iid, and each conditional characteristic function is computed as before, now with $u_{i,1} \sim N(0,1/m)$), we get $$ \varphi_{n,m}(t) = \Big( \mathbb E\Big[ \Big(1 - p + p\exp\Big(-\frac{t^2 v_1^2}{2m}\Big)\Big)^m \Big] \Big)^n $$
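This characteristic function can be evaluated by Monte Carlo over $v_1$ (my addition; parameters are illustrative). In a regime where $m$ is much larger than $n$, it should be close to $\exp(-pt^2/2)$, the characteristic function of $N(0,p)$:

```python
import numpy as np

# phi_{n,m}(t) = ( E[(1 - p + p*exp(-t^2 v_1^2 / (2m)))^m] )^n, v_1 ~ N(0, 1/n),
# estimated by averaging over many draws of v_1.
rng = np.random.default_rng(3)
p, t = 0.3, 2.0
n, m, reps = 50, 2000, 1_000_000
v2 = rng.standard_normal(reps) ** 2 / n          # v_1^2 with v_1 ~ N(0, 1/n)
inner = (1.0 - p + p * np.exp(-t**2 * v2 / (2 * m))) ** m
phi = inner.mean() ** n
print(phi, np.exp(-p * t**2 / 2))                # the two should be close
```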

Here one must be careful. For fixed $m$, the random variable $(1-p + p \exp(-\frac{t^2 v_1^2}{2m}))^m$ is less than $1$ almost surely, so its expectation is less than $1$, too. But this alone does not force $\lim_n \varphi_{n,m}(t)$ to vanish, because the law of $v_1 \sim N(0,1/n)$ itself changes with $n$. Writing $f(s) := (1-p(1-e^{-t^2 s/(2m)}))^m$, we have $f(0)=1$ and $f'(0) = -\frac{pt^2}{2}$ for every $m$, so a Taylor expansion with $v_1^2 = G^2/n$, $G \sim N(0,1)$, gives $$ \mathbb E[f(v_1^2)] = 1 - \frac{pt^2}{2n} + o\Big(\frac{1}{n}\Big), \qquad \text{hence} \qquad \lim_n \varphi_{n,m}(t) = \exp\Big(-\frac{pt^2}{2}\Big) \text{ for every fixed } m. $$ In the other order, note that $\lim_m \mathbb E[(1-p + p \exp(-\frac{t^2v_1^2}{2m}))^m] = \mathbb E[ \exp(-\frac{pt^2v_1^2}{2})]$ by the dominated convergence theorem (the random variable is bounded by $1$, and the pointwise limit is the same computation as above, now carrying the factor $v_1^2$). Since $v_1 \sim N(0,1/n)$, $\mathbb E[\exp(-\frac{pt^2v_1^2 }{2})] = (1 + \frac{pt^2}{n})^{-\frac{1}{2}}$, so that $$\lim_n \lim_m \varphi_{n,m}(t) = \lim_n \Big(1 + \frac{pt^2}{n}\Big)^{-\frac{n}{2}} = \exp\Big(-\frac{pt^2}{2}\Big)$$

To conclude, both iterated limits agree: $S_{n,m} \Rightarrow \mathcal N(0,p)$, whether $m \to \infty$ first and then $n \to \infty$, or the other way around.
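As a final sanity check (my addition; sizes are illustrative), simulating the original sum with the shared vector $v$ and both $m$ and $n$ moderately large again gives statistics consistent with $N(0,p)$; note that $\mathrm{Var}(S_{n,m}) = p$ exactly for every $m, n$.

```python
import numpy as np

# S = sum_j v_j * (sum_i c_ij u_ij) with a single shared v_j ~ N(0, 1/n),
# u_ij ~ N(0, 1/m), c_ij ~ Bernoulli(p); prediction: S ~ N(0, p).
rng = np.random.default_rng(4)
m, n, p, reps = 40, 40, 0.3, 5000
c = rng.random((reps, m, n)) < p
u = rng.normal(0.0, np.sqrt(1.0 / m), size=(reps, m, n))
v = rng.normal(0.0, np.sqrt(1.0 / n), size=(reps, n))
S = ((c * u).sum(axis=1) * v).sum(axis=1)   # inner sum over i, outer over j
print(S.mean(), S.var())                    # expect roughly 0 and p
```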