Norm of the sum of random vectors from a unit ball


Let $x_1,\dots, x_n\in \mathbb{R}^d$ be drawn independently from the uniform distribution on the ball of radius $1$. That is, for every $i$: $x_i \sim U(\{x\in \mathbb{R}^d: \|x\|_2\leq 1\})$.

We look at the norm of the sum of the vectors: $X = \left\|\sum_{i=1}^n x_i \right\|_2$. I need the following properties of $X$:

1) $\mathbb{E}[X] = ?$

2) $Var(X) = ?$

3) Given some $\alpha >0$ what is $P(X \leq \alpha)$?

If the vectors were drawn from a standard Gaussian distribution, I could answer these questions by considering the coordinate-wise distribution of the sum, which would also be Gaussian with mean $0$ and variance $n$. Thus, we could also calculate the mean and variance of the norm of the sum of the vectors, and for the third question the probability would be exponentially small in $d$.
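For reference, the Gaussian case described above can be checked numerically. This sketch is my addition (not part of the question); it uses the fact that if each $x_i \sim N(0, I_d)$, then $X$ is $\sqrt{n}$ times a chi-distributed variable with $d$ degrees of freedom, whose mean is $\sqrt{2}\,\Gamma((d+1)/2)/\Gamma(d/2)$:

```python
import numpy as np
from math import gamma, sqrt

rng = np.random.default_rng(0)
n, d, trials = 10, 3, 200_000

# trials independent draws of n standard Gaussian vectors in R^d
x = rng.standard_normal((trials, n, d))
# norm of the sum of the n vectors, for each trial
X = np.linalg.norm(x.sum(axis=1), axis=1)

# mean of the chi distribution with d degrees of freedom, scaled by sqrt(n)
expected_mean = sqrt(n) * sqrt(2) * gamma((d + 1) / 2) / gamma(d / 2)
print(X.mean(), expected_mean)  # the two values should agree closely
```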

Also, for a uniform distribution on the cube the calculation would be easy, since the coordinates are independent. But I don't know how to do the same calculation for a uniform distribution on the unit ball (note that this is the ball, not the unit sphere: the vectors may have different norms).


There is 1 solution below.


I will add my two cents on the matter, in the hope that someone can finish the appropriate computations. Denote $r_i:=\Vert x_i\Vert_2$ for all $1\leq i\leq n$, and notice that

$$ \mathbb{P}\Big( r_i\leq r \Big)=\frac{ \vert B_{\mathbb{R}^d}(0,r)\cap B_{\mathbb{R}^d}(0,1) \vert }{ \vert B_{\mathbb{R}^d}(0,1) \vert }, $$

which gives us that

$$\mathbb{P}( r_i\leq r)=\begin{cases}0 & r<0 \\ r^{d} & 0\leq r\leq 1 \\ 1 & r>1\end{cases}$$

since $\vert B_{\mathbb{R}^d}(0,r)\vert$ scales as $r^d$.

Since $\mathbb{P}( r_i^2\leq r)=\mathbb{P}( r_i\leq \sqrt{r})$, we likewise get

$$\mathbb{P}( r_i^2\leq r)=\begin{cases}0 & r<0 \\ r^{\frac{d}{2}} & 0\leq r\leq 1 \\ 1 & r>1\end{cases}$$
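These radial CDFs are easy to verify by simulation. The sketch below is my addition; it draws uniform points in the ball by the standard trick of a uniform direction scaled by $U^{1/d}$, and compares the empirical probabilities against $r^d$ and $r^{d/2}$:

```python
import numpy as np

rng = np.random.default_rng(1)
d, samples = 4, 500_000

# uniform direction: normalized standard Gaussian
g = rng.standard_normal((samples, d))
dirs = g / np.linalg.norm(g, axis=1, keepdims=True)
# radius with CDF r^d on [0, 1]: the inverse-transform sample U^(1/d)
r = rng.random(samples) ** (1.0 / d)
pts = dirs * r[:, None]  # uniform points in the unit ball

norms = np.linalg.norm(pts, axis=1)
# P(r_i <= 0.8) should be 0.8^d, and P(r_i^2 <= 0.5) should be 0.5^(d/2)
print(np.mean(norms <= 0.8), 0.8 ** d)
print(np.mean(norms ** 2 <= 0.5), 0.5 ** (d / 2))
```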

Notice that $X^2=\left\Vert\sum_{i=1}^n x_i\right\Vert_2^2=\sum_{i=1}^n r_i^2+2\sum_{1\leq i<j\leq n}\langle x_i,x_j\rangle$. The first part, $\sum_{i=1}^n r_i^2$, is a sum of independent random variables, so convolving the above distribution $n$ times gives its distribution; moreover, the cross terms have mean zero by independence and the symmetry of the uniform distribution on the ball, so $\mathbb{E}[X^2]=n\,\mathbb{E}[r_1^2]$. Getting the full distribution of $X$, which would solve (3), additionally requires controlling the cross terms, and I am having trouble doing said convolution.
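The second moment, at least, can be checked by a quick Monte Carlo (my sketch, not part of the answer). Since the cross terms $\langle x_i, x_j\rangle$ have mean zero, $\mathbb{E}[X^2]=n\,\mathbb{E}[r_1^2]=nd/(d+2)$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, trials = 5, 3, 200_000

# uniform points in the unit ball: uniform direction times U^(1/d) radius
g = rng.standard_normal((trials, n, d))
dirs = g / np.linalg.norm(g, axis=2, keepdims=True)
radii = rng.random((trials, n)) ** (1.0 / d)
x = dirs * radii[..., None]

S = x.sum(axis=1)            # the vector sum, one row per trial
X2 = (S ** 2).sum(axis=1)    # squared norm X^2 of the sum
print(X2.mean(), n * d / (d + 2))  # both close to 3
```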

I should point out that, once that distribution is computed, the tail-sum formula for expectation gives $\mathbb{E}[X]$:

$$ \mathbb{E}[X]=\int_0^\infty (1-F_X(t))dt. $$

Using the tail-sum formula and the CDFs computed earlier, we already know that

$$ \mathbb{E}[r_i]=\int_0^1(1-x^d)\,dx=\frac{d}{d+1}, $$ and $$ \mathbb{E}[r_i^2]=\int_0^1(1-x^{\frac{d}{2}})\,dx=\frac{d}{d+2}. $$
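As a sanity check (my addition), both values match a direct simulation of the radial law, again using the assumed $U^{1/d}$ sampler for the radius:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 3
# radius of a uniform point in the unit ball has CDF r^d, so sample U^(1/d)
r = rng.random(400_000) ** (1.0 / d)
print(r.mean(), d / (d + 1))        # E[r_i]   = d/(d+1) = 0.75
print((r ** 2).mean(), d / (d + 2)) # E[r_i^2] = d/(d+2) = 0.6
```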