Consider random vector $\boldsymbol{x} \in \mathbb{R}^n$ drawn uniformly from the $n$-dimensional unit ball. Denote the coordinates of $\boldsymbol{x}$ by $x_1,\dots,x_n$.
A few trivial facts about $\boldsymbol{x}$:
- Every $x_k$ is a scalar random variable with mean $0$.
- Due to symmetry, all $x_k$'s share the same variance, say $\sigma^2$.
- The random variables $x_1,\dots,x_n$ are NOT independent.
My goal is to obtain an analytical upper bound on $\Pr(\sum_{k=1}^n x_k^2 a_k \leq c)$, where $a_k>0$ for $k=1,\dots,n$ and $c>0$ are given constants.
Using a multivariate version of Chebyshev's inequality, I could easily derive: \begin{equation} \Pr\left(\sum_{k=1}^n x_k^2 a_k \leq c\right) \leq \frac{\sigma^2}{c} \sum_{k=1}^n a_k \,. \end{equation}
Then I use the fact that, for the uniform distribution over the unit ball, the variance of each coordinate is $\sigma^2=1/(n+2)$, to obtain the final form of my bound.
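As a sanity check, here is a small Monte Carlo sketch that verifies the coordinate variance $1/(n+2)$ and compares the Chebyshev-style bound with the empirical probability. The specific choices $a_k = 1$, $c$ equal to half the (empirical) mean of $\sum_k a_k x_k^2$, and the values of $n$ and the sample size are my own illustrative assumptions, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_ball(n, size):
    """Draw `size` points uniformly from the n-dimensional unit ball."""
    g = rng.standard_normal((size, n))
    g /= np.linalg.norm(g, axis=1, keepdims=True)  # uniform on the unit sphere
    r = rng.uniform(size=(size, 1)) ** (1.0 / n)   # radius with density ~ r^(n-1)
    return g * r

n, m = 50, 200_000          # illustrative dimension and sample size
x = sample_ball(n, m)

# Coordinate variance should be close to 1/(n+2).
print(x[:, 0].var(), 1 / (n + 2))

a = np.ones(n)              # illustrative weights a_k = 1
s = (x**2 * a).sum(axis=1)  # the random variable sum_k a_k x_k^2
c = 0.5 * s.mean()          # illustrative threshold below the mean

emp = (s <= c).mean()               # empirical probability
bound = (a.sum() / (n + 2)) / c     # sigma^2 * sum(a_k) / c
print(emp, bound)
```

In this setting the bound exceeds $1$ (hence is vacuous) while the empirical probability is essentially zero, which illustrates how loose the Chebyshev-style bound can be for moderate $n$.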
However, the multivariate Chebyshev's inequality that I use seems to produce a loose bound. I am wondering if there are better ideas for achieving a sharper upper bound on the above probability.
While I want the upper bound to be valid in any setting, I am particularly interested in the large-$n$ regime (in particular, I do not want the bound to become trivial as $n\rightarrow \infty$).
Thanks
Golabi