Expected value in probabilistic method


I am asked to prove the following statement:

For every connected, undirected and simple graph $G=(V,E)$, one can always find a vertex $v\in V$ such that $$\sum_{x\in\Gamma (v)}\frac{\deg(x)}{\deg(v)}\geqslant \frac{\sum_{x\in V}\deg(x)^2}{2\lvert E\rvert}$$ where $\lvert E\rvert$ denotes the number of edges and $\Gamma(v)$ the neighborhood of $v$.

When I first saw the problem, I thought it might be a good example on which to apply the probabilistic method (I have recently been reading about it, but have never actually used it before).

My idea was to choose a vertex $w\in V$ uniformly at random, and to consider \begin{align*}X:V&\longrightarrow \{0,1\}\\ v&\longmapsto \begin{cases}1, &\text{if } v\in \Gamma(w)\\0, &\text{otherwise}\end{cases} \end{align*} as well as \begin{align*} Y:V&\longrightarrow \{0,1\}\\ v&\longmapsto \begin{cases}1, &\text{if $v=w$}\\0, &\text{otherwise}\end{cases} \end{align*} (I am not very confident about this choice). Next, we determine the following expected value: \begin{align*} \mathbb{E}\left[\frac{\sum_{v\in V}X(v)\cdot \deg(v)}{\sum_{v\in V}Y(v)\cdot \deg(v)}\right]&\stackrel{{\color{red}{?}}}{=} \frac{\mathbb{E}\left[\sum_{v\in V}X(v)\cdot \deg(v)\right]}{\mathbb{E}\left[{\sum_{v\in V}Y(v)\cdot \deg(v)}\right]}\\ &= \frac{\sum_{v\in V}\mathbb{E}[X(v)]\cdot \deg(v)}{\sum_{v\in V}\mathbb{E}[Y(v)]\cdot \deg(v)}\\ &= \frac{\sum_{v\in V}\frac{\deg(v)}{\lvert V\rvert}\cdot \deg(v)}{\sum_{v\in V}\frac{1}{\lvert V\rvert}\cdot \deg(v)}\\ &= \frac{\sum_{v\in V}\deg(v)^2}{2\lvert E\rvert} \end{align*} If the calculations are right (which I am not sure about), we would be done.

Problem. I somehow feel uncomfortable with the definition of $X,Y$ and the way the expected value is applied here, but cannot come up with anything better. Somehow, I feel this lacks formality…

Also, I am not sure whether the identity $\color{red}{?}$ holds. As far as I can remember, $\mathbb{E}[AB]=\mathbb{E}[A]\mathbb{E}[B]$ holds if $A,B$ are uncorrelated, but I do not know how to check whether that is the case here.


Edit. The identity $\color{red}{?}$ does not seem to hold due to dependency. I also noticed that $$\sum_{v\in V}Y(v)\cdot \deg(v)=\sum_{v\in V}X(v),$$ so we do not really need to introduce $Y$.
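To see concretely that the identity $\color{red}{?}$ fails, here is a small sanity check in Python on the path graph $0-1-2$ (the graph is just an illustrative choice, not part of the problem), using exact rational arithmetic. With $w$ uniform on $V$, write $A = \sum_{x\in\Gamma(w)}\deg(x)$ and $B = \deg(w)$:

```python
from fractions import Fraction

# Path graph 0 - 1 - 2, stored as an adjacency dict (illustrative example).
adj = {0: [1], 1: [0, 2], 2: [1]}
deg = {v: len(ns) for v, ns in adj.items()}
n = len(adj)
num_edges = sum(deg.values()) // 2  # handshake lemma: sum of degrees = 2|E|

# A(w) = sum of deg(x) over neighbours x of w;  B(w) = deg(w).
A = {w: sum(deg[x] for x in adj[w]) for w in adj}
B = deg

# Left side: E[A/B] with w chosen uniformly on V.
lhs = sum(Fraction(A[w], B[w]) for w in adj) / n

# Right side: E[A]/E[B] = (sum deg^2 / |V|) / (sum deg / |V|) = sum deg^2 / (2|E|).
rhs = Fraction(sum(d * d for d in deg.values()), 2 * num_edges)

print(lhs, rhs)  # 5/3 vs 3/2 -- the two sides differ
```

On this graph $\mathbb{E}[A/B] = 5/3$ while $\mathbb{E}[A]/\mathbb{E}[B] = 3/2$, so the identity genuinely fails in general.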

Accepted answer.

Suppose $U$ is a random variable on $V$ with $P(U = v) = \dfrac{\deg(v)}{2|E|}$ for $v \in V$. Then\begin{gather*} \mathbb{E}\left[ \frac{1}{\deg(U)} \sum_{x \in \Gamma(U)} \deg(x) \right] = \sum_{v \in V} \left( \frac{1}{\deg(v)} \sum_{x \in \Gamma(v)} \deg(x) \right) \cdot \frac{\deg(v)}{2|E|}\\ = \frac{1}{2|E|} \sum_{v \in V} \sum_{x \in \Gamma(v)} \deg(x) = \frac{1}{2|E|} \sum_{x \in V} \deg(x) \sum_{v \in \Gamma(x)} 1 = \frac{1}{2|E|} \sum_{x \in V} \deg(x)^2. \end{gather*} Since $V$ is finite, there exists $v \in V$ such that$$ \frac{1}{\deg(v)} \sum_{x \in \Gamma(v)} \deg(x) \geqslant \frac{1}{2|E|} \sum_{x \in V} \deg(x)^2. $$
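The computation above can be checked numerically. The following sketch uses a small hand-picked graph (a triangle with a pendant vertex — my choice, not from the answer) and exact rational arithmetic to confirm both the expectation identity and the existence of a vertex attaining the bound:

```python
from fractions import Fraction

# Triangle 0-1-2 with pendant vertex 3 attached to 2 (illustrative example).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
deg = {v: len(ns) for v, ns in adj.items()}
m = sum(deg.values()) // 2  # |E|

# f(v) = (1/deg(v)) * sum of deg(x) over neighbours x of v.
f = {v: Fraction(sum(deg[x] for x in adj[v]), deg[v]) for v in adj}

# Expectation of f(U) under the degree-biased law P(U = v) = deg(v) / (2|E|).
expectation = sum(f[v] * Fraction(deg[v], 2 * m) for v in adj)
target = Fraction(sum(d * d for d in deg.values()), 2 * m)

# The expectation equals sum deg^2 / (2|E|), as derived above ...
assert expectation == target
# ... so some vertex achieves at least the average value.
assert max(f.values()) >= target
```

Since $V$ is finite, the maximum of $f$ over $V$ is at least its mean, which is exactly the first-moment argument the answer uses.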


A rigorous construction of $U$, along with the probability space on which it is defined: choose $\Omega = V$, $\mathscr{F} = 2^V$, and define$$ P(A) = \frac{1}{2|E|} \sum_{v \in A} \deg(v) \quad \forall A \in \mathscr{F}; $$ then $(\Omega, \mathscr{F}, P)$ is a probability space. Now define $U(\omega) = \omega$ for $\omega \in \Omega$; then $U$ is a random variable on $(\Omega, \mathscr{F}, P)$ with the desired distribution.
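That $P$ is indeed a probability measure follows from the handshake lemma $\sum_{v\in V}\deg(v) = 2|E|$, which gives $P(\Omega) = 1$. A minimal mechanical check (graph again an illustrative choice):

```python
from fractions import Fraction

# Path graph 0 - 1 - 2 (illustrative example).
adj = {0: [1], 1: [0, 2], 2: [1]}
deg = {v: len(ns) for v, ns in adj.items()}
m = sum(deg.values()) // 2  # |E|

# Degree-biased point masses: P({v}) = deg(v) / (2|E|).
P = {v: Fraction(deg[v], 2 * m) for v in adj}

total = sum(P.values())
assert total == 1                              # P(Omega) = 1 by the handshake lemma
assert all(0 <= p <= 1 for p in P.values())    # each point mass is a valid probability
```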