Almost constant degree after removing edges in random graph


Let a graph be drawn from $G(n,p)$ with $p=\frac{x_1}{n}$, and let $x_2,x_3>0$ be given parameters. I would like to prove that there exists a number $k$ depending ONLY on $x_1,x_2,x_3$ such that, with probability at least $1-x_3$, one can remove at most $nx_2$ edges so that the remaining graph has maximum degree less than $k$.
My idea is to use the fact that removing every edge $e=(u,v)$ for which $u$ or $v$ has degree greater than $k$ ensures that the maximum degree is at most $k$. I tried to bound the number of such edges, aiming to show that in expectation there are at most $nx_2$ of them, but I am not sure how to prove this and would be glad for a direction.


Accepted answer

I'm worried that removing all edges incident to a high-degree vertex will remove too many edges. The highest degree in $G(n, \frac{x_1}{n})$ is $\mathcal O(\log n)$, not constant. Of course, there are very few vertices of that degree, but you would have to know the tail of the degree distribution fairly precisely to show that the sum of the degrees of all vertices with degree at least $k$ is $\mathcal O(n)$ for sufficiently large $k$.

Instead, why not just remove the excess edges from each vertex?

The degree of a fixed vertex $v$ of $G(n, \frac{x_1}{n})$ is asymptotically Poisson with mean $x_1$. In particular, for any constant $k$, the probability that $\deg v \ge k$ is some constant, which can be made arbitrarily small by increasing $k$. A decent upper bound on this probability is that when $X \sim \operatorname{Poisson}(x_1)$ and $k>x_1$, $$\Pr[X \ge k] \le e^{-x_1}\left(\frac{ex_1}{k}\right)^k.$$ (This is a Chernoff-type bound for Poisson random variables that ought to be standard, but somehow isn't. At one point, I was acutely feeling the lack of it, and found the above inequality in this paper. It's proved in the usual way, by applying Markov's inequality to $\Pr[e^{sX} \ge e^{sk}]$ and optimizing the choice of $s$.)
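As a sanity check, the tail bound above can be verified numerically with a short stdlib-only sketch (the truncation point `terms=200` is an arbitrary cutoff, far enough out that the discarded mass is negligible for small means):

```python
import math

def poisson_tail(lam, k, terms=200):
    """Pr[X >= k] for X ~ Poisson(lam), by summing the pmf up to a cutoff."""
    term = math.exp(-lam)  # Pr[X = 0]
    total = term if k == 0 else 0.0
    for j in range(1, k + terms):
        term *= lam / j    # Pr[X = j], computed iteratively to avoid overflow
        if j >= k:
            total += term
    return total

def chernoff_bound(lam, k):
    """The bound e^{-lam} * (e*lam/k)^k, valid for k > lam."""
    assert k > lam
    return math.exp(-lam) * (math.e * lam / k) ** k

# Check the bound dominates the true tail for a range of k, with x_1 = 2:
x1 = 2.0
for k in range(3, 15):
    assert poisson_tail(x1, k) <= chernoff_bound(x1, k)
```

The iterative pmf update (`term *= lam / j`) avoids computing large factorials directly, which would overflow float conversion for large `j`.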

If $X = \deg v$ and we remove $\max\{X-k,0\}$ edges from vertex $v$, then the expected number of edges removed per vertex is constant: assuming $k > ex_1$, it is at most $$\sum_{t=k}^\infty \Pr[X \ge t] < e^{-x_1} \sum_{t=k}^\infty \left(\frac{ex_1}{k}\right)^t = e^{-x_1} \left(\frac{e x_1}{k}\right)^k \frac{k}{k - ex_1}.$$ We can make this constant as small as we like by increasing $k$. So choose $k$ such that this value is less than $x_2 x_3$.
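To make the choice of $k$ concrete, the geometric-series bound can be evaluated directly and the smallest admissible $k$ found by scanning upward (a sketch; the function names and the example parameter values $x_1=2$, $x_2=x_3=0.1$ are illustrative, not from the original argument):

```python
import math

def excess_bound(x1, k):
    """Upper bound e^{-x1} (e*x1/k)^k * k/(k - e*x1) on the expected number
    of excess edges at a vertex; valid only when k > e*x1, where the
    geometric series converges."""
    assert k > math.e * x1
    return math.exp(-x1) * (math.e * x1 / k) ** k * k / (k - math.e * x1)

def choose_k(x1, x2, x3):
    """Smallest k > e*x1 with excess_bound(x1, k) < x2 * x3."""
    k = math.floor(math.e * x1) + 1
    while excess_bound(x1, k) >= x2 * x3:
        k += 1
    return k
```

For example, `choose_k(2.0, 0.1, 0.1)` returns 9: the bound at $k=9$ is about $3.7\times 10^{-3} < x_2 x_3 = 0.01$, while at $k=8$ it is still about $0.019$.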

In expectation, then, we remove at most $x_2 x_3$ edges per vertex, for at most $x_2 x_3 n$ edges removed in total. Again, this is still just an expected value; but if $Y$ is the total number of edges removed, then $\mathbb E[Y] \le x_2 x_3 n$, and by Markov's inequality, $$\Pr[Y \ge x_2 n] \le \frac{\mathbb E[Y]}{x_2 n} \le x_3.$$
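The whole argument can be stress-tested with a small Monte Carlo experiment (a sketch: $n$, the trial count, the seed, and the parameters are arbitrary choices, and $\sum_v \max\{\deg v - k, 0\}$ is used as an upper bound on the number of edges removed):

```python
import random

def excess_edges(n, x1, k, rng):
    """Sample G(n, x1/n) and return sum over v of max(deg v - k, 0),
    an upper bound on the edges removed to cap every degree at k."""
    p = x1 / n
    deg = [0] * n
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:  # each pair is an edge independently
                deg[u] += 1
                deg[v] += 1
    return sum(max(d - k, 0) for d in deg)

rng = random.Random(0)
n, x1, x2, x3, k = 500, 2.0, 0.1, 0.1, 9
trials = 10
good = sum(excess_edges(n, x1, k, rng) <= x2 * n for _ in range(trials))
print(good / trials)  # empirical success rate; should be well above 1 - x3
```

With these parameters the expected total excess is tiny (on the order of $10^{-2}$ per graph), so nearly every trial removes far fewer than $x_2 n = 50$ edges, consistent with the Markov bound being very loose here.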