Estimate the value of a maximum cut in a graph by random sampling


I have an unweighted, undirected graph $G=(V,E)$ from which I sample a set $S$ of $\frac{kn}{\epsilon^2}$ edges uniformly at random, where $k$ is a constant and $\epsilon$ is a variable parameter. From this set $S$ I want to estimate the value of the max cut $(A,B)$ of $G$, defined as $|(A\times B)\cap E|$. In particular, I must show that the quantity $$|(A\times B)\cap S|\cdot\frac{|E|}{|S|}$$ is within a factor $(1\pm\epsilon)$ of the value of the max cut $(A,B)$ of $G$. I am quite confused about how to approach this. I guess I should first compute the expected value of $|(A\times B)\cap S|$, but I am not sure how to do it. Any suggestion or reference would be much appreciated.
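For intuition, here is a minimal Python sketch (my own illustration, not part of the question) that computes the exact max-cut value of a small graph by brute force and compares it with the rescaled sample count $|(A\times B)\cap S|\cdot\frac{|E|}{|S|}$. It assumes the edges are sampled with replacement, since the question does not specify the sampling model:

```python
import itertools
import random

def max_cut_value(n, edges):
    """Exact max-cut value by trying all 2^(n-1) bipartitions (small n only)."""
    best = 0
    for mask in range(2 ** (n - 1)):
        # Vertex 0 is fixed on one side to avoid counting each cut twice.
        side = [0] + [(mask >> v) & 1 for v in range(n - 1)]
        best = max(best, sum(1 for u, v in edges if side[u] != side[v]))
    return best

def estimate_cut(edges, side, k, eps):
    """Estimate |(A x B) ∩ E| for the bipartition `side` from a sample of
    k*n/eps^2 edges drawn uniformly at random, with replacement."""
    n = len(side)
    s = int(k * n / eps ** 2)
    sample = random.choices(edges, k=s)          # uniform, with replacement
    crossing = sum(1 for u, v in sample if side[u] != side[v])
    return crossing * len(edges) / s

# Example: the complete graph K8; its max cut has 4 * 4 = 16 edges.
random.seed(0)  # for reproducibility of this sketch
edges = list(itertools.combinations(range(8), 2))
true_value = max_cut_value(8, edges)
estimate = estimate_cut(edges, [0, 0, 0, 0, 1, 1, 1, 1], k=4, eps=0.2)
```

With $k=4$ and $\epsilon=0.2$ the sample has $800$ edges (versus $|E|=28$), and the estimate lands within $(1\pm\epsilon)$ of the true value $16$ with high probability.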

EDIT: a further related question, following the answer below. Consider another cut $(C,D)$ in $G$, which is not the max cut. Its estimated value is similarly $$|(C\times D)\cap S|\cdot\frac{|E|}{|S|}.$$ What is the probability that this value exceeds $|(C\times D)\cap E|+\epsilon\cdot|(A\times B)\cap E|$? I would answer that this probability is $\le\frac{1}{kn}$, because I can apply Chebyshev's inequality (or a different one, such as Chernoff) to show that for the cut $(C,D)$ it also holds that $$\Pr[|\bar\mu-\mu|\ge\epsilon\mu]\le\frac{1}{kn},$$ which means that the probability that $$|(C\times D)\cap S|\cdot\frac{|E|}{|S|}>(1+\epsilon)\,|(C\times D)\cap E|$$ is $\le\frac{1}{kn}$, i.e. quite small. And since $(A,B)$ is the max cut, we have $$(1+\epsilon)\,|(C\times D)\cap E|\le|(C\times D)\cap E|+\epsilon\cdot|(A\times B)\cap E|,$$ so the same bound applies. Is this approach wrong?


Best answer

This is an application of a concentration inequality. Assume the edges of $S$ are sampled uniformly with replacement, so that they are i.i.d. For $1 \leq i \leq |S|$, define the random variable $$ X_i = \begin{cases} 1 & \text{if the $i$-th edge in $S$ belongs to the max cut $(A, B)$} \\ 0 & \text{otherwise} \end{cases} $$ Then we have $$ \Pr[X_i = 1] = \frac{|(A \times B) \cap E|} {|E|} \tag{$1$} $$ and $$ X_1 + X_2 + \cdots + X_{|S|} = |(A \times B) \cap S| \tag{$2$} $$

Denote by $\mu = \frac{|(A \times B) \cap E|}{|E|}$ the ratio of the number of edges in the max cut to $|E|$, and by $\bar{\mu} = \frac{|(A\times B) \cap S|}{|S|}$ an estimate of $\mu$. We have \begin{align} \mathsf{E}[\bar\mu]\ =\ &\mathsf{E}\left[\frac{|(A \times B) \cap S| }{|S|}\right] \\ {\small\text{(by Eq. $(2)$)}}\quad=\ &\frac{1}{|S|} \cdot \mathsf{E}[X_1 + \cdots + X_{|S|}] \\ {\small\text{(by linearity of expectation and Eq. $(1)$)}}\quad=\ &\frac{1}{|S|}\cdot |S| \cdot \frac{|(A \times B) \cap E|}{|E|} \\ =\ &\frac{|(A \times B) \cap E|}{|E|} = \mu \end{align} That is, $\bar{\mu}$ is an unbiased estimator of $\mu$.

Further, since the $X_i$ are independent and each $X_i$ is a Bernoulli variable with $\mathsf{Var}(X_1) = \mu(1-\mu) \leq \frac{1}{4}$, we have $$ \mathsf{Var}[\bar\mu] = \frac{1}{|S|^2} \mathsf{Var}(|(A \times B) \cap S|) = \frac{1}{|S|^2}\mathsf{Var}(X_1 + \cdots + X_{|S|}) = \frac{1}{|S|} \mathsf{Var}(X_1) \leq \frac{1}{4|S|} \tag{$3$} $$

By Chebyshev's inequality and the fact that a max cut has at least $|E| / 2$ edges, i.e., $\mu \geq \frac{1}{2}$ (a uniformly random bipartition cuts each edge with probability $\frac{1}{2}$, so the expected cut size is $|E|/2$ and the maximum is at least that), $$ \Pr[|\bar\mu - \mu| \geq \epsilon \mu] \leq \frac{\mathsf{Var}[\bar\mu]}{\epsilon^2 \mu^2} \leq \frac{1}{4|S|\cdot \epsilon^2 \cdot (1/2)^2} = \frac{1}{kn}, $$ where the last equality uses $|S| = \frac{kn}{\epsilon^2}$. In other words, $( 1- \epsilon)\mu \leq \bar\mu \leq ( 1 + \epsilon) \mu$ holds with probability at least $1 - \frac{1}{kn}$ (at least $1 - \frac{1}{n}$ when $k = 1$). Multiplying both $\mu$ and $\bar\mu$ by $|E|$ gives the result you want.
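The bound above can be checked empirically. The sketch below (my own illustration, again assuming sampling with replacement so the $X_i$ are i.i.d. Bernoulli) simulates $\bar\mu$ many times and counts how often $|\bar\mu - \mu| \geq \epsilon\mu$. Chebyshev guarantees a failure rate of at most $\frac{1}{kn}$; in practice it is far smaller, since Chebyshev is loose here:

```python
import random

def mu_bar(mu, s, rng):
    """One draw of the estimator: the mean of s i.i.d. Bernoulli(mu) samples."""
    return sum(rng.random() < mu for _ in range(s)) / s

rng = random.Random(1)
mu = 0.6                      # fraction of edges in the max cut; mu >= 1/2
n, k, eps = 10, 2, 0.1        # illustrative parameters
s = int(k * n / eps ** 2)     # |S| = k*n / eps^2 = 2000
trials = 1000
failures = sum(abs(mu_bar(mu, s, rng) - mu) >= eps * mu
               for _ in range(trials))
failure_rate = failures / trials
# Chebyshev predicts failure_rate <= 1/(k*n) = 0.05; the observed rate
# is essentially 0, because the deviation eps*mu is many standard
# deviations of mu_bar away from its mean.
```

Scaling each $\bar\mu$ by $|E|$, as in the last step of the answer, turns this into a check of the cut-value estimate itself.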