While studying Turán's theorem, I would like to ask about some parts of its proof. I am weak at probability, so unless I unpack and formalize everything, I cannot follow the argument. I have checked some posts on MSE too, but they either asked about different things or I just could not understand them.
I want to know why every graph $G$ of $n$ vertices has a clique of size $\geq \sum_{v\in V(G)}\frac{1}{n-d(v)}$. So, here is my understanding so far:
In a few notes I have checked, the proof starts by considering a random permutation $p$ of $V=V(G)$ and the set $Q_p$ of vertices adjacent to all vertices on their left in $p$. I can see why this is a clique. Next, consider the random variable $X=|Q_p|$. Here is my confusion: (Question 1) Does this mean the sample space is $S=\{Q_p\mid p \text{ is a permutation of }V\}$, or just $S=\{p\mid p \text{ is a permutation of }V\}$?
From that definition of $X$, I initially thought it could only mean the first sample space, but now I am guessing it is actually the second one. That means the random variable $X$ maps each $p$ to $|Q_p|$, am I right?
And so, for any vertex $v$, consider the random variable $X_v$ defined by $X_v(p)=1$ if $v\in Q_p$ and $X_v(p)=0$ otherwise. Then $X=\sum_{v\in V(G)}X_v$. (Question 2: If I use the first $S$ instead and define $X_v(Q_p)$ in the same way, does this summation still hold, whether or not it is useful?)
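To make these definitions concrete, here is a small Python sketch on a hypothetical 4-vertex graph (the graph is my own illustration, not from any particular notes): it computes $Q_p$ for each permutation $p$ and checks that $X = \sum_v X_v$ holds outcome by outcome.

```python
from itertools import permutations

# Hypothetical 4-vertex graph (illustration only), given as adjacency sets.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
V = list(adj)

def Q(p):
    """Q_p: the vertices adjacent to every vertex on their left in p."""
    return {v for i, v in enumerate(p) if all(u in adj[v] for u in p[:i])}

def X_v(v, p):
    """Indicator random variable: 1 if v lies in Q_p, else 0."""
    return 1 if v in Q(p) else 0

# X(p) = |Q_p| equals the sum of the indicators on every outcome p.
for p in permutations(V):
    assert len(Q(p)) == sum(X_v(v, p) for v in V)
```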
From here, I simply need linearity of expectation, but for that I need the quantity $E[X_v]=P(X_v=1)$. To get it, I count directly:
$$P(X_v=1)=P(\{p\mid X_v(p)=1\})=\frac{|\{p\mid X_v(p)=1\}|}{n!}.$$
The numerator is the number of permutations $p$ such that $v\in Q_p$, or equivalently such that $v$ appears before all of its non-neighbours (there are $n-d(v)-1$ of them, not counting $v$ itself). To count these, first choose the $n-d(v)$ positions occupied by $v$ and its non-neighbours ($\binom{n}{n-d(v)}$ ways), put $v$ in the first of those positions and permute the non-neighbours in the remaining ones ($(n-d(v)-1)!$ ways), and permute the other $d(v)$ vertices ($d(v)!$ ways). Thus,
$$P(X_v=1)=\frac{\binom{n}{n-d(v)}\,(n-d(v)-1)!\,d(v)!}{n!}=\frac{(n-d(v)-1)!}{(n-d(v))!}=\frac{1}{n-d(v)}.$$
And we are done. Is this a correct understanding? If I use the first sample space instead, can this result be obtained too? I am having trouble counting $Q_p$ if I use the first $S$.
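As a sanity check, on a small example the count can be verified exhaustively. The sketch below (graph chosen arbitrarily, as an illustration only) counts, for each vertex $v$, the permutations with $v\in Q_p$ and compares the resulting probability with $\frac{1}{n-d(v)}$:

```python
from itertools import permutations
from math import factorial
from fractions import Fraction

# Hypothetical 4-vertex graph (illustration only), given as adjacency sets.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
V = list(adj)
n = len(V)

def in_Q(v, p):
    """True iff v is adjacent to every vertex on its left in p."""
    i = p.index(v)
    return all(u in adj[v] for u in p[:i])

for v in V:
    d = len(adj[v])
    favourable = sum(in_Q(v, p) for p in permutations(V))
    # P(X_v = 1) = favourable / n! should equal 1/(n - d(v)).
    assert Fraction(favourable, factorial(n)) == Fraction(1, n - d)
```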
Lastly, without counting everything, is there any other way to get the $\frac{1}{n-d(v)}$ quicker? Thank you.
EDIT: The rest follows from the first moment method.
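For completeness, that last step can also be checked on a toy example: compute $E[X]$ exactly by averaging $|Q_p|$ over all permutations, confirm it equals $\sum_{v}\frac{1}{n-d(v)}$ by linearity, and confirm some outcome attains at least the mean, which is the clique the theorem promises (the graph below is again a hypothetical illustration):

```python
from itertools import permutations
from fractions import Fraction

# Hypothetical 4-vertex graph (illustration only), given as adjacency sets.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
V = list(adj)
n = len(V)

def Q_size(p):
    """|Q_p| for a permutation p."""
    return sum(all(u in adj[v] for u in p[:i]) for i, v in enumerate(p))

sizes = [Q_size(p) for p in permutations(V)]
EX = Fraction(sum(sizes), len(sizes))                 # E[X], computed exactly
bound = sum(Fraction(1, n - len(adj[v])) for v in V)  # sum of 1/(n - d(v))
assert EX == bound        # linearity of expectation
assert max(sizes) >= EX   # first moment: some outcome is at least the mean
```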
The correct sample space is in fact the set of all permutations of $V$. That's why the solution begins "consider a random permutation of $V$". If the solution were written a bit more carefully, it would begin "consider a uniformly random permutation of $V$", which tells you that you are sampling uniformly from the set of all permutations.
Using a random $Q_p$ gets you into trouble. If your sample space is the set of subsets $Q_p$ of $V$, then you are no longer sampling uniformly: different sets $Q_p$ arise from different numbers of permutations. So how are you sampling? You don't know, and then you can't compute any probabilities.
As for the quick way to see that $\Pr[X_v = 1] = \frac1{n - d(v)}$: let $S \subseteq V$ consist of $v$ together with all vertices not adjacent to $v$. We have $X_v = 1$ if and only if $v$ is the leftmost vertex of $S$ in the permutation (no matter where the vertices of $V-S$ go).
A uniformly random permutation of $V$ induces a uniformly random permutation of $S$. The probability that $v$ is the first vertex in this uniformly random permutation of $S$ is $\frac1{|S|}$, by symmetry: any of the $|S|$ vertices in $S$ is equally likely to be first. So $\Pr[X_v = 1] = \frac1{|S|} = \frac1{n-d(v)}$.
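This symmetry can also be seen concretely: restricting each permutation of $V$ to $S$ induces every ordering of $S$ equally often, so each element of $S$ is first with probability $\frac1{|S|}$. The sketch below checks this on an arbitrary $V$ and $S$ of my own choosing:

```python
from itertools import permutations
from collections import Counter

V = list(range(5))   # 5 vertices (illustration only)
S = {0, 3, 4}        # "v together with its non-neighbours", say

# The ordering of S induced by each permutation of V.
counts = Counter(tuple(v for v in p if v in S) for p in permutations(V))

# All 3! = 6 orderings of S occur, each equally often: the induced
# permutation of S is uniform, so any fixed element is first with prob 1/3.
assert len(counts) == 6
assert len(set(counts.values())) == 1
firsts = Counter(order[0] for order in counts.elements())
assert all(firsts[v] * len(S) == sum(firsts.values()) for v in S)
```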