Right away I want to say that I am not really a math person, so I am asking for a bit of help here. The problem I am studying is the maximum clique in random graphs. I was looking at the lecture notes here, and I am wondering how to prove the "Exercise" at the very beginning. The same thing appears in Bollobás's Combinatorica paper, where again he says "it is easily checked that..." when computing $d(n)$ at the top of the second page (numbered 420). What is considered easy to those authors does not seem easy at all to me, so can somebody please clarify how to do these "easy" things?
Thank you for your help.
EDIT: you have a function $g(k) = \binom{n}{k} p^{\binom{k}{2}}$, which is the expected number of $k$-cliques in the random graph on $n$ vertices with edge probability $p$ (a $k$-clique has $\binom{k}{2}$ edges, each present with probability $p$). The question is to find the largest $k$ for which this expectation is at least $1$. As far as I understand, that $k$ is somewhere close to $2\log(n)/\log(\frac{1}{p})$. I am looking for a rigorous proof of this with every "trivial" step spelled out, because I am a CS person and even easy tricks can be difficult for me. I need this proof in order to apply it to slightly different but similar expressions.
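To get a feel for the claim before proving it, here is a small numerical sketch (my own addition, not from the notes or the paper): it evaluates $g(k) = \binom{n}{k} p^{\binom{k}{2}}$ exactly, finds the largest $k$ with $g(k) \geq 1$, and compares it with the asymptotic value $2\log(n)/\log(1/p)$. The function names are mine; the formula is the one from the question (with exponent $\binom{k}{2}$).

```python
import math

def expected_cliques(n: int, k: int, p: float) -> float:
    """Expected number of k-cliques in G(n, p): C(n, k) * p^C(k, 2)."""
    return math.comb(n, k) * p ** math.comb(k, 2)

def largest_k(n: int, p: float) -> int:
    """Largest k for which the expected number of k-cliques is >= 1."""
    k = 1  # g(1) = n >= 1 always holds
    while expected_cliques(n, k + 1, p) >= 1:
        k += 1
    return k

n, p = 10_000, 0.5
k_star = largest_k(n, p)
approx = 2 * math.log(n) / math.log(1 / p)  # asymptotic first-order value
print(k_star, approx)
```

Note that for moderate $n$ the exact threshold sits noticeably below $2\log(n)/\log(1/p)$; the lower-order terms (of order $\log\log n$) are not negligible at this scale, which is exactly why the rigorous computation matters.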
An answer can be found in the following book, in the section about the largest independent set.
https://www.math.cmu.edu/~af1p/BOOK.pdf