Asked By : user1374864
Answered By : HdM
I know that $E(X_k)=\tbinom{n}{k}\cdot p^{\tbinom{k}{2}}$, but how do I prove it?
You use linearity of expectation and some smart rewriting. First of all, note that $$ X_k = \sum_{T \subseteq V,\, |T|=k} \mathbb{1}[T \text{ is a clique}].$$ Now, when taking the expectation of $X_k$, one can simply pull the sum out (due to linearity) and obtain $$ \mathrm{E}(X_k) = \sum_{T \subseteq V,\, |T|=k} \mathrm{E}(\mathbb{1}[T \text{ is a clique}]) = \sum_{T \subseteq V,\, |T|=k} \mathrm{Pr}[T \text{ is a clique}].$$ Linearity holds even though the indicators of overlapping subsets are dependent, so pulling the sum out lets us ignore those dependencies entirely. So, what is the probability that $T$ is a clique? No matter which $k$ nodes $T$ consists of, each of the ${k \choose 2}$ possible edges inside $T$ is present independently with probability $p$, and all of them must be present. Therefore, $\mathrm{Pr}[T \text{ is a clique}] = p^{k \choose 2}$. The inner term of the sum then no longer depends on $T$, leaving us with $$\mathrm{E}(X_k) = p^{k \choose 2} \sum_{T \subseteq V,\, |T|=k} 1 = {n \choose k} \cdot p^{k \choose 2}.$$
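As a quick sanity check (not part of the proof), here is a minimal Monte Carlo sketch in Python that compares the empirical average number of $k$-cliques in $G(n, p)$ against the formula ${n \choose k} p^{k \choose 2}$; the parameter values `n = 12`, `p = 0.5`, `k = 3` are arbitrary choices of mine, not from the question.

```python
import itertools
import math
import random

def count_k_cliques(n, p, k, rng):
    """Sample one G(n, p) graph and count its k-cliques."""
    # Each of the binom(n, 2) possible edges is present independently with prob. p.
    edges = {frozenset(e) for e in itertools.combinations(range(n), 2)
             if rng.random() < p}
    count = 0
    for T in itertools.combinations(range(n), k):
        # T is a clique iff all binom(k, 2) edges inside T are present.
        if all(frozenset(e) in edges for e in itertools.combinations(T, 2)):
            count += 1
    return count

n, p, k, trials = 12, 0.5, 3, 2000
rng = random.Random(0)
empirical = sum(count_k_cliques(n, p, k, rng) for _ in range(trials)) / trials
theory = math.comb(n, k) * p ** math.comb(k, 2)
print(f"empirical E(X_k) ~ {empirical:.2f}, formula gives {theory:.2f}")
```

The two printed numbers should agree up to sampling noise, illustrating that the dependencies between overlapping subsets do not affect the expectation.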
How to show that for $n \rightarrow \infty$: $E(X_{\log_2 n}) \ge 1$?
I am not entirely sure whether this is even correct. Applying the standard bound ${n \choose k} \le \left(\frac{ne}{k}\right)^k$ to the binomial coefficient, we obtain $$E(X_{\log n}) = {n \choose \log n} \cdot p^{{\log n \choose 2}} \leq \left(\frac{n e\, p^{\frac{\log n}{4}}}{\log n}\right)^{\log n} = \left(\frac{ne \cdot n^{(\log p)/4}}{\log n}\right)^{\log n}.$$ (Note that I roughly upper bounded $p^{\frac{-1 + \log n}{2}}$ by $p^{\frac{\log n}{4}}$, which is valid for $n \ge 4$ since $p < 1$.) However, now one could choose $p = 0.001$ and obtain $\log_2 0.001 \approx -9.97$, which makes the whole term go to $0$ for large $n$. Are you perhaps missing some assumptions on $p$?
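To see the claimed decay numerically, here is a short sketch of my own (an illustration, not part of the answer) that evaluates ${n \choose \log_2 n} \cdot p^{{\log_2 n \choose 2}}$ exactly for powers of two, with $p = 0.001$ as in the text:

```python
import math

p = 0.001
for exp in range(3, 9):      # n = 8, 16, ..., 256, so log2(n) is an integer
    n = 2 ** exp
    k = exp                  # k = log2(n)
    # Exact value of E(X_k) = binom(n, k) * p^binom(k, 2).
    ev = math.comb(n, k) * p ** math.comb(k, 2)
    print(f"n = {n:4d}, k = {k}: E(X_k) = {ev:.3e}")
```

For this $p$ the printed values shrink rapidly toward $0$ rather than staying above $1$, consistent with the objection that the claim cannot hold without further assumptions on $p$.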
Question Source : http://cs.stackexchange.com/questions/2118