Problem Detail: I’m taking the MIT OpenCourseWare Introduction to Algorithms course and I’m having trouble understanding the first homework problem and its solution. We are supposed to compare the asymptotic complexity (big-O) of different functions:

$f_1(n) = n^{0.999999}\log(n)$

$f_2(n) = 10000000n$

$f_2(n)$ is obviously $O(n)$, but the big-O given for $f_1(n)$ confuses me and I don’t follow the given solution. The solution says $f_1(n)$ has less big-O complexity than $f_2(n)$: “recall that for any $c > 0$, $\log(n)$ is $O(n^c)$. Therefore we have: $f_1(n) = n^{0.999999}\log(n) = O(n^{0.999999}n^{0.000001}) = O(n) = O(f_2(n))$”. I don’t understand the logic underlying the solution; I may be forgetting something simple. Can someone break it down for me? Thanks!
Asked By : chrishiestand
Answered By : Gilles
If $g = O(g')$ and $h$ is nonnegative, then $h \cdot g = O(h \cdot g')$. In English: if $g$ grows no faster than $g'$, then multiplying both by a nonnegative function doesn't change that fact. (If you aren't sure about this, do the proof with quantifiers.) Since $\log(n) = O(n^{0.000001})$, it follows that $n^{0.999999} \cdot \log(n) = O(n^{0.999999} \cdot n^{0.000001})$. Since $n^{0.999999} \cdot n^{0.000001} = n$, we have $f_1(n) = O(n)$. While it is true that $f_2(n) = O(n)$, this is not what's relevant here; what you need is $n = O(f_2(n))$, which is also true. Intuitively, any positive power of $n$ trumps a logarithm, and a multiplicative constant is never significant for big-O.
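The key fact that $\log(n) = O(n^c)$ for every $c > 0$ can be checked numerically. Here is a minimal, illustrative Python sketch; the exponent $c = 0.1$ is chosen so the turnaround is visible at computable sizes, since for $c = 0.000001$ the ratio $\log(n)/n^c$ doesn't start shrinking until around $n = e^{10^6}$, far beyond anything you can tabulate, even though it still tends to $0$:

```python
import math

def ratio(n, c):
    """log(n) / n**c; tends to 0 as n -> infinity for any fixed c > 0."""
    return math.log(n) / n**c

# With c = 0.1 the ratio peaks near n = e^(1/c) = e^10 (about 22026),
# then shrinks: the power n^c eventually overtakes the logarithm.
for n in [10, 10**4, 10**8, 10**16]:
    print(f"n = {n:>20}: log(n)/n^0.1 = {ratio(n, 0.1):.4f}")
```

The same eventual decay happens for every $c > 0$, just later; big-O only cares about that limiting behavior, not about what happens at small sample points.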
Best Answer from Stack Exchange
Question Source : http://cs.stackexchange.com/questions/14864