- Is there an $O(n \log n)$ algorithm which does not use the FFT?
- Are better algorithms known (i.e., $o(n \log n)$)? (FFT allowed.)
Asked By : Aryabhata
Answered By : Realz Slaw
```python
# http://cs.stackexchange.com/q/11418/2755

def F(a):
    # Square the polynomial sum_i x^{a[i]} (all coefficients are 1).
    # Returns a list of (exponent, coefficient) pairs of the square.
    n = len(a)
    for i in range(n):
        assert a[i] >= 0
    S = {}  # exponent r -> coefficient of x^r
    for ai in a:
        for aj in a:
            r = ai + aj
            if r not in S:
                S[r] = 0
            S[r] += 1
    return list(S.items())

def SQUARE(x):
    # Square the integer x by squaring the polynomial whose exponents
    # are the positions of the set bits of x.
    x = int(x)
    a = []
    i = 0
    while x != 0:
        if x & 1 == 1:
            a += [i]
        x >>= 1
        i += 1
    print('a:', a)
    P2 = F(a)
    print('P^2:', P2)
    s = 0
    for e, c in P2:
        s += (1 << e) * c
    return s
```
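As a sanity check on this reduction, the identity can be verified directly: if $a$ lists the set-bit positions of $x$, and $c$ is the multiplicity of each pairwise sum $a_i + a_j$, then $\sum_{e} c_e \cdot 2^e = x^2$. A minimal self-contained check (the name `sumset_counts` is illustrative, not from the answer):

```python
from collections import Counter

def sumset_counts(a):
    # Multiplicities of the pairwise sums a_i + a_j: these are exactly
    # the coefficients of the squared polynomial sum_i x^{a[i]}.
    return Counter(ai + aj for ai in a for aj in a)

# Verify the reduction: reassembling the squared polynomial at x = 2
# recovers the integer square, for every x in a small range.
for x in range(1, 200):
    bits = [i for i in range(x.bit_length()) if (x >> i) & 1]
    s = sum(c << e for e, c in sumset_counts(bits).items())
    assert s == x * x
```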
3. Thus, squaring is at most as hard as this problem.
4. Therefore, integer squaring is equivalent to this problem (each is at most as hard as the other, by (1), (2), and (3)).

Now, it is unknown whether integer/polynomial multiplication admits bounds better than $\mathcal{O}(n \log n)$; in fact, the best known multiplication algorithms all use the FFT and have running times like $\mathcal{O}(n \log n \log \log n)$ (the Schönhage–Strassen algorithm) and $\mathcal{O}\left(n \log n \cdot 2^{\mathcal{O}(\log^* n)}\right)$ (FĂŒrer's algorithm). Arnold Schönhage and Volker Strassen conjectured a lower bound of $\Omega(n \log n)$, and so far this conjecture seems to be holding.

This does not mean your use of the FFT is quicker: the $\mathcal{O}(n \log n)$ bound for the FFT counts arithmetic operations, not bit complexity, so it ignores the cost of the smaller multiplications on the coefficients; when those are accounted for recursively, the cost comes closer to that of the FFT-based multiplication algorithms listed above (see "Where is the mistake in this apparently-O(n lg n) multiplication algorithm?").

5. Now, your problem is not exactly multiplication; it is squaring. So is squaring easier? This is an open problem: squaring is not known to admit a faster algorithm than multiplication. If you could find a better algorithm for your problem than one based on multiplication, that would likely be a breakthrough.

So as of now, the answer to both your questions is no: all known ~$\mathcal{O}(n \log n)$ multiplication algorithms use the FFT, and squaring is as hard as multiplication. And unless a faster algorithm for squaring is found, or multiplication breaks the $\mathcal{O}(n \log n)$ barrier, your problem cannot be solved faster than $\mathcal{O}(n \log n)$; in fact, it cannot currently be solved in $\mathcal{O}(n \log n)$ at all, since the best multiplication algorithms only approach that complexity.
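To make the operation-count point concrete, here is a sketch of squaring a 0/1-coefficient polynomial with $\mathcal{O}(n \log n)$ *arithmetic operations* via a textbook recursive Cooley–Tukey FFT (the names `fft` and `square_poly` are illustrative; the bit-level cost of the complex arithmetic is exactly what this count ignores):

```python
import cmath

def fft(a, invert=False):
    # Recursive Cooley-Tukey FFT; len(a) must be a power of two.
    # With invert=True, computes the inverse transform without the 1/n scaling.
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = -1 if invert else 1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def square_poly(coeffs):
    # Square a polynomial with small integer coefficients:
    # transform, square pointwise, transform back, round to integers.
    n = 1
    while n < 2 * len(coeffs):
        n *= 2
    a = [complex(c) for c in coeffs] + [0j] * (n - len(coeffs))
    fa = fft(a)
    fa = [v * v for v in fa]
    inv = fft(fa, invert=True)
    return [round(v.real / n) for v in inv]
```

For example, the set-bit positions of 21 give the polynomial $1 + x^2 + x^4$ (coefficients `[1, 0, 1, 0, 1]`); its square has coefficients `1, 0, 2, 0, 3, 0, 2, 0, 1`, and evaluating that at $x = 2$ yields $441 = 21^2$, matching the naive `F` above.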
Best Answer from StackExchange
Question Source : http://cs.stackexchange.com/questions/11418