Question Detail: There are many applications where a pseudo-random number generator is used, so people implement one they think is great, only to find later that it's flawed. Something like this happened recently with the JavaScript random number generator, and much earlier with RandU. There are also issues of inappropriate initial seeding for something like the Mersenne Twister.

I cannot find examples of anyone combining two or more families of generators with the usual XOR operator. If there is sufficient computer power to run things like java.SecureRandom or Twister implementations, why do people not combine them? ISAAC xor XORShift xor RandU should be a fairly good example, where the weakness of a single generator is mitigated by the others. It should also help with the distribution of numbers into higher dimensions, as the underlying algorithms are totally different.

Is there some fundamental principle that says they shouldn't be combined? If you were to build a true random number generator, people would probably advise you to combine two or more sources of entropy. Is my example different?

I'm excluding the common example of several linear feedback shift registers working together, as they're from the same family.
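The combination the question describes can be sketched in a few lines. The sketch below XOR-combines one draw from a 32-bit xorshift generator with one draw from a linear congruential generator; these two generators and their constants are standard textbook choices picked here for illustration, not generators named in the post, and this says nothing about the statistical or cryptographic quality of the result.

```python
# Illustrative sketch only: XOR-combining outputs of two PRNGs from
# different families. Generator choices and constants are assumptions,
# not taken from the original question.

def xorshift32(state):
    """One step of Marsaglia's 32-bit xorshift; state must be nonzero."""
    state ^= (state << 13) & 0xFFFFFFFF
    state ^= state >> 17
    state ^= (state << 5) & 0xFFFFFFFF
    return state, state  # (output, new state)

def lcg32(state):
    """One step of a 32-bit linear congruential generator
    (Numerical Recipes constants)."""
    state = (1664525 * state + 1013904223) & 0xFFFFFFFF
    return state, state  # (output, new state)

def combined(seed_a, seed_b, n):
    """Return n values, each the XOR of one draw from each generator."""
    out = []
    sa, sb = seed_a, seed_b
    for _ in range(n):
        a, sa = xorshift32(sa)
        b, sb = lcg32(sb)
        out.append(a ^ b)
    return out
```

Each generator keeps its own independent state; only the outputs are XORed together, which is the scheme the question is asking about.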
Asked By : Paul Uszak
Best Answer from StackExchange
Question Source : http://cs.stackexchange.com/questions/57648
Answered By : Amateur
IIRC (and this is from memory), the RAND Corporation's 1955 bestseller A Million Random Digits did something like this. Before computers were cheap, people picked random numbers out of this book. The authors generated random bits with electronic noise, but the output turned out to be biased (it's hard to make a flip-flop spend exactly equal time on the flip and the flop). However, combining bits made the distribution much more uniform.
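The debiasing effect the answer alludes to can be checked with simple arithmetic. If two independent bits are each 1 with probability p, their XOR is 1 with probability p(1-p) + (1-p)p, which is always closer to 1/2 than p is. The numbers below are a worked illustration, not figures from the RAND book.

```python
# Worked example (illustrative numbers, not from the RAND book):
# XOR of two independent biased bits is less biased than either bit.

def xor_one_prob(p1, p2):
    """Probability that the XOR of two independent bits is 1,
    given each bit's probability of being 1."""
    return p1 * (1 - p2) + (1 - p1) * p2

# Two bits that are each 1 with probability 0.6 (bias 0.10 from fair):
# their XOR is 1 with probability 0.48 (bias only 0.02 from fair).
print(xor_one_prob(0.6, 0.6))  # 0.48
```

This is why combining independent noisy bits pushes the distribution toward uniform, mirroring the question's intuition about combining independent entropy sources.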