Mike Rosing (eresrch@msn.fullfeed.com)
Mon, 20 Jul 1998 08:36:58 -0500 (CDT)
On Mon, 20 Jul 1998, Enzo Michelangeli wrote:
> Not really: any function of random variables is still a random variable with
> a different distribution. If you combine values from different samples, as,
> e.g., in a moving average, you also modify higher-order statistics
> (resulting, for example, in a different power spectrum). "Random" does not
> mean "totally wild". Conversely, random-looking sequences may actually turn
> out to be absolutely deterministic (see below).
In a hardware RNG I more or less know what the random variables are, and I
can measure how wild they are. What makes nuclear decay so interesting is
that some Nobel Prize winners don't think it is purely random: they think
there are "hidden variables". A PRNG's output is totally wild, so why isn't
that random?
> I meant what Emile Borel called "normal numbers": those whose decimal
> expansion looks random (every block of digits appears with the expected
> limiting frequency). "Absolutely normal" numbers are those whose expansion
> is random in every base.
> Rational numbers have periodic expansions, and therefore can't belong to
> that class. One can also build irrational numbers which are not normal,
> like 0.1001000100001000001... , but it is conjectured that most common
> irrational numbers (e, pi, sqrt(2), etc.) are normal. In fact, Borel proved
> that the probability that a real number in (0,1) is absolutely normal is 1,
> even though we can prove this property for precious few.
Thanks! I don't recall seeing that before; it's always nice to learn more
math.
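
Just to make that concrete, here's a toy sketch (Python; the construction
and counting are mine, not anything from Borel) that tallies digit
frequencies in the example 0.1001000100001... above. A base-10 normal number
would have every digit appear with limiting frequency 1/10; this one
obviously doesn't:

from collections import Counter

def non_normal_digits(n_runs):
    # Digits of 0.1001000100001...: a '1' followed by k zeros,
    # with the zero runs growing (k = 2, 3, 4, ...).
    out = []
    for k in range(2, n_runs + 2):
        out.append('1')
        out.extend('0' * k)
    return ''.join(out)

digits = non_normal_digits(100)
freq = Counter(digits)
for d in sorted(freq):
    print(d, freq[d] / len(digits))
# Prints only '0' and '1': '0' shows up with frequency near 1 and
# '1' near 0, nowhere close to the 1/10 apiece normality demands.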
> Anyway, this was just an example of a random-looking but deterministic
> sequence. Back to our discussion: let's now suppose I have a black box
> containing a good PRNG with provable statistical properties, such as
> Blum-Blum-Shub. Once I choose its initialization seed, it evolves in a
> perfectly deterministic fashion, so the total entropy of its output can't
> exceed the number of bits of the seed. How can you estimate that figure
> just by analyzing the output of my black box?
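
(For anyone who hasn't run into it: BBS squares modulo a Blum integer
N = p*q, with p and q prime and each congruent to 3 mod 4, and emits one
low-order bit per squaring. A bare-bones Python sketch, with toy-sized
parameters I picked for illustration; a real instance needs a large modulus
and a seed coprime to N:

def bbs_bits(seed, n_bits, p=499, q=547):
    # p and q are primes with p % 4 == q % 4 == 3 (toy sizes here).
    N = p * q
    x = seed % N
    bits = []
    for _ in range(n_bits):
        x = (x * x) % N          # the BBS iteration: x <- x^2 mod N
        bits.append(x & 1)       # output the least significant bit
    return bits

print(bbs_bits(seed=159201, n_bits=32))
)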
Entropy is a measure of the number of states. Your box will spit out a
certain number of bits per second. If the bits come out serially, so that a
new block is computed in the time it takes to output the previous one, then
the bit rate is a measure of the entropy: an upper bound, with a lower bound
to match. By studying some "reasonable range" of bit-block statistics I
should be able to narrow the bounds. I may not be able to collapse them
without guessing the PRNG algorithm, but I'll have a good idea.
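
Here's the kind of measurement I have in mind, as a rough Python sketch
(the block sizes and sample length are arbitrary choices of mine): chop the
output into k-bit blocks and compute the empirical Shannon entropy per bit,
which estimates the source's entropy rate from above.

import math, random
from collections import Counter

def block_entropy_per_bit(bits, k):
    # Empirical Shannon entropy of k-bit blocks, in bits per output bit.
    blocks = [tuple(bits[i:i + k]) for i in range(0, len(bits) - k + 1, k)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(blocks).values()) / k

random.seed(1)                    # stand-in for the black box's output
bits = [random.getrandbits(1) for _ in range(1 << 14)]
for k in (1, 2, 4, 8):
    print(k, round(block_entropy_per_bit(bits, k), 4))
# A good PRNG scores ~1.0 bit/bit at every feasible k, so the upper
# bound stays loose: collapsing it would take block lengths on the
# order of the seed size, which this kind of test can't reach.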
Patience, persistence, truth,
Dr. mike