Enzo Michelangeli (em@who.net)
Mon, 20 Jul 1998 08:54:52 +0800
From: Mike Rosing <eresrch@msn.fullfeed.com>
Date: Sunday, July 19, 1998 10:50 AM
[...]
>Then the average autocorrelation should be zero over lots of intervals for
>something "approaching" white asymptotically.
This is common to both RNG's and PRNG's.
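As a quick numerical check, here is a minimal Python sketch (illustrative parameters only) estimating the sample autocorrelation of a PRNG stream at a few nonzero lags; for anything "approaching white" the values should hover near zero, whether the source is a hardware RNG or a PRNG:

import random

N = 100000
x = [random.random() - 0.5 for _ in range(N)]   # roughly zero-mean samples

def autocorr(x, lag):
    # naive sample autocorrelation at the given lag, normalized by variance
    n = len(x) - lag
    num = sum(x[i] * x[i + lag] for i in range(n)) / n
    var = sum(v * v for v in x) / len(x)
    return num / var

for lag in (1, 2, 5, 10):
    print(lag, autocorr(x, lag))    # all close to 0 for a white-ish stream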
>> So, you are in a _double_ state of sin: you think that algorithms can do
>> better than a real random generator! :-)
>
>Algorithms massage real data, be they op amps or digitizers. Algorithms
>for PRNG are not random at all; they can be tuned to pass any kind of
>test for randomness one can think of. Real "randomness" is an untunable
>signal; to be useful it must be outside our direct ability to control.
Not really: any function of random variables is still a random variable with
a different distribution. If you combine values from different samples, as,
e.g., in a moving average, you also modify higher-order statistics
(resulting, for example, in a different power spectrum). "Random" does not
mean "totally wild". Conversely, random-looking sequences may actually turn
out to be absolutely deterministic (see below).
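To make that concrete, a minimal Python sketch (standard library only, my own toy example): a 2-tap moving average of i.i.d. uniform samples is still a random variable, but its distribution becomes triangular and it acquires a lag-1 correlation of about 0.5, i.e. a different power spectrum:

import random

N = 100000
u = [random.random() for _ in range(N)]
y = [(u[i] + u[i + 1]) / 2 for i in range(N - 1)]   # 2-tap moving average

def lag1_corr(x):
    # sample correlation between consecutive values
    m = sum(x) / len(x)
    c0 = sum((v - m) ** 2 for v in x) / len(x)
    c1 = sum((x[i] - m) * (x[i + 1] - m) for i in range(len(x) - 1)) / (len(x) - 1)
    return c1 / c0

print("lag-1 correlation of u:", lag1_corr(u))   # close to 0
print("lag-1 correlation of y:", lag1_corr(y))   # close to 0.5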
>What is a Borel number? The estimate of entropy can be done statistically
>like temperature, with enough data you can get the probability to within
>some useful range. The math looks like it'll be fun.
I meant what Emile Borel called "normal numbers": those whose decimal
expansion looks random. "Absolutely normal" numbers are those whose expansion
is random in any base.
Rational numbers have periodic expansions, and therefore can't belong to that
class. One can also build irrational numbers which are not normal, like
0.1001000100001000001... , but it is conjectured that the common irrational
constants (e, pi, sqrt(2), etc.) are normal. In fact, Borel proved that the
probability that a real number in (0,1) is absolutely normal is 1, even
though we can prove this property for precious few of them.
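For concreteness, a minimal Python sketch (my own construction, following the pattern above) that builds a prefix of 0.1001000100001... and tallies its digit frequencies; they are nowhere near the 1/10 per digit that normality in base 10 would require:

run = 2                       # number of zeros after the first '1'
digits = []
while len(digits) < 10000:
    digits.append('1')
    digits.extend('0' * run)  # each block of zeros is one digit longer
    run += 1
digits = digits[:10000]

for d in '01':
    print(d, digits.count(d) / len(digits))   # '0' dominates, so not normal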
Anyway, this was just an example of random-looking but deterministic
sequences. Back to our discussion: let's now suppose I have a black box
containing a good PRNG with provable statistical properties, such as
Blum-Blum-Shub. Once I choose its initialization seed, it evolves in a
perfectly deterministic fashion, so the total entropy of its output can't
exceed the number of bits of the seed. How can you estimate that figure
just by analyzing the output of my black box?
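For reference, a toy Blum-Blum-Shub sketch in Python (parameters of my own choosing, far too small for real use) showing the point: once the seed is fixed, every output bit is determined, so however long a stream the box emits, its total entropy is bounded by the size of the secret state:

p, q = 499, 547              # toy primes, both congruent to 3 mod 4
n = p * q

def bbs_bits(seed, count):
    x = (seed * seed) % n    # start from a quadratic residue mod n
    out = []
    for _ in range(count):
        x = (x * x) % n      # x_{i+1} = x_i^2 mod n
        out.append(x & 1)    # emit the least significant bit
    return out

print(bbs_bits(123457, 32))  # the same seed always yields the same bits
print(bbs_bits(123457, 32))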
Enzo