bill.stewart@pobox.com
Tue, 07 Jul 1998 09:11:52 -0700
>> > Seems to me like he's being very conservative about the entropy of the
>> > data source - 1 bit for two (detected) decays. Surely we can get a bit
>> > (or two) more out of it than that?
>>
>> This is cryptography. Those that are conservative live. Those that are
>> "risk-oriented" end up as "don't let this happen to you" stories in
>> the pages of a future David Kahn's book on the history of cryptography.
The number of bits you have and the number you can trust are different,
and for some applications it may be worth hanging on to the low-grade bits
even if you're still conservatively estimating the entropy.
For instance, radioactive bits may be just one seed for your
randomness pool, along with soundcard noise, keystroke timing, salt, etc.,
either because you need more bits/second than the radioactivity gives you,
or because you're concerned about TEMPEST eavesdropping detecting
some of the bits you're sending from your sub-basement Geiger counter,
or about electrical noise being added on the data lines,
or because your laptop doesn't have room for /dev/gamma and you don't want to
run out of randomness on the plane back from Anguilla.
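The idea above can be sketched roughly as follows: mix every byte from every source into a hashed pool, but only credit each source's conservatively estimated entropy. This is an illustrative sketch, not any particular implementation; the class name and the use of SHA-256 are my own choices for concreteness.

```python
import hashlib

class RandomnessPool:
    """Hash-based randomness pool: mix in bits from several sources
    (radioactive decay, soundcard noise, keystroke timing, salt, ...),
    but only credit each source's conservative entropy estimate."""

    def __init__(self):
        self.state = b""
        self.credited_bits = 0  # conservative running entropy estimate

    def mix(self, data: bytes, credited_bits: int) -> None:
        # All input bytes go into the hash state, even low-grade ones;
        # only the conservative estimate adds to the entropy credit.
        self.state = hashlib.sha256(self.state + data).digest()
        self.credited_bits += credited_bits

    def draw(self, nbytes: int) -> bytes:
        # Output is derived from the state, then the state is stepped
        # forward so outputs don't reveal it directly.
        out = hashlib.sha256(b"out" + self.state).digest()[:nbytes]
        self.state = hashlib.sha256(b"next" + self.state).digest()
        return out
```

So a Geiger sample might be credited one bit while a soundcard sample is credited a few, even though both contribute all their raw bytes to the pool.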
Doing the threshold detection on (t1<t2)?0:1 clearly gives
you a solid bit, but you may want to hash t1 and t2 into the pool as well.
Statistical analysis on the sample times may also tell you
if there's something changing in your system, whether it's
a solar flare, Geiger tube burning out, 60Hz noise from something, etc.
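One crude way to do that monitoring: compare the recent window of inter-decay intervals against the long-term mean and alarm on a large drift. The window size and sigma threshold here are illustrative, not tuned values.

```python
import math

def drift_alarm(intervals, window=100, sigmas=4.0):
    """Flag when the recent window's mean interval drifts from the
    long-term mean by more than `sigmas` standard errors. A tube
    burning out, a solar flare, or injected 60Hz noise would show up
    as such a drift (or as periodic structure a fancier test catches)."""
    if len(intervals) < 2 * window:
        return False  # not enough history to judge
    recent = intervals[-window:]
    history = intervals[:-window]
    mu = sum(history) / len(history)
    var = sum((x - mu) ** 2 for x in history) / (len(history) - 1)
    se = math.sqrt(var / window)
    recent_mu = sum(recent) / window
    return abs(recent_mu - mu) > sigmas * se
```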
Thanks!
Bill
Bill Stewart, bill.stewart@pobox.com
PGP Fingerprint D454 E202 CBC8 40BF 3C85 B884 0ABE 4639
The following archive was created by hippie-mail 7.98617-22 on Fri Aug 21 1998 - 17:20:10 ADT