Dave Emery (die@pig.die.com)
Mon, 6 Jul 1998 21:12:34 -0400
On Mon, Jul 06, 1998 at 06:46:51PM -0400, Perry E. Metzger wrote:
>
> Ben Laurie writes:
> > Perry E. Metzger wrote:
> > >
> >
> > Seems to me like he's being very conservative about the entropy of the
> > data source - 1 bit for two (detected) decays. Surely we can get a bit
> > (or two) more out of it than that?
>
> This is cryptography. Those that are conservative live. Those that are
> "risk-oriented" end up as "don't let this happen to you" stories in
> the pages of a future David Kahn's book on the history of
> cryptography.
>
> Personally, I don't know if one bit per two decays is sufficiently
> conservative for my tastes.
>
> People really have to get it through their heads that this is one
> field where, when you don't know an answer, you *have* to behave as
> though the worst is true, not the best.
>
> Perry
Perhaps I'm missing something big time, but it seems to me that
one could time the interval between radioactive events detected by the
GM tube with a fast binary hardware counter driven by a stable quartz
crystal oscillator. Such a counter would be expected to accumulate many
tens of thousands to many millions of counts between events, and one
could then use some hash function of the low order few bits of the
count as several bits of random entropic data, thus harvesting several
to many bits per decay. There are some extremely subtle biases that
could in theory sneak into such a scheme if not carefully engineered
[GM tube dead time and non-random noise in the signal processing
electronics following the tube come to mind], but my understanding of
radioactive decay is that it is a truly random process, so the interval
between events seen by an ideal detector should itself be a random
interval. And the art of measuring time intervals with high precision
and repeatability is a very well developed and mature technology.
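	Something like the following rough sketch is what I have in mind.
The read_counter() and gm_pulse_wait() names are purely hypothetical
stand-ins for whatever the real hardware interface looks like, and the
number of bits kept per event is just an illustrative guess:

    #include <stdint.h>

    /* Hypothetical hardware hooks -- names assumed for illustration only. */
    extern uint32_t read_counter(void);  /* free-running counter clocked by the crystal */
    extern void gm_pulse_wait(void);     /* blocks until the GM tube detects a decay */

    #define BITS_PER_EVENT 4             /* conservative: keep only a few low order bits */

    /* Timestamp successive decays and keep the low order bits of each
     * inter-event interval.  The raw output should still be hashed or
     * otherwise whitened before use. */
    void harvest(uint8_t *out, int nbytes)
    {
        uint32_t last, now, delta;
        int bits = 0, byte = 0;
        uint8_t acc = 0;

        gm_pulse_wait();
        last = read_counter();

        while (byte < nbytes) {
            gm_pulse_wait();
            now = read_counter();
            delta = now - last;          /* unsigned subtraction handles counter wrap */
            last = now;

            acc = (uint8_t)((acc << BITS_PER_EVENT) |
                            (delta & ((1u << BITS_PER_EVENT) - 1)));
            bits += BITS_PER_EVENT;
            if (bits >= 8) {
                out[byte++] = acc;
                bits = 0;
                acc = 0;
            }
        }
    }

The raw bytes from something like this would then be fed through a
cryptographic hash, as suggested above, rather than used directly.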
	As someone with an EE rather than a mathematical cryptography
background, I am always a bit amazed at the problems that some software
types see in finding randomness - from an EE perspective it is
everywhere, in the form of random noise that has to be kept at bay by
careful design techniques to keep it from causing errors every few
thousand or million or billion operations of a deterministic digital
system. Designing most hardware is an exercise in creating circuits in
which the probability of nondeterministic behavior or data errors caused
by noise is low enough not to be of concern, but never zero.
	Any electrical resistive device at a temperature above zero
kelvin has Johnson noise across its terminals, and this noise is just as
theoretically statistically random as the "noise" of radioactive decay.
It is also much easier to conveniently and safely harvest than
radioactive sources and detectors. The roaring white noise that comes out
of an FM radio tuned to an empty channel is an example of Johnson noise
from the RF front end of the receiver amplified to high levels, and it
should be a good source of random bits provided that there is no signal
sneaking in. And coming up with similar (and more reliably random)
sources of Johnson noise is easy.
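	For a feel for the magnitudes involved (my numbers, not anything
from the thread), the standard Johnson noise formula
Vrms = sqrt(4 k T R B) works out to a few tens of microvolts for
ordinary component values:

    #include <math.h>
    #include <stdio.h>

    /* Back-of-envelope Johnson (thermal) noise: Vrms = sqrt(4 k T R B).
     * The resistance and bandwidth are example values of my own choosing. */
    int main(void)
    {
        const double k = 1.38e-23;       /* Boltzmann constant, J/K */
        const double T = 300.0;          /* room temperature, kelvin */
        const double R = 100e3;          /* 100 kilohm resistor */
        const double B = 1e6;            /* 1 MHz measurement bandwidth */

        double vrms = sqrt(4.0 * k * T * R * B);
        printf("Johnson noise: %.1f microvolts rms\n", vrms * 1e6);
        return 0;
    }

That is roughly 40 microvolts rms, which is why the receiver front end
analogy fits - the noise has to be amplified a great deal before it is
large enough to sample cleanly.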
	At the very most, simple minded approaches to harvesting Johnson
noise may introduce very slight biases in the numbers of zero or one
bits, or correlations between adjacent bits, but there are a number of
post processing techniques that eliminate these errors, and more
sophisticated sampling techniques can eliminate most of them to
begin with.
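	The classic example of such post processing is von Neumann's
debiasing trick: look at the raw bits in pairs, emit a bit only when the
two differ, and throw the pair away otherwise. A minimal sketch of my
own (the function name and calling convention are just for
illustration):

    #include <stddef.h>
    #include <stdint.h>

    /* von Neumann debiasing: raw_bits holds one raw bit per byte (LSB).
     * A 01 pair emits 0, a 10 pair emits 1, and 00/11 pairs are discarded.
     * The output is unbiased provided the raw bits are independent and
     * identically distributed.  Returns the number of bits produced. */
    size_t von_neumann(const uint8_t *raw_bits, size_t nraw,
                       uint8_t *out_bits, size_t max_out)
    {
        size_t produced = 0;

        for (size_t i = 0; i + 1 < nraw && produced < max_out; i += 2) {
            uint8_t a = raw_bits[i] & 1;
            uint8_t b = raw_bits[i + 1] & 1;
            if (a != b)
                out_bits[produced++] = a;    /* 01 -> 0, 10 -> 1 */
        }
        return produced;
    }

Note that this only removes bias, not correlation between adjacent bits;
correlated raw data needs a stronger whitener, such as hashing blocks of
the raw stream.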
--
Dave Emery N1PRE, die@die.com  DIE Consulting, Weston, Mass.
PGP fingerprint = 2047/4D7B08D1 DE 6E E1 CC 1F 1D 96 E2 5D 27 BD B0 24 88 C3 18