Cicero (cicero@redneck.efga.org)
15 Jul 1998 19:55:06 -0000
>>>My big question is this: are there tools for taking a set of random
>>>numbers dispersed according to a non-uniform distribution, like a
>>>Poisson or normal distribution, and turning them into a set of random
>>>numbers over a uniform distribution? Given such tools, timing
>>>intervals between the Geiger counter ticks is probably safe --
>>>otherwise, it may skew the results subtly.
>>
>>What do you see as the problems with:
>> 1. Hash the data
>
>This doesn't improve the entropy, almost by definition -
>it just hides the lack of entropy, and smears it around.
Yes. I wasn't being clear if I gave the impression that any
processing could increase entropy. Invertible processing, such as
encryption, provably neither increases nor decreases entropy.
Non-invertible processing, such as hashing, cannot increase entropy
and can decrease it. I interpret Geiger's original post as a
request for "whitening". Is there a different interpretation of:
"turning them into a set of random numbers"?
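One standard answer to the quoted question, sketched here under the
assumption that Geiger-tick intervals are exponentially distributed
with a known mean (the function name and the 0.5 s mean are my own
illustration, not from the thread): the probability integral
transform pushes a sample through its own CDF, which yields a
uniform variate.

```python
import math
import random

def exponential_to_uniform(intervals, mean):
    """Probability integral transform: if t ~ Exponential(mean),
    then CDF(t) = 1 - exp(-t / mean) is uniform on [0, 1)."""
    return [1.0 - math.exp(-t / mean) for t in intervals]

# Simulated Geiger-tick intervals (hypothetical mean of 0.5 seconds).
random.seed(1)
mean = 0.5
ticks = [random.expovariate(1.0 / mean) for _ in range(10000)]
uniform = exponential_to_uniform(ticks, mean)
```

The catch is that the mean must be estimated from the source itself,
and any estimation error biases the output -- exactly the subtle skew
the original poster worries about, and one reason to prefer hashing.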
The hashing in 1. above is intended as entropy concentration.
<snip>
>if you're going to play games with hashing or pools,
>it can make sense to keep both parts, e.g.
> Hash( raw_data, whitened_data )
Yes, if you are relying on a hash to do entropy compression, as I am
advocating here, you will always benefit by hashing everything.
Throwing away data is potentially shooting yourself in the foot,
with no possible advantage (except that the hash is a little faster).
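The Hash( raw_data, whitened_data ) suggestion above can be sketched
like this (the length prefixes are my addition, to keep the two
fields from running together ambiguously):

```python
import hashlib

def pool_hash(raw_data: bytes, whitened_data: bytes) -> bytes:
    """Keep both parts: hashing raw and whitened data together can
    never lose entropy relative to hashing either alone.  Length
    prefixes make the field boundary unambiguous."""
    h = hashlib.sha256()
    for part in (raw_data, whitened_data):
        h.update(len(part).to_bytes(8, "big"))
        h.update(part)
    return h.digest()
```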
>>What do you see as the problems with:
>> 1. Hash the data
>> 2. Encrypt the data in CBC mode with the hash as key
>>If the hash and cipher are both strong, this should be good.
>
>I don't trust the latter step - you're using the encryption
>as a hash function, which it wasn't designed for,
>rather than using a hash function that _was_ designed for hashing.
I don't think that I am using CBC as a hash in 2. I could have used
CBC-hash for 1., and your argument might be raised there, but I
didn't say what hash I was using in 1.
In 2. I am using a block cipher to mix the entropy from the hash into
the data. The size of the hash output is a bottleneck here, and this
would be a good application for a hash with a large output block. AES
will provide a 256-bit hash in either tandem or abreast Davies-Meyer.
Perhaps NIST will have a competition for a 256-bit successor to SHA-1.
Anyway, I contend that I am not using the encryption as a hash in 2.,
but as whitening.
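The two-step scheme can be sketched as follows. The standard library
has no AES, so a four-round Feistel network built from SHA-256 stands
in for a proper block cipher here -- that substitution, the zero IV,
and the zero padding are all my simplifications, not part of the
proposal above. Step 1 hashes the data; step 2 CBC-encrypts the data
with the hash as key, and the ciphertext is the whitened output.

```python
import hashlib

BLOCK = 16  # toy 128-bit block size

def _round(key: bytes, half: bytes) -> bytes:
    # Round function for the toy Feistel cipher: keyed SHA-256,
    # truncated to one half-block.
    return hashlib.sha256(key + half).digest()[:8]

def _encrypt_block(key: bytes, block: bytes) -> bytes:
    # Four-round Feistel network: a genuine permutation, standing in
    # for a real block cipher such as AES.
    left, right = block[:8], block[8:]
    for i in range(4):
        left, right = right, bytes(
            a ^ b for a, b in zip(left, _round(key + bytes([i]), right))
        )
    return left + right

def whiten(data: bytes) -> bytes:
    """Step 1: hash the data.  Step 2: encrypt the data in CBC mode
    with the hash as key (c_i = E_K(p_i XOR c_{i-1}), zero IV)."""
    key = hashlib.sha256(data).digest()
    data += b"\x00" * (-len(data) % BLOCK)  # zero-pad; fine for whitening
    prev = bytes(BLOCK)
    out = []
    for i in range(0, len(data), BLOCK):
        block = bytes(a ^ b for a, b in zip(data[i:i + BLOCK], prev))
        prev = _encrypt_block(key, block)
        out.append(prev)
    return b"".join(out)
```

Note that CBC chaining makes identical plaintext blocks produce
different ciphertext blocks, which is the whitening effect wanted
here.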
Cicero
The following archive was created by hippie-mail 7.98617-22 on Fri Aug 21 1998 - 17:20:25 ADT