Mike Rosing (eresrch@msn.fullfeed.com)
Sun, 12 Jul 1998 20:37:35 -0500 (CDT)
On Sun, 12 Jul 1998, Carl Ellison wrote:
>
> Entropy for cryptographic purposes is a measure of the number of possible
> states (therefore, of uncertainty) from the point of view of the best
> attacker. To be most correct, we should refer to conditional entropy --
> the entropy of a system conditional on the information available to any (or
> the best) attacker.
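To make that concrete, conditional entropy is H(X|Y) = - sum over x,y of
p(x,y) log2 p(x|y): the uncertainty left in X once the attacker has seen Y.
A minimal Python sketch (my own illustration, not from Carl's message; the
example joint distribution is made up):

import math

def conditional_entropy(joint):
    """joint maps (x, y) -> p(x, y); the probabilities must sum to 1."""
    # marginal distribution of Y
    py = {}
    for (_, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    h = 0.0
    for (_, y), p in joint.items():
        if p > 0.0:
            h -= p * math.log2(p / py[y])   # p(x|y) = p(x,y) / p(y)
    return h

# X is a key bit, Y is an observation the attacker has that is correlated with it.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(conditional_entropy(joint))   # about 0.72 bits, down from the 1 bit of H(X)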
And another person sent me this:
:Entropy is conveniently defined as -Information and is proportional to
:- Sum over i of Pi log(Pi), where Pi is the probability of the
:ith "event". For one digital bit, the entropy would be
:S = - P0 log(P0) - P1 log(P1), where P0 is the probability of
:a zero, P1 the probability of a 1, and P0 + P1 = 1.
:[Khinchin, "The Mathematical Foundations of Information Theory",
:Dover, NY, 1957].
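In Python terms (my own sketch, not part of the quoted message), the one-bit
formula above looks like this:

import math

def bit_entropy(p0):
    """S = -P0 log2(P0) - P1 log2(P1), with P1 = 1 - P0."""
    s = 0.0
    for p in (p0, 1.0 - p0):
        if p > 0.0:
            s -= p * math.log2(p)
    return s

print(bit_entropy(0.5))   # 1.0 bit   -- a fair coin, maximum uncertainty
print(bit_entropy(0.9))   # ~0.47 bit -- a biased bit carries much less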
Entropy is a measure of possible states. What are all the possible
states the attacker knows? They know the input mechanism for bit
generation. If a pass phrase is used, then a keyboard using ASCII is the
most likely generator. If a time base is used, the attacker has to know
something about the hardware (how well it counts time, how well it
records events, etc.). We can assume the attacker does not know the key,
but we can also assume they can find a duplicate piece of equipment and
measure the hardware to their heart's content.
If the attacker knows the generator, then they know the entropy of the
system being attacked (because they can measure the above probabilities).
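As a sketch of what "measuring the probabilities" could look like (the
pass-phrase sample below is invented, and the frequency count is only a
first-order estimate):

import math
from collections import Counter

def per_symbol_entropy(sample):
    """Estimate bits per symbol from observed symbol frequencies."""
    counts = Counter(sample)
    total = len(sample)
    h = 0.0
    for c in counts.values():
        p = c / total
        h -= p * math.log2(p)
    return h

sample = "the quick brown fox jumps over the lazy dog" * 10
print(per_symbol_entropy(sample))   # about 4.4 bits/char, well under log2(128) = 7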
If a "good" source of randomness (many states) is mixed with a "bad"
source of randomness (few states), chances are the "good" source is lost.
Unless the "mixing" is concatenation, I don't see how to overcome this.
(I think a continuous hash of the concatenation may be a good mixer, but
it's not very practical).
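For what it's worth, a rough sketch of the "hash of the concatenation" idea
(the choice of SHA-1 and the stand-ins for the two sources are mine, just
for illustration):

import hashlib
import os
import time

def mixed_seed():
    good = os.urandom(32)                      # stand-in for a many-state source
    bad = int(time.time()).to_bytes(8, "big")  # stand-in for a few-state source
    # Concatenate first, then hash, so the weak source cannot collapse the
    # strong one's states before they are combined.
    return hashlib.sha1(good + bad).digest()

print(mixed_seed().hex())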
Patience, persistence, truth,
Dr. mike