Vin McLellan (vin@shore.net)
Wed, 24 Mar 1999 13:19:51 -0500
This is a second bulletin from the Rome AES2 conference, as posted
to the sci.crypt newsgroup by the perceptive Dianelos Georgoudis, one of the
designers of FROG, an AES candidate. The NIST website offers most of the
AES2 papers at:
<http://csrc.ncsl.nist.gov/encryption/aes/round1/conf2/aes2conf.htm>.
_Vin
----------
Subject: Re: Live from the Second AES Conference
Date: Wed, 24 Mar 1999 14:18:38 GMT
From: dianelos@tecapro.com
Newsgroups: sci.crypt
Here is my report on the second and last day of AES2.
First came five papers with analysis of several algorithms (in
general it was felt that the analytic results of the first round
were relatively few):
Sean Murphy described properties of Twofish pertaining to its key
schedule. First it was shown that the whitening keys (XORed with
the data before and after the main cipher) are not uniformly
distributed, i.e. they do not take all possible values. In a similar vein,
each 8-byte round subkey also cannot take all possible values.
Ideally, of course, all key material within a cipher should be as
close to random as possible, otherwise an attacker gains some
predictive power. Later in the afternoon, members of the Twofish
team explained exactly how much entropy was lost: the whitening
key has 117 bits of entropy (instead of the optimal 128) and each
round key has 63.2 bits of entropy (instead of 64). They claimed
this is not a useful cryptanalytic property, which does seem
reasonable.
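Out of curiosity I hacked up a toy illustration in Python (my own
lossy mixing function, nothing to do with Twofish's real key
schedule) of how a non-uniform key-derivation step loses entropy:

import random
from collections import Counter
from math import log2

def toy_subkey(key: int) -> int:
    # Deliberately lossy mix: OR-ing the two key halves biases every
    # bit toward 1, so some subkey values are far likelier than others.
    return ((key & 0xFF) | (key >> 8)) & 0xFF

counts = Counter(toy_subkey(random.getrandbits(16)) for _ in range(200_000))
total = sum(counts.values())
entropy = -sum((c / total) * log2(c / total) for c in counts.values())
print(f"empirical subkey entropy: {entropy:.2f} bits (optimum: 8.00)")

The printout lands around 6.5 bits, the same flavor of shortfall as
the 117-out-of-128 figure above, only much exaggerated.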
John Kelsey, also from the Twofish design team, described a
weakness in Safer+'s key schedule (256 bit key variant) that led
to a possible attack requiring 2^200 encryptions - clearly a very
theoretical attack. He finished by suggesting an improvement to the
cipher's key schedule. Later in the afternoon, Safer+'s designer,
James Massey, conceded that his main attention had been invested in
the 128-bit variant and described a similar fix.
Vincent Rijmen, one of Rijndael's designers, explained the attack
on LOKI97, the first serious published attack against an AES
candidate: a differential attack that needs 2^56 chosen plaintexts
and a linear attack that needs 2^56 known plaintexts but works on
only 2^(-25) of the keys.
Then came David Wagner's exposition of the attack on Frog. David,
by the way, is still only a student in California. I liked his
description of Frog as a cipher where the "wires" are parametrized
by the key. This means that Frog looks like a large collection of
individual ciphers with different internal wiring. The basic point
is that the attacker can search and choose the instances where
this wiring defines a weaker cipher, and then mount an attack on
the weak key that gave rise to this configuration. (For those
familiar with Frog the weak wiring happens when the bombpermu
pointer addresses the previous byte.) It turns out that 2^(-33) of
the keys are now weak and that in these cases the key can be
broken with 2^58 chosen plaintexts. The decryption function of
Frog has much slower diffusion (something first noticed by John
Savard, a frequent participant in sci.crypt) which allows for a
more successful attack that requires 2^36 chosen ciphertexts and
works for 2^(-29.3) of the keyspace. A linear attack on Frog was
also described.
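To make the wiring idea concrete, here is a rough Python sketch - my
own toy model, not FROG's actual key setup, so the fraction it prints
is nothing like the real 2^(-33) - that tests a key-derived pointer
table for the weak configuration:

import random

BLOCK = 16       # FROG operates on 16-byte blocks
TRIALS = 100_000

def has_weak_wiring(pointers):
    # Weak instance: some position's pointer addresses the byte just
    # before it - the configuration Wagner's attack exploits.
    return any(p == (i - 1) % BLOCK for i, p in enumerate(pointers))

weak = sum(
    has_weak_wiring([random.randrange(BLOCK) for _ in range(BLOCK)])
    for _ in range(TRIALS)
)
print(f"toy fraction of weak instances: {weak / TRIALS:.3f}")

In this toy model the pointers are uniform, so weak instances are
common; FROG's real key setup makes the exploitable case far rarer. A
fix of the kind discussed below could simply reject such pointer
tables during key setup.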
Frog-like ciphers do have this kind of problem: by putting some or
most of the functional definition of the cipher within the key,
the possibility exists that some of these functions will be weak.
I suppose there are ways to fix Frog in order to strengthen it
against the differential and linear characteristics discovered by
Wagner and achieve a much lower or zero number of weak keys,
trivially: suppress the weak wiring configurations. (I did ask him
to propose a fix but he was not forthcoming - I wonder whether
using ADDs instead of XORs for diffusion would cut it). The really
interesting question though is to find out if a generalized
version of Frog that just follows its design principles to a
higher degree would increase or decrease the frequency of Wagner
weak keys. The whole idea in Frog is to do nothing designed
specifically as a defense against a particular attack. So if
Generalized Frog turns out to be weaker against Wagner's attack
then it is bad news for the basic idea. But, if it turns out that
"purer" (even less structured) forms of Frog are more resistant,
then confidence in Frog's design principle will increase. So
Wagner's attack can be used as a platform to evaluate the design
methodology, something even more important than the candidate
cipher itself.
Finally Eli Biham described the attack against Magenta. The attack
works with 2^64 chosen plaintexts and 2^64 complexity, or with 2^33
known plaintexts but 2^97 complexity. In the afternoon, Magenta's
Klaus Huber gave a very short rebuttal, explaining that this
weakness can easily be fixed and that Magenta does have some
interesting theoretical and implementational properties. In
software it is relatively slow though.
Next came a series of theoretical papers about various candidates:
Jacques Stern of DFC described an update of the cipher that is
faster and can be made scalable to allow for different block sizes
(a useful property of several other candidates). The DFC team
worked really hard to have decorrelation theory serve as the
theoretical underpinning of their cipher. (Also it seems that
decorrelation modules can be inserted into other ciphers such as
E2.) He explained that the proofs apply to an idealized version of
the cipher that DFC approximates. By the way, later this
afternoon somebody asked whether "proofs" are really important in
cipher design. Bruce Schneier's reasonable comment was that any
proven property in a cipher increases the quality of its analysis
and hence its security. He also made clear that proofs discuss a
specific threat model and that general proofs for all possible
threats are not really possible. Well, I tried to go the other way
and asked Jacques whether DFC's designers had put any thought into
defending against unknown attacks (the proofs on DFC pertain only
to first order differential attacks). He clearly disliked the
question, and his answer was somewhat patronizing in the sense that
it is meaningless to discuss unknowns. There was laughter around
the room but I really think this is no laughing matter: Every ten
years or so a new powerful cryptanalytic technique is discovered.
In AES's life span maybe five new types of attack will be
published. Only in the last year, unforeseen and extremely powerful
attacks against smart cards were demonstrated. Also, several
ciphers including Mars, E2 and Twofish (not to mention Frog) did
include in their documentation at least some reasoning about their
strength against unknown attacks. I understand that mathematicians
prefer to work with well-defined problems and dismiss vague
problems as not worth thinking about, but ciphers are
machines to be used in the real world where the givens are not
always clear-cut.
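For the curious, here is a minimal Python sketch of a pairwise
decorrelation module of the kind DFC builds on: an affine map over a
prime field, truncated back to the block size. The modulus
p = 2^64 + 13 is my recollection of the DFC spec, so treat the
constants as assumptions:

P = 2**64 + 13  # a prime just above 2^64

def decorrelation_module(x: int, a: int, b: int) -> int:
    # Affine map y = (a*x + b) mod P, truncated to 64 bits. With (a, b)
    # taken from the key, any pair of distinct inputs maps to an almost
    # uniform pair of outputs - the property the first-order
    # differential proofs rest on.
    return ((a * x + b) % P) % 2**64

# Stand-in key material, for illustration only:
a, b = 0x0123456789ABCDEF, 0xFEDCBA9876543210
print(hex(decorrelation_module(0xDEADBEEF, a, b)))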
Kazukuni Kobara next discussed a very theoretical paper on
pseudorandomness. After her talk somebody had to ask her how this
work relates to E2, to which she answered, as if just remembering,
that E2 does indeed achieve optimal pseudorandomness after two
rounds. It is interesting to observe how cultural characteristics
have an impact on this international process. The designers of the
Asian candidates (E2 and Crypton) are really far too modest and
this, I think, hurts their chances. American designers, of course,
are the most crass, and I think this is a winning strategy.
Next, James Massey explained why Safer+ diffusion is optimal.
Safer+ is a mathematician's kind of cipher, and its weaknesses were
discovered in the key schedule, precisely the place where math
does not help a lot.
RSA's Scott Contini quantified the differential properties of
data-dependent rotations followed by multiplications as used in
MARS and RC6 - a technique that appears to be powerful and is also
under the cloud of an RSA patent. In general, I am in favor of
patents but not in favor of being able to patent such broad and
obvious ideas as using data-dependent rotations in ciphers; after
all, this is an available machine instruction. My very first cipher
design, GodSave, done several years ago, uses data-dependent
rotations like crazy, and I had no idea that RSA was using them
too.
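Since I brought it up, here is the data-dependent rotation idiom in a
small RC6-style Python fragment (an illustration of mine, not a
faithful RC6 round):

MASK = 0xFFFFFFFF  # 32-bit words

def rotl32(x: int, r: int) -> int:
    r &= 31
    return ((x << r) | (x >> (32 - r))) & MASK

def rc6_style_mix(a: int, b: int) -> int:
    # The quadratic t = b*(2b+1) mixes b's bits well; after a fixed
    # rotation, its low five bits select how far to rotate a.
    t = rotl32((b * (2 * b + 1)) & MASK, 5)
    return rotl32(a ^ t, t & 31)  # rotation amount depends on the data

print(hex(rc6_style_mix(0x01234567, 0x89ABCDEF)))

The attraction is that the rotation amount is itself computed from
the data, which is exactly the property Contini's paper quantifies.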
Finally, Doug Whiting described some new results on Twofish. Even
better speeds and better implementation flexibility (even though
it appears that the very fastest code is self-modifying - this may
look "unpure" to some but I think that it is a valid optimization
technique). Even more carefully crafted cryptanalytic results were
given.
Next came lunch. Today I sat with a guy who works with a large
organization that builds encryption hardware for the government!
Sorry, not NSA; I never met any of these guys even though they
were there - they appeared to be playing No Such Agency. I
certainly would have enjoyed talking to somebody from the NSA. No,
my table companion worked for a private company, and the government
in question is the UK's. I asked him why government ciphers are
still implemented in hardware rather than in software running on
dedicated single-chip computers. Apparently speed is not the answer
- very high
security communication is really low-bandwidth. I didn't find his
answers very convincing: he said that an alpha particle from space
can knock out a transistor in the chip and modify the cipher. Then
again, it is a trivial matter to have not one but many CPUs
computing the same cipher in parallel to guard against such events.
Also, he claimed that software correctness was more costly to
verify than a hardware design. Anyhow, I suspect that for all the
mathematical image of cryptography, tradition and the inertia of
ideas do play an important role.
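For what it's worth, the redundancy argument is easy to sketch in
Python - encrypt() below is a toy stand-in, not any real cipher -
with a majority vote over several independent computations:

from collections import Counter

def encrypt(key: bytes, block: bytes) -> bytes:
    # Placeholder "cipher" (XOR with the key), standing in for
    # whatever the hardware actually implements.
    return bytes(b ^ k for b, k in zip(block, key))

def redundant_encrypt(key: bytes, block: bytes, units: int = 3) -> bytes:
    # Each call models one independent CPU computing the same cipher;
    # a transient fault in one unit is outvoted by the others.
    results = Counter(encrypt(key, block) for _ in range(units))
    value, votes = results.most_common(1)[0]
    if votes <= units // 2:
        raise RuntimeError("no majority: suspected hardware fault")
    return value

print(redundant_encrypt(b"\x13" * 8, b"8 bytes!").hex())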
Later, I happened to speak to one of the top cryptographers of the
world and I asked him one of my favourite questions: First assume
two very good ciphers A and B, say two of the better AES
candidates. If your life depended on it and you had to choose
between A.A (doubling A's rounds), B.B (doubling B's rounds) or
A.B (sequentially executing both ciphers), what would you do? The
standard answer is that mixing ciphers is a bad idea: the result
could be weaker than either cipher alone, and it would be a
completely new cipher requiring fresh analysis, which is the only
way to gain trust, etc.
Well my companion's answer was "A.B", and he gave precisely the
reasons I expected: a future attack that may work against A could
very well stop at B's gate.
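The A.B construction itself is trivial to write down; here is a toy
Python sketch with two stand-in ciphers (the essential point being
the two independent keys):

def cipher_a(key: bytes, block: bytes) -> bytes:
    # Toy stand-in for candidate A: bytewise addition with the key.
    return bytes((x + k) & 0xFF for x, k in zip(block, key))

def cipher_b(key: bytes, block: bytes) -> bytes:
    # Toy stand-in for candidate B: bytewise XOR with the key.
    return bytes(x ^ k for x, k in zip(block, key))

def cascade_encrypt(key_a: bytes, key_b: bytes, block: bytes) -> bytes:
    # A.B: encrypt with A, then with B, under independent keys; an
    # attack that breaks A must still get past B.
    return cipher_b(key_b, cipher_a(key_a, block))

print(cascade_encrypt(b"\x01" * 8, b"\x02" * 8, b"plain-8b").hex())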
After lunch came a long session with open discussions. Supposedly
candidates would slug it out between themselves while seated in
front of the audience - you know the Christians and lions metaphor
that NIST's Miles had used to explain the choice of Rome for the
second AES conference. Well, to play it safe I sat at the very
back of the room close to the exit. Thankfully, nothing dramatic
happened. A few designers talked for a few minutes defending their
ciphers, and that was it. By the way, all candidate ciphers, with
the exception of HPC and LOKI97, were represented at this
conference.
Then came a long discussion about intellectual property (IP) issues.
Miles really seemed unhappy about the whole situation. Here is the
problem: All candidates have renounced their property rights on
their own ciphers if they are declared winners. This was a
condition to enter the competition. So far so good, but even in
the first conference the issue was raised about what would happen
if a losing candidate claimed IP rights on some part of the
winning algorithm: this certainly could result in a very messy
litigious situation that would defeat one of the basic ideas
behind the AES: that it should be a global royalty-free algorithm.
A few weeks back NIST sent all candidates an informal question to
gauge where they stood on this matter. Several of them
(CAST-256, Crypton, Deal, Frog, LOKI97, Rijndael, Serpent and
Twofish) gave a clear positive answer. The others (DFC, E2, HPC,
Magenta, MARS, RC6 and Safer) were not that forthcoming. The
situation is not as bad as it looks. For example, one of the
favorite ciphers, RC6, is only asking for some kind of honorific
recognition if some of its IP is used by the AES winner. Another
favorite, IBM's MARS, had a more convoluted position. They claimed
that NIST's clarification question was not very clear itself and
that it literally said something very different - hence their
convoluted answer, which struck me as somewhat ambiguous too.
Overall, my feeling is that NIST will apply gentle arm
twisting behind the scenes before announcing the finalists for the
second round and that it will succeed in getting all the important
designers to agree to play it clean.
After that a lively discussion followed with the participation of
many people from the audience. The main themes:
Smart cards. Several people argued that the importance given to
implementation issues of the AES on a low-end smart card was not
reasonable. The standard argument in favor of very tight
implementations on smart cards was that there are cases where
millions of smart cards are needed where even a small price
difference has a large impact. Against that, somebody from the
audience presented the following strong argument: cases where the
AES is going to be used in a smart card are cases where you want
to protect information that is valuable - otherwise you don't
really need a 128-bit key and can use any of the well-known
medium-range ciphers. If you are protecting information that is valuable,
then spending one dollar more for each card is a reasonable
investment. Miles explained that originally NIST had not intended
to require implementations on 8-bit platforms, but the initial
comments supplied by the public were strongly in favor of NIST
adding smart card flexibility to the AES. So, even though it
will be judged positively if a candidate does operate efficiently
on a smart card, he felt that the whole issue was becoming
overrated. My feeling is that the DPA (power analysis) attacks against smart cards
have really confused the situation here - possibly security for
smart cards will have to follow completely novel design paradigms,
maybe even hardware designs. (At the coffee break I approached a
small group of people that included Adi Shamir, who had talked
about power analysis attacks the previous day. The normal
implementation of Generalized Frog takes the secret key and
compiles it to actual code that can then be loaded in a smart
card. So I asked Adi if a situation where each smart card actually
holds a different program processing only data would be an
improvement. As expected the answer is no, because one of the
easiest things to read from a card is the "signature" of the
instructions being executed. My other idea - to have a card hold a
string of keys that will each be used only once - might work with
some applications, such as high-security home banking. On the
whole, though, the situation with smart cards is considered hopeless.)
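Here is a rough Python sketch of that one-time key string idea -
purely illustrative, since a real card would erase each used key in
tamper-resistant storage:

import os

class OneTimeKeyCard:
    def __init__(self, keys):
        self._keys = list(keys)

    def next_key(self) -> bytes:
        # Hand out each key exactly once; a power-analysis trace of
        # one transaction then never exposes a key used again later.
        if not self._keys:
            raise RuntimeError("key string exhausted - reissue the card")
        return self._keys.pop(0)  # in hardware: overwrite the slot

card = OneTimeKeyCard(os.urandom(16) for _ in range(3))
for _ in range(3):
    print(card.next_key().hex())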
Another big theme was to have not one but several candidates
declared AES. The previous day Miles had explained the main ideas
of Don Johnson's paper (Johnson unfortunately was not present) and
Bruce Schneier had presented a passionate case in favor of a
unique AES. Today several participants spoke in favor of more than
one AES (the main arguments were insurance against a weakness
being found in the main winner and allowing for more specialized
ciphers); I think these arguments won the day. Miles said that
NIST would very carefully consider the advantages that more than
one winner would represent. Having more than one NIST approved
cipher does have another advantage nobody mentioned: it would
quiet those paranoid minds that might suspect that the AES winner
could be an NSA-implanted mole.
The question of number of rounds came up. NIST said it would
consider a cipher secure if it could not be broken with 2^64
chosen texts. A thoughtful counter-argument was that different
amounts of analysis result in different apparent minimal secure
rounds. For example, complicated ciphers have an advantage here,
because simpler ciphers allow for more elegant analysis and
therefore attract more cryptanalysis. This gave me the idea that
instead of trying to normalize security (a difficult proposition)
and then comparing speeds, a better methodology could be to
normalize speed (best assembly implementation on one or two
reference platforms) and then compare security. After all,
optimized speed on a given platform gives a good idea about an
algorithm's workload. Also, speed could be normalized at such a
high level that ALL competing ciphers would be pushed into
insecurity, which would then allow more or less reasonable
quantitative comparisons to be made about their relative
"structural" strength. It is clearly too late to use this
methodology in the first round but maybe it could be used in the
second. I feel like sending NIST an official comment to this
effect. I wonder what the reader thinks about this?
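As a back-of-the-envelope illustration of the normalization in
Python (the cycles-per-round figures below are made up, not
measurements): fix a cycle budget per block on the reference
platform and see how many rounds each cipher can afford; security
would then be compared at equal cost.

# Hypothetical per-round costs on an assumed reference platform:
CYCLES_PER_ROUND = {"cipher_x": 20, "cipher_y": 45, "cipher_z": 90}
BUDGET = 720  # assumed cycle budget per block

for name, cost in CYCLES_PER_ROUND.items():
    print(f"{name}: {BUDGET // cost} rounds affordable within the budget")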
At the very end of the session Miles explained what the future of
the AES competition looks like:
Comment period for Round I ends April 15. So if the reader wants
to influence the process, the time to send in a comment is now.
The email address for that is aesfirstround@nist.gov
April 19, the official comments will be published.
May 15, NIST will accept "tweaks" on the algorithms. NIST is the
final judge about what qualifies for a tweak.
Mid-summer, probably by July '99, the finalists will be announced
(probably no more than five, but possibly more). At the same time
the justification for their choice will be given. At that point,
Round 2 begins.
Now the finalists may submit new optimized code for any platforms
they feel like.
CD-ROMs will be distributed 2-3 months after the start of Round 2
with the new code and most relevant information about the AES
process.
Miles expressed the hope that in Round 2 more serious
cryptanalytic work will be presented.
Efficiency will now be tested also for the 192 and 256 bit
variants.
Hardware performance estimates for all finalists will be supplied
by the NSA.
IP issues will be clarified.
January 15, 2000 is closing day for presenting papers for AES3.
Week of April 10, 2000, AES3 will be held in New York.
May 15, 2000, the comment period of Round 2 closes.
Approximately August 2000 the winner(s) are announced. The "(s)"
was on NIST's slide.
So who will the winner(s) be? No idea. Many people expressed the
opinion that this may indeed not be a fully technical choice. For
example, it is difficult to imagine that the main AES, chosen by
NIST for the US government, could be a foreign cipher. (By the way,
this makes another point in favor of more than one winner. Jacques
Stern did mention something rather cryptic about a possible European
cipher competition.)
Who will make it to the second round? I do feel there is some general
consensus. Here are my choices:
RC6 and MARS will make it for their pedigree, not to mention the
fact that they are fine ciphers with probably a lot of work
invested in them.
Rijndael should make it on merit alone.
After that, things become somewhat less clear.
Serpent is a favorite. It is considered a very conservative
design and everybody agrees that security is indeed the
preeminent concern. Its designers have some of the sharpest
minds in the field, and I think NIST will choose Serpent just
to keep Shamir and Biham in the competition, not to mention to
motivate them to attack everybody else.
Twofish is all over the place; it would be cruel to kill it off
so soon. Also it is a very valiant effort and certainly it is a
fine cipher. Extracting the full key out of a smart card with
Twofish is widely considered an implementation issue rather
than a weakness of the cipher itself. If you really want to be
logical here, this makes no sense. On the other hand, if you
kicked Twofish out for this reason alone, there would be a
bloodbath, with most of the well-regarded candidates being
thrown overboard too. Twofish does have some good pedigree (it
came out of Blowfish), is fast and flexible on many platforms
and appears to be strong (results to date are not really
significant).
Finally, as a nod to Asia, I think NIST will pick E2 too. It is
certainly a very fine cipher considering its analysis, its
theoretical underpinning and its speed. The Japanese presenters
are probably the most likable, and choosing E2 would include
some ladies in the finalist makeup. Also, nobody wants to miss
the nice accents.
So here is my guess: MARS, RC6, Serpent and Twofish from the US,
Rijndael from the EU and E2 from Japan.
-30-
--------------------------
<First Response on sci.crypt>
Subject: Re: Live from the Second AES Conference
Date: 24 Mar 1999 16:14:17 +0000
From: Alan Braggins <armb@ncipher.com>
Organization: nCipher Corporation
Newsgroups: sci.crypt
dianelos@tecapro.com writes:
> it appears that the very fastest code is self-modifying - this may
> look "unpure" to some but I think that it is a valid optimization
> technique).
[...]
> improvement. As expected the answer is no, because one of the
> easiest things to read from a card is the "signature" of the
> instructions being executed.
Any ideas whether self-modifying code would make reading instruction
signatures easier or more difficult?
-----
Vin McLellan + The Privacy Guild + <vin@shore.net>
53 Nichols St., Chelsea, MA 02150 USA <617> 884-5548
-- <@><@> --