Recommended key size for life long key

Robert J. Hansen rjh at sixdemonbag.org
Sun Sep 8 17:07:21 CEST 2013


On 9/8/2013 4:32 AM, Ole Tange wrote:
> The short answer: You do not have to trust projection to use the 
> other findings. If you have a better projection, use that instead.

I do, actually.  If a major part of your write-up is seriously lacking
in rigor, that leads me to suspect the rest of it is similarly lacking.

> this is only a guess'), but simply to give an estimate on what key 
> size we cannot given our knowledge _today_ say will be broken for 
> sure.

We can't be sure 2048-bit keys will be broken by 2100.  Likewise, it's
within the realm of possibility that 4096-bit keys will be broken
tomorrow.

Factoring and discrete-log technology has stalled out for the last
20-odd years, after some very promising periods in the late '80s and
early '90s.  The dominant technology used today is the General Number
Field Sieve, which was first developed around 1993.

This shouldn't really surprise us.  Factoring is *hard*.  It's in NP
(though not known to be NP-complete, so even P!=NP wouldn't guarantee
its intractability), and we've been looking for efficient ways to
factor [1] ever since Eratosthenes; it is quite likely there simply
isn't one.
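
For a sense of how slowly that wall erodes, here is a rough
back-of-envelope sketch (mine, not from the write-up) of the GNFS
heuristic running time L[1/3, (64/9)^(1/3)], with the o(1) term
dropped -- so only the relative growth between key sizes means
anything, not the absolute numbers:

    # Heuristic GNFS work factor, o(1) term dropped; Python 3.
    import math

    def gnfs_log2_cost(bits):
        ln_n = bits * math.log(2)           # ln of an n-bit modulus
        c = (64 / 9) ** (1 / 3)             # the GNFS constant
        work = math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))
        return math.log2(work)              # express as "bits of work"

    for bits in (1024, 2048, 3072, 4096, 10240):
        print(bits, round(gnfs_log2_cost(bits)))

The exact values mean little; the point is that twenty years of GNFS
refinements have chipped away at the constants, not at the shape of
that curve.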

So on the one hand, focusing on advances in factoring technology is
suspect.  And on the other hand, you're completely ignoring the
possibility of vast advances in other areas of number theory.  Some
years ago Dan Boneh published a paper that rocked a lot of people's
worlds to the core: he gave strong evidence that *breaking RSA may not
be equivalent to factoring* [2].  The RSA Conjecture ("breaking RSA is
equivalent to factoring") may well be *false*.  Wow.  The long-term
security of RSA is predicated on the RSA Conjecture -- on the idea that
there's no other way to attack RSA than by factoring -- and along comes
Dan Boneh and opens the door to the possibility that there exist many
other mathematical ways to attack RSA.  Some of them will undoubtedly
be worse than factoring.  Some of them may be better.  It's possible
one of them might even be in P.  And how do we project for the future
when we cannot predict if, or when, one of these Boneh approaches will
be developed?
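
To pin down the terms: the RSA Problem is recovering m from (n, e, c),
and factoring n is one way to solve it -- the question Boneh raised is
whether it is the only way.  A toy, deliberately insecure sketch of my
own, just to make the distinction concrete:

    # Textbook RSA with toy numbers, purely to pin down terms: the
    # "RSA problem" is recovering m from (n, e, c); factoring n is one
    # way to solve it, but it is a different problem.
    p, q = 61, 53            # toy primes -- never use sizes like this
    n = p * q
    e = 17
    m = 42
    c = pow(m, e, n)         # the RSA problem: recover m from (n, e, c)

    # An attacker who can factor n certainly wins:
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)      # modular inverse (needs Python 3.8+)
    assert pow(c, d, n) == m # plaintext recovered via the factors

A "Boneh approach" would be anything that goes from (n, e, c) to m
without ever learning p and q.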

I am not even scratching the surface of the difficulties involved in
long-term prediction.  I know exactly where my limitations lie in making
long-term predictions.  I doubt you have a better handle on the future
than I do -- and given that your write-up doesn't even address the
factors that make long-term prediction difficult, I have no confidence
whatsoever in your 87-year projection.

[1] An efficient *classical* algorithm for it.  Although factoring is in
NP, it's also in BQP, which means efficient quantum algorithms exist.

[2] http://crypto.stanford.edu/~dabo/abstracts/no_rsa_red.html

Further:

By 2100, I expect some kind of quantum computation to exist.  87 years
is a long time for technology: it took us fewer than seventy years to go
from the first airplane flight to Armstrong's first steps on the Moon.
It took fewer than thirty-five years to go from ENIAC to the Apple II.

Quantum computing, if and when it appears at a large scale (hundreds
of qubits or more in an ensemble), will transform our society in ways
that are hard to imagine.  It is literally science fiction technology.
Right now, two of the few things we know for a fact are that quantum
computers will be able to efficiently factor large composites and to
compute discrete logarithms in finite fields.  That means RSA and
Elgamal are both out the window the instant large-scale quantum
computing becomes a reality.
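
For anyone who hasn't walked through why: Shor's algorithm uses the
quantum computer only to find the multiplicative order r of a random a
mod n; a purely classical step then turns r into factors.  A sketch of
that classical step, with brute force standing in for the quantum
subroutine (so it only works for toy moduli):

    # The classical half of Shor's algorithm: given the multiplicative
    # order r of a mod n, gcd(a**(r//2) - 1, n) usually yields a
    # factor.  The quantum part only supplies r; brute force stands in
    # for it here.
    import math, random

    def multiplicative_order(a, n):      # stand-in for the quantum step
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical_step(n):
        while True:
            a = random.randrange(2, n - 1)
            g = math.gcd(a, n)
            if g > 1:                    # lucky: a shares a factor with n
                return g
            r = multiplicative_order(a, n)
            if r % 2 == 0:
                g = math.gcd(pow(a, r // 2, n) - 1, n)
                if 1 < g < n:
                    return g

    print(shor_classical_step(3127))     # 3127 = 53 * 59

The same period-finding core handles discrete logarithms as well, which
is why Elgamal falls right alongside RSA.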

Your write-up barely mentions the possibility of quantum computing, and
says nothing of the consequences should it come to pass.  Even just an
"I arbitrarily project that quantum computing will become possible in
2050 and mainstream by 2060" would be better, because at least you've
drawn a line on the map and said "beyond 2060 there be dragons, and the
world will be radically unpredictable."

What do I mean by "radically unpredictable"?

Imagine that you're an academic in the early 1930s, and you're working
on an estimate of how much computation humanity has done.  You bury
yourself in studies of how many computers (remember, in the 1930s,
"computer" was an occupation) there are today, how much growth there has
been for the field, how many there were in the past, and you project
dramatic exponential growth for the profession of computing.  Then along
comes ENIAC which, in the first fifteen minutes of its operation, did
more computing than had been performed in the whole of human history up
to that point.  All of your estimates are immediately null and void
because the future has bushwhacked you.

*The very first quantum computer with more than two hundred qubits will,
in its very first calculation, do more computation than has been done by
all the world's computers from 1945 to the present.*

Anyone who attempts to predict the future of computing past the
introduction of quantum elements is either (a) a science fiction author
or (b) living in sin.

Your write-up also doesn't mention thermodynamic limits on computing.
The Landauer bound and the Margolus-Levitin limit are well-known
physical constraints on the minimum energy it takes to flip a bit and
on how fast a bit can be flipped at a given energy level.  At RSA-3072
and beyond (which, per NIST, offers roughly a 128-bit security level),
a brute-force attack runs into those limits: it would have to release
an amount of heat comparable to a nuclear blast.  The only ways around
them are to reduce the effective work factor of factoring the composite
(which, as discussed above, we may simply never be able to do) or to
find completely new mathematical ways of attacking the RSA Problem
(which Boneh's result suggests probably exist, but which we have no
idea how to find -- and which, if found, might invalidate RSA as an
algorithm entirely).
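
The Landauer half of that is easy to sanity-check: at room temperature
erasing one bit costs at least kT ln 2 joules, so even crediting the
attacker with a single bit-flip per key tried, a 128-bit keyspace adds
up.  A back-of-envelope sketch (T = 300 K and the megaton conversion
are my own assumptions):

    # Minimum heat to perform 2**128 irreversible bit operations at
    # room temperature, versus one megaton of TNT (~4.184e15 J).
    import math

    k_B = 1.380649e-23                 # Boltzmann constant, J/K
    T = 300.0                          # room temperature, K (assumed)
    landauer = k_B * T * math.log(2)   # minimum J per bit erased

    joules = 2 ** 128 * landauer       # one bit flip per key, at best
    megatons = joules / 4.184e15
    print(f"{joules:.2e} J  ~= {megatons:,.0f} megatons of TNT")

That works out to roughly 1e18 joules -- a couple of hundred megatons --
which is the point: past the 128-bit level, brute force stops being an
engineering problem and becomes a physics problem.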

> Based on the guess that 10kbit has the potential of not being broken
>  within a person's life span: What problems would you experience if 
> you chose to use a 10kbit key today instead of a 4kbit key (which 
> seems to be the common choice - but which we are fairly certain will 
> be broken before 2100)? THIS (i.e. the problems by using 10kbit
> today instead of 4kbit) being the focus of the document.

Then you're putting the cart before the horse.  If you're basing that on
a *guess*, then my guess is for RSA-16k and Werner's guess is for RSA-8k
and Phil Stracchino's guess is for RSA-4k and... [3]

You first need to provide evidence that your RSA-10k projection has a
good chance of being correct in 87 years.  Once you do that, *then* it
will be time to start looking at timing data and thinking about how best
to adopt RSA-10k today.  But before you do that, your guess is no more
valid than mine, Werner's, or Phil Stracchino's.  And we could all come
up with our own timing data and talk about the need to adopt our
guesses, and the ultimate result would be an awful lot of confusion.  A
great deal of heat and very little light.


[3] These guesses are completely made up, and I'm just using the names
of random people within the community.

> So if you are focusing on the projection of the key size, help me 
> rephrase the text so you do not focus on that, because that has never
> been the intended focus of the document.

That is a necessary precondition to taking the rest of your document
seriously.  Otherwise you're left with just a table of, "hi, I did some
timings on RSA-10k and here's what I found."  It's not very useful.

> Having now addressed that question, please elaborate what other 
> errors you find.

You have not addressed those two questions.




