excessive usage of /dev/random?

Robert J. Hansen rjh at sixdemonbag.org
Sat May 2 07:54:48 CEST 2015


> One assertion (from Robert J. Hansen) implies that a "high school
> math overview of large number theory" suggests that it may well be
> reasonable to require 2400 bits of entropy to generate a 2048-bit RSA
> key.

And unreasonable, too.  I specifically said that I couldn't use it to
argue one side or the other, but rather that it illuminated the
uncertainty on both sides.  A capsule summary is below.

> The other assertion (From Peter Gutmann) says that it's not
> necessary (with a sarcastic allusion to "numerology")...

I concur with Peter's assessment that it's numerology.  :)

> 1) key generation routines for these problems need an unpredictable 
> source of entropy with which to search the space of possible values 
> to produce a proper secret key.

A 2048-bit number as used in RSA has ~2028 shannons of uncertainty,
because not every number is prime: by the prime number theorem only
about one in 710 integers near 2^1024 is prime, so each of the two
1024-bit prime factors carries roughly 1014 shannons rather than 1024.
To sort through 2028 shannons of uncertainty using the general number
field sieve (GNFS) requires approximately 2^112 work.
(*Approximately*.)  So I see an enormous disconnect between the
uncertainty of the primes and the work factor that goes into breaking
the key.
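
(For anyone who wants to check the arithmetic, here is a
back-of-envelope sketch in Python.  It drops the o(1) term in the GNFS
complexity estimate, so the work factor it prints, ~2^117, should be
read as "in the neighborhood of the commonly quoted 2^112", nothing
more.)

    import math

    # Entropy of one random 1024-bit prime.  By the prime number
    # theorem, about 1 in ln(2^1024) ~ 710 integers near 2^1024 is
    # prime, so a random 1024-bit prime carries roughly
    # 1024 - log2(1024 * ln 2) shannons of uncertainty.
    prime_bits = 1024
    prime_entropy = prime_bits - math.log2(prime_bits * math.log(2))

    # An RSA-2048 modulus is the product of two such primes.
    modulus_entropy = 2 * prime_entropy
    print(f"modulus entropy ~ {modulus_entropy:.0f} shannons")  # ~2029

    # Heuristic GNFS complexity L_N[1/3, (64/9)^(1/3)], o(1) dropped,
    # evaluated for a 2048-bit modulus N.
    ln_n = 2048 * math.log(2)
    c = (64 / 9) ** (1 / 3)
    work = c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    print(f"GNFS work ~ 2^{work / math.log(2):.0f}")  # ~2^117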

We talk about how a key has so many shannons of entropy, but the
reality is different: what it really has is an equivalent work factor.
If we reduce the uncertainty of the primes to a "mere" 112 shannons,
will that affect the work factor for the GNFS?

I don't know, and I don't trust my sense of large number theory enough
to even have a good guess.
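
One generic observation, which doesn't settle the question: if a key
is derived deterministically from an s-shannon seed, an attacker who
knows the key-generation algorithm can simply replay it over all 2^s
seeds.  So the effective strength is at most min(s, GNFS work factor),
whatever GNFS itself does with the reduced uncertainty.  A sketch
(effective_security_bits is a hypothetical helper, and 112 is just the
ballpark GNFS figure from above):

    # Replaying key generation over every possible seed costs about
    # 2^s operations, independent of any hypothetical GNFS speedup.
    def effective_security_bits(seed_entropy_bits, gnfs_bits=112.0):
        return min(seed_entropy_bits, gnfs_bits)

    print(effective_security_bits(2028))  # 112.0: GNFS is the bottleneck
    print(effective_security_bits(112))   # 112.0: exactly at the crossover
    print(effective_security_bits(100))   # 100.0: the seed is the weak point

At s = 112 the bound sits exactly at the crossover, so it says nothing
about whether GNFS itself gets cheaper -- which is the open question.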


