How difficult is it to break the OpenPGP 40 character long fingerprint?
Daniel Kahn Gillmor
dkg at fifthhorseman.net
Mon Apr 1 19:46:29 CEST 2013
On 04/01/2013 12:24 PM, adrelanos wrote:
> gpg uses only(?) 40 chars for the fingerprint.
> (I mean the output of: gpg --fingerprint --keyid-format long.)
This is a 160-bit SHA-1 digest of the public key material and the
creation date, with a bit of boilerplate for formatting.  This is not
gpg-specific; it is part of the OpenPGP specification:
https://tools.ietf.org/html/rfc4880#section-12.2
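If it helps to see it concretely, here is a minimal Python sketch of
that computation (assuming you already have the serialized public key
packet body in hand; a real implementation would parse the OpenPGP
packet first, and the packet contents here are hypothetical):

    import hashlib

    def v4_fingerprint(pubkey_packet_body: bytes) -> str:
        # Per RFC 4880 section 12.2, the v4 fingerprint is the SHA-1 of
        # the octet 0x99, a two-octet big-endian length, and then the
        # public key packet body (version, creation time, algorithm, MPIs).
        prefix = b'\x99' + len(pubkey_packet_body).to_bytes(2, 'big')
        return hashlib.sha1(prefix + pubkey_packet_body).hexdigest().upper()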
A better place to discuss issues related to OpenPGP in general is the
IETF's OpenPGP mailing list:
https://www.ietf.org/mailman/listinfo/openpgp
It is a good idea to review their archives for past discussion of
fingerprints and digest algorithms before posting, though.  Much of
what you asked has already been covered in some detail on that list.
> How difficult, i.e. how much computing power and time is required to
> create a key, which matches the very same fingerprint?
This is called a second-preimage attack. I am not aware of any
published second-preimage attacks against SHA-1's 160-bit digest that
bring the computation within tractable limits. A theoretically perfect
160-bit-long digest algorithm would require ~2^160 operations to arrive
at a particular digest.  SHA-1 is almost certainly not theoretically
perfect against this sort of attack, but no published work appears to
break it in practice.
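Just as a back-of-the-envelope illustration (the 10^12 hashes per
second rate below is an arbitrary assumption, not a statement about
any real attacker's hardware):

    # Expected brute-force work against a perfect 160-bit digest,
    # converted to years at an assumed hash rate.
    attempts = 2 ** 160                      # ~1.46e48 trial hashes
    rate = 10 ** 12                          # assumed hashes per second
    seconds_per_year = 60 * 60 * 24 * 365
    print(attempts / (rate * seconds_per_year))   # roughly 4.6e28 years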
> Isn't 40 chars a bit weak?
The underlying material is 160 bits -- it does not need to be
represented as 40 hex characters.  And if the digest algorithm were
known to be weak (e.g. if it were a simple CRC), then even fingerprints
10 times as long would not be enough.
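For example (with a made-up digest value, just to show the encodings),
the same 160 bits can be written in fewer characters without changing
its strength at all:

    import base64, hashlib

    digest = hashlib.sha1(b'example key material').digest()  # 20 bytes = 160 bits
    print(digest.hex().upper())               # 40 hex characters
    print(base64.b64encode(digest).decode())  # 28 base64 characters
    print(base64.b32encode(digest).decode())  # 32 base32 characters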
However, for the purposes of key fingerprints in particular, SHA-1
appears to be reasonable in the near term.
> Are there plans to provide a longer fingerprint which in theory can't be
> broken with computing power expected in for example 100(0) years?
For future OpenPGP drafts, there has been some discussion about moving
to a longer digest (on the IETF list i mentioned above). Those
decisions have not reached a consensus, from what i can tell.
Predicting computing power or the state of mathematics itself 100 or
1000 years into the future seems like a dubious proposition. Consider
the state of mechanical computation and mathematics 100 or 1000 years
ago. Do you think that even a skilled mathematician at the time could
have predicted where we are today?
The longevity of any public key cryptosystem should probably be
estimated in years or decades at the longest if you want any confidence
in your answer.
Regards,
--dkg