Announcing paperbackup.py to backup keys as QR codes on paper
Gerd v. Egidy
gerd.von.egidy at intra2net.com
Thu Feb 23 11:00:54 CET 2017
> The certificate (aka public key) includes all signatures, all the data
> on the keyserver. It's data you don't really need to back up since it is
> public, and it can be huge. My key.asc file is 137,424 bytes following
> your instructions.
Seems you are trusted by many more people than me ;)
> $ gpg2 --armour --output key.asc --export-options export-minimal
> --export-secret-key [KEYID]
Thank you for your explanation and recommendation. I have adapted the readme
accordingly.
> However, I'm running into a little problem here myself... GnuPG 2.1.18
> does not respect the "export-minimal" option for --export-secret-key,
> only for --export. So if you are using GnuPG 2.1, this will not work as
> expected.
> This is in all likelihood a bug in GnuPG 2.1, and I'll report it right now.
Thank you for checking and reporting this.
As it does not leave out important information, but merely adds data that is
not strictly needed, it won't hurt the affected users much. Just a few more
dead trees.
> Oh, as an aside, the advantage of paperkey is that it is
> self-describing. No matter what happens, as long as we can still use
> hexadecimal digits to describe binary content (which would be trivial to
> reimplement), we can reconstruct the binary private key file. Using QR
> codes has the disadvantage that if you cannot find a QR-code decoder for
> your platform in the future, reimplementing one is far from trivial. You
> are dependent on QR codes surviving as an actually supported data format.
What timespan are we talking about?
If we are talking decades, I have no doubt that some qrcode decoder will
still be available, even if qrcodes aren't used anymore. There are several
open source decoders available and included in Linux distributions. Stuff like
that tends to stay available for a long time: you can still download packaged
Linux distros like Red Hat Linux 1.0 (released 1995) or Debian 0.91 (released
1994) today, about 23 years afterwards.
If we are talking centuries, I'd worry about the availability of gnupg as much
as qrcodes. Both are publicly available standards, but I don't know if they
are still available and understandable by then. I'd recommend going to
plaintext on glass-etched microfiche if you really want to cover that
timespan.
> Finally, I remember something about QR codes inherently supporting
> splitting data over multiple individual code blocks, specifically for
> data that is too large to fit in a single block. I don't know if it
> supports the number of blocks you need, but you might want to check it
> out.
I know of that feature and have deliberately decided against it:
Not all decoders are capable of it, and if one qrcode is missing, the linking
is broken and you have to patch the decoder to still get some data.
I consider the plaintext linking and ordering I used to be more robust.
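To illustrate the idea of plaintext linking and ordering, here is a minimal
sketch in Python. It is not paperbackup.py's actual on-paper format, just an
illustration of the principle: each chunk carries its own index and the total
count, so any subset of recovered chunks can be placed correctly and the
missing ones identified, with no dependency between chunks.

```python
# Sketch of sequence-prefixed chunking (illustrative, not paperbackup.py's
# real format): every chunk is self-describing, so losing one qrcode only
# loses that chunk's data, and the gap is immediately visible.

def split_chunks(data: bytes, size: int) -> list:
    """Split data into hex chunks, each prefixed with '^<index>/<total> '."""
    parts = [data[i:i + size] for i in range(0, len(data), size)]
    total = len(parts)
    return ["^%d/%d %s" % (n + 1, total, p.hex()) for n, p in enumerate(parts)]

def reassemble(chunks: list) -> tuple:
    """Join whatever chunks were recovered; report the missing indices."""
    found = {}
    total = 0
    for chunk in chunks:
        header, hexdata = chunk.split(" ", 1)
        index, total = (int(x) for x in header.lstrip("^").split("/"))
        found[index] = bytes.fromhex(hexdata)
    missing = [i for i in range(1, total + 1) if i not in found]
    data = b"".join(found[i] for i in sorted(found))
    return data, missing
```

Even if a decoder returns the codes in arbitrary order, or one code is
unreadable, the rest reassembles without patching anything.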
> Also, you say large QR codes are easily damaged by wrinkles and
> deformations. Is this perhaps related to the amount of error correction
> data included? You can tune the ratio of content data to error
> correction data, making a QR code more resilient to damage.
I used the largest error correction ratio possible.
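For context, the QR standard defines four error-correction levels with
roughly these recovery capacities: L about 7%, M about 15%, Q about 25% and
H about 30% of codewords. A small sketch, assuming these commonly cited
approximate figures (the exact number of correctable codewords varies per
symbol version):

```python
# Approximate fraction of codewords each QR error-correction level can
# recover (figures commonly cited from the QR spec; approximate only).
EC_RECOVERY = {"L": 0.07, "M": 0.15, "Q": 0.25, "H": 0.30}

def max_damaged_codewords(total_codewords: int, level: str) -> int:
    """Rough upper bound on damaged codewords a symbol can tolerate."""
    return int(total_codewords * EC_RECOVERY[level])
```

Level H is the largest ratio, which is why I chose it: it trades data
capacity per code for the best tolerance against local damage.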
> However, if
> you find that it is not unreadable individual pixels but rather the
> deformation of the total that is throwing off decoders, then I suppose
> the ratio doesn't help: it either can reduce it to the required square
> or it cannot, I'd think.
I haven't studied the decoding algorithms at that level of detail. If the
deformation is irregular, I guess it affects some parts of a code more than
others. Then a higher error correction ratio will help.