Encryption File Size

Peter Lebbing peter at digitalbrains.com
Fri Feb 24 11:19:07 CET 2012

On 23/02/12 13:00, Johan Wevers wrote:
> No. The files are compressed before encrypting (after encrypting they
> should not be compressible so it has to be before) and the results
> vary.

But isn't there a worst-case overhead for the compression algorithm
used? There most likely is.

From <http://zlib.net/zlib_tech.html>:
> In the worst possible case, where the other block types would expand 
> the data, deflation falls back to stored (uncompressed) blocks.

zlib, with default settings, thus never expands the data by more than a
small, bounded amount. That web page also gives detailed figures for the
overhead.
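As a quick sketch of that worst case, Python's zlib module (the same
deflate implementation) can be fed incompressible random bytes; the
stored-block fallback keeps the expansion to a few dozen bytes:

```python
import os
import zlib

# Random bytes are effectively incompressible; deflate falls back to
# stored (uncompressed) blocks instead of expanding the data.
data = os.urandom(100_000)
compressed = zlib.compress(data)

# Per zlib_tech.html the stored-block overhead is roughly 5 bytes per
# 16 KiB block plus a 6-byte wrapper, so this stays tiny.
overhead = len(compressed) - len(data)
print(overhead)
```

The exact overhead varies with the buffer sizes involved, but it stays
far below one percent of the input.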

And *if* (big if) there isn't an acceptable worst-case overhead for a
compression algorithm, there is probably a cut-off in GnuPG, or it would
become a DoS attack vector: get someone to encrypt a specially crafted
file that will fill his filesystem when the compression algorithm is run
on it.

IIRC, there's a cut-off for /de/compression like that.
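Such a cut-off can be sketched with zlib's own `max_length` parameter,
which caps how much output `decompress()` will produce in one call. The
limit value here is made up for illustration; GnuPG's actual mechanism
is its own:

```python
import zlib

# A tiny "bomb": 1 MiB of zeros compresses down to roughly a kilobyte.
bomb = zlib.compress(b"\x00" * (1 << 20))

# Hypothetical guard: refuse to inflate past LIMIT bytes of output.
LIMIT = 64 * 1024

d = zlib.decompressobj()
out = d.decompress(bomb, LIMIT)       # max_length caps the output size
truncated = bool(d.unconsumed_tail)   # input left over => limit was hit
print(len(out), truncated)
```

A real consumer would abort (or keep draining in bounded chunks) once
`truncated` is set, instead of blindly filling the filesystem.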

Furthermore: the (compressed) plaintext is enciphered with the cipher in
a streaming mode, so the ciphertext is exactly as big as the plaintext
(after compression). But obviously there is overhead from the rest of
the OpenPGP message.
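A toy keystream cipher makes the length argument concrete. This is NOT
OpenPGP's actual CFB construction, just a made-up SHA-256 counter
keystream showing that a stream-style mode adds no length of its own:

```python
import hashlib

def toy_stream_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy keystream (illustration only, not OpenPGP's CFB): hash of
    # key || counter, XORed byte-for-byte with the plaintext.
    stream = bytearray()
    counter = 0
    while len(stream) < len(plaintext):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(p ^ k for p, k in zip(plaintext, stream))

msg = b"hello, worst-case overhead"
ct = toy_stream_encrypt(b"secret key", msg)
print(len(ct) == len(msg))  # → True: ciphertext as long as plaintext
```

Since encryption is a plain XOR against the keystream, running the same
function over the ciphertext decrypts it again.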

And if the size of the plaintext is not known beforehand, the data is
split into partial-length blocks, each with its own small header, in the
OpenPGP message. At least, I believe that is the case; I didn't check
just now.
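For reference, RFC 4880 (section 4.2.2.4) encodes each partial body
length in a single octet, so the per-block header cost is one byte. A
minimal decoder for that octet:

```python
def partial_body_length(octet: int) -> int:
    # RFC 4880 section 4.2.2.4: a first octet in [224, 254] encodes a
    # partial body length of 1 << (octet & 0x1F), i.e. a power of two
    # from 1 byte up to 2**30 bytes.
    if not 224 <= octet <= 254:
        raise ValueError("not a partial body length octet")
    return 1 << (octet & 0x1F)

print(partial_body_length(0xEA))  # → 1024
```

With chunks that can be up to a gibibyte each, the extra header bytes
are negligible for large files.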

The total overhead is small for big files, though.


I use the GNU Privacy Guard (GnuPG) in combination with Enigmail.
You can send me encrypted mail if you want some privacy.
My key is available at http://wwwhome.cs.utwente.nl/~lebbing/pubkey.txt
