Decrypt problem with large file

Ryan Malayter malayter at gmail.com
Tue Dec 4 17:03:43 CET 2007


On Nov 26, 2007 3:58 AM, Thomas Pries <thomas-pries at web.de> wrote:

> I realized that I have lost my data :-(.

Our solution for backup encryption has been to use 7-Zip, since it
encrypts faster and supports segmentation, per-file checksumming, and
other useful backup-oriented features.

What our scripts do is the following (a rough sketch in code follows
the list):

1) Generate a random hex symmetric key in memory.
2) Pipe that key to GnuPG, which encrypts it (as ASCII-armored text)
into a small key file on our destination disk.
3) Use 7-Zip with 2 GB file splits and the random symmetric key to
compress and encrypt the backup files in .7z format from the source to
the destination disk. We use the lowest (fastest) compression setting,
and the 2 GB file splits because reading and writing 4+ GB files is
slow on NTFS and many other file systems. This is why VMware et al.
use 2 GB file splits by default.
4) Pad most of the remaining disk space with PAR2 recovery files, for
extra protection against bad disk blocks. We use a very large block
size for par2, something like 128 MB, IIRC.
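
For illustration, here is a rough Python sketch of those four steps.
It assumes gpg, 7z, and par2 are on the PATH; the paths, recipient key
ID, and file names are placeholder assumptions, not our actual script:

    #!/usr/bin/env python3
    # Rough sketch of the four steps above. SOURCE, DEST, and
    # RECIPIENT are placeholder assumptions, not real values.
    import glob
    import secrets
    import subprocess

    SOURCE = "/data"                  # what gets backed up
    DEST = "/mnt/backup"              # removable HDD mount point
    RECIPIENT = "backup@example.com"  # GnuPG public key to encrypt to

    # 1) Random 256-bit hex key, held in memory only.
    key = secrets.token_hex(32)

    # 2) Pipe the key into GnuPG; only the encrypted key file
    #    touches the destination disk.
    subprocess.run(
        ["gpg", "--encrypt", "--armor", "--recipient", RECIPIENT,
         "--output", f"{DEST}/keyfile.asc"],
        input=key.encode(), check=True)

    # 3) 7-Zip: fastest compression (-mx=1), password (-p),
    #    2 GB volume splits (-v2g). Note that -p on the command
    #    line exposes the key to local process listings.
    subprocess.run(
        ["7z", "a", "-mx=1", "-v2g", f"-p{key}",
         f"{DEST}/backup.7z", SOURCE], check=True)

    # 4) PAR2 recovery data over the split volumes, 128 MB blocks.
    volumes = sorted(glob.glob(f"{DEST}/backup.7z.*"))
    subprocess.run(
        ["par2", "create", "-s134217728", f"{DEST}/backup.par2"]
        + volumes, check=True)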

We do over 1 TB of backups per night to removable HDDs with this
setup, and have never had a restore fail. We've never even had to use
the par2 files in a real-world restore, but we do test "bad media"
scenarios with them by deleting one of the 7z split files and using
par2 to recreate it (a two-line drill, sketched below).
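
That drill looks roughly like this, with the same placeholder paths
as above:

    import os
    import subprocess

    DEST = "/mnt/backup"  # placeholder, as above

    # Simulate bad media: delete one split volume, then let par2
    # rebuild it from the recovery blocks on the same disk.
    os.remove(f"{DEST}/backup.7z.002")
    subprocess.run(["par2", "repair", f"{DEST}/backup.par2"],
                   check=True)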

Backups aren't worth much unless you test-restore them to be sure
they will work. We test all of ours weekly, roughly as sketched
below.
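
The weekly test amounts to recovering the symmetric key with GnuPG
and letting 7-Zip check every file against its stored checksum;
something like:

    import subprocess

    DEST = "/mnt/backup"  # placeholder, as above

    # Recover the symmetric key (gpg-agent prompts for the
    # private-key passphrase).
    key = subprocess.run(
        ["gpg", "--decrypt", f"{DEST}/keyfile.asc"],
        capture_output=True, check=True).stdout.decode().strip()

    # "7z t" opens the split set via the first volume and tests
    # each file against its stored CRC.
    subprocess.run(["7z", "t", f"-p{key}", f"{DEST}/backup.7z.001"],
                   check=True)

A full extract ("7z x" to scratch space) is the stronger version of
the same test.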

As a side note, we looked into the new encryption options in the
latest version of Symantec NetBackup, but we don't have the budget
for that upgrade just yet. It would be nice to have it all in one
step (even though NetBackup is closed source, so trusting the vendor
is an obvious issue).
-- 
RPM


