Encrypted large files can't decrypt
dshaw at jabberwocky.com
Sat Feb 25 18:12:17 CET 2012
On Feb 24, 2012, at 7:21 PM, Astrid Staufer wrote:
> On a server I encrypt a backup with the following command and send it to
> another server over FTP:
> "tar -czf - $mysql_backup_file $directory_to_backup | gpg --no-tty --batch
> --always-trust --recipient $id_number --encrypt | curl --netrc-optional
> --silent --show-error --ftp-create-dirs --retry 10 -u $ftp_user:$ftp_pwd
> ftp://$ftp_host/$ftp_dir$full_backup_file -T -"
> This works with some backups of around 35 MB. But I have also tested it
> with a 2.79 GB backup file. It runs to the end, but the final encrypted
> file is corrupt. When I try to decrypt it, it says, "no encrypted data
Since small files work and 2.79 GB doesn't, one thing to check is whether your whole pipeline (including the remote FTP server) can handle large files, i.e. files greater than 2^31 bytes. That limit would manifest itself as things starting to break at the 2 GB mark.
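One way to narrow that down (a sketch, not something from the thread itself): run the tar/gpg stages to a local file first, so the encrypted output can be size-checked and decrypted before curl and the remote ftpd are involved. The variable names ($mysql_backup_file, $directory_to_backup, $id_number) are assumed from the quoted script; `stat -c %s` is the GNU form.

```shell
#!/bin/sh
# Sketch: isolate the gpg stage by writing its output locally, then
# check whether the result crosses the 2^31-byte boundary before
# involving curl and the remote FTP server.
# $mysql_backup_file, $directory_to_backup, $id_number are assumed
# to be set as in the quoted backup script.

LIMIT=$((2 ** 31))   # 2 GiB: where a 32-bit off_t overflows

tar -czf - "$mysql_backup_file" "$directory_to_backup" \
  | gpg --no-tty --batch --always-trust \
        --recipient "$id_number" --encrypt > backup.gpg

size=$(stat -c %s backup.gpg)   # GNU stat; on BSD use: stat -f %z
if [ "$size" -gt "$LIMIT" ]; then
    echo "encrypted output is $size bytes (> 2^31);"
    echo "every later stage (curl, the remote ftpd, its filesystem)"
    echo "must be large-file capable"
fi

# If this local copy decrypts cleanly, the corruption happens during
# the transfer, not in tar or gpg:
gpg --batch --decrypt backup.gpg > /dev/null && echo "local decrypt OK"
```

If the local copy decrypts fine but the uploaded one doesn't, comparing the local and remote file sizes is the quickest way to confirm a truncation at the 2 GB boundary.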