Backing up to single archive files on S3: missing .tar.gz extensions

#1 Wed, 10/19/2016 - 13:13
itmustbe

I want to double-check that I'm getting this right. After studying the Backup and Restoration documentation carefully, I still feel something may be amiss, and it's important to me not to mess up my backups, only to discover when I need them later that I did something wrong!

I want to back up to a single archive file, because that saves me requests to S3 (I have a lot of virtual servers) and keeps me within their Free Plan limits (plus, if anything ever happened, I'd want to restore all the domains on my server anyway).

First, though, I tried backing up to one file per server (instead of a single archive file), and everything went as expected, with a bunch of .tar.gz files created corresponding to my virtual servers.

But when I try to back up to a single archive, I get extension-less backups: the backup process reports that it completes successfully, but the three files it creates do not have .tar.gz appended. I found various forum threads here that mention this, but no solutions.

I also realized (thanks to a Virtualmin error message while experimenting) that when backing up to a single file on S3, one must give that file a name.

So I went back to my backups and set %Y-%m-%d.tar.gz as the filename (after my S3 bucket and a trailing /); before, I just had my-S3-bucket/%Y-%m-%d with no extension added.
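For anyone else checking this, a quick way to see what a strftime-style destination like that expands to is to run the same pattern through date(1). This is illustrative only; "my-bucket" below is a placeholder, not my real bucket name:

```shell
# Illustrative: expand the same strftime-style pattern that Virtualmin
# substitutes into the destination name ("my-bucket" is a placeholder).
dest=$(date +'my-bucket/%Y-%m-%d.tar.gz')
echo "$dest"   # e.g. my-bucket/2016-10-19.tar.gz
```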

I ran the backups again, and this seemed to work, in that the three files were named correctly with .tar.gz in the filenames. But I was suspicious about having added the .tar.gz extension manually, so I downloaded the main .tar.gz file, and my Mac opened it fine, yielding a vast number of entries ending in _web, _dir, _virtualmin, and such (though none of those files had extensions either, except for the SQL files, which have .gz appended, and I don't know whether they should or not).
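Incidentally, rather than relying on the Mac's unarchiver, the download's integrity can be checked from the command line without extracting anything. This is just a sketch: "backup.tar.gz" stands in for whatever the downloaded file is called, and the dummy archive is created only so the commands run end-to-end:

```shell
# Sketch: verify a downloaded backup archive without extracting it.
# "backup.tar.gz" is a stand-in name; a tiny dummy archive is created
# here so the verification commands have something to run against.
echo "hello" > example.txt
tar -czf backup.tar.gz example.txt

# gzip -t tests the compressed stream's integrity (silent on success)...
gzip -t backup.tar.gz && echo "gzip stream OK"

# ...and tar -tzf lists the archive's members without writing to disk.
tar -tzf backup.tar.gz
```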

Am I doing something wrong? I've tried every which way to get a single archive file over to S3 without actually adding the .tar.gz extension in the Bucket and Path field, but to no avail. As I said, I'm just afraid that if I do something wrong now, I'll live to regret it if I ever need these backups!

P.S. The only additional option I have checked is Do strftime-style time substitutions on file or directory name (I tried this process with and without the Create destination directory? option checked, but since I'm not actually creating a directory, just naming a file, it made no difference either way).