1gb backup takes 12+ hours

#1 Mon, 04/22/2013 - 16:38
paul.kelly

1gb backup takes 12+ hours

I do a weekly full backup and then daily incrementals.

The full backup ran last night and took over 12 hours to back up 1GB of data (to Amazon S3).

How can I track the delay/issue in the long backup process?

The confirmation email indicates about 7 hours of backing up, but even that sounds like a lot!

Is there a backup log?

Mon, 04/22/2013 - 17:06
andreychek

Howdy,

You can see the backup log in Backup and Restore -> Backup Logs.

Gzip is the default compression method used, but if that was changed to bzip2, that can add a lot of time to the backup.
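
If you want a rough sense of the difference on your own data, a quick comparison like this shows it (just a sketch; /tmp/sample.tar and the /home/example path are placeholders, not anything Virtualmin creates):

# Build a sample tar archive of one domain's home directory (path is only an example).
tar -cf /tmp/sample.tar /home/example

# Time gzip compression of that archive.
time gzip -c /tmp/sample.tar > /tmp/sample.tar.gz

# Time bzip2 compression of the same archive for comparison.
time bzip2 -c /tmp/sample.tar > /tmp/sample.tar.bz2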

Also, if there are any directories being backed up that contain a large number of files (over 5000), that can cause a significant slowdown.
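
A quick way to check a single directory (the path here is just an example) is:

# Count regular files directly inside one directory, without recursing into subdirectories.
find /home/example/public_html/cache -maxdepth 1 -type f | wc -l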

-Eric

Tue, 04/23/2013 - 03:14
paul.kelly

Doh! I was looking in Webmin/logs...! That log is the same as the email.

Where do you choose compression type?

I do have at least one domain that will have a lot more than 5000 files, but surely this is not uncommon with today's programs?

Also, where are the missing 5 hours?

Tue, 04/23/2013 - 07:40
andreychek

Howdy,

Well, just to clarify -- not 5000 total files, but 5000 files in one directory.

That can happen in certain temp or cache directories, for example. And 5000 isn't actually that bad, but is a sign that there's a growing number of files in a given directory... that's something to keep an eye on.

That is, if you have a temp dir with 5000 files today, it could be 100,000 files a few months from now. And that would definitely cause some slowness during the backup process.

As far as where the compression type is set -- that's in System Settings -> Virtualmin Config -> Backup and Restore.

-Eric

Tue, 04/23/2013 - 15:47
paul.kelly

Aha! Gotcha!

Is there a way I can search folders for their file counts? Googling doesn't turn up much.

Gzip is being used.

Tue, 04/23/2013 - 16:39
andreychek

You could use a script like this to find directories containing large numbers of files:

#!/bin/bash

# Print any public_html directory under /home that directly contains 5000 or more files.
find /home -type d | grep public_html | while read -r FOLDER; do
    NUM=$(find "$FOLDER" -maxdepth 1 -type f | wc -l)
    if [ "$NUM" -ge 5000 ]; then
        echo "$NUM / $FOLDER"
    fi
done

That will print out any directory containing 5000 or more files.
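
To run it, you could save it somewhere like /root/count-files.sh (the name is just an example), make it executable, and run it as root:

chmod +x /root/count-files.sh
/root/count-files.sh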

-Eric
