S3 backup fails at part 10001 every time on large backup

I have a server with around 300 GB of data to back up. Creation of the TAR archive works, but the upload to S3 fails every time with this error:

Uploading archive to Amazon's S3 service .. .. upload failed! Part 10001 failed at 52428800000 : Upload failed : HTTP/1.1 400 Bad Request

Amazon only allows 10,000 parts, so I guess this is the problem. I wanted to increase the chunk size to work around it, but I was told in the forums that this should be handled automatically by Virtualmin, which is obviously not happening here.
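Doing the math on that byte offset: 52,428,800,000 / 10,000 = 5,242,880 bytes, i.e. exactly 5 MB per part, so at that chunk size the upload can never get past roughly 50 GB. A quick arithmetic check in the shell, using only the numbers from the error above:

echo $((52428800000 / 10000))            # 5242880 bytes per part = 5 MB
echo $((5242880 * 10000 / 1073741824))   # 48 GiB ceiling before part 10001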

Please help.

Status: 
Active

Comments

Do you have a single domain which has 300 GB of data? If this is actually spread across multiple domains, you should use the backup mode that puts each domain in a separate file.

Hi Jamie, I do have several domains with over 200 GB of data. Backup mode is already set to a single file per domain. Is there any way I can force a certain chunk size for all uploads until a fix is released?

Unfortunately the chunk size is hard-coded at 5 MB currently :-(

I will make this configurable in a future release though, and I can send you a beta with that feature if you like.

Hi, I am having this issue also. Is there any resolution to this in the near future? I am running Virtualmin 4.09 GPL.

Virtualmin supports increasing the chunk size now.

To do that, you can go into System Settings -> Virtualmin Config -> Backup Settings, and there you can set "S3 upload chunk size".

Great! Can you recommend a chunk size? I think my backup is around 100 GB.

It really depends on the total size of the single largest file to upload and the bandwidth available. I think by default S3 uses 15 MB chunks, so if you do the math you get:

15 MB chunks x 10,000 parts = 150 GB total

But keep in mind that if a chunk fails to upload, the whole chunk has to be re-uploaded. So instead of one enormous chunk covering the whole 100 GB, I'd use something like a 15 MB chunk size: that keeps you well under the 10,000-part maximum, and if a chunk fails only those 15 MB have to be retried (see the quick sketch below). Hope this was clear enough.
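If you want to work it out for your own backup, the minimum workable chunk size is just the largest archive divided by 10,000 parts, rounded up, plus some headroom. A rough shell sketch; the 100 GB figure is only this thread's example, so substitute your own archive size:

ARCHIVE_MB=$((100 * 1024))            # example: a 100 GB archive
echo $((ARCHIVE_MB / 10000 + 1))      # ~11 MB minimum; 15 MB leaves comfortable headroom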

So, I am on 4.17 of Virtualmin Pro and I don't have an "S3 upload chunk size" option under System Settings -> Virtualmin Configuration -> Backup and Restore, and I am getting:

Uploading archive to Amazon's S3 service .. .. upload failed! Part 10001 failed at 52428800000 : Upload failed : HTTP/1.1 400 Bad Request

What do I do?

What I would suggest is to download and install this tool here:

https://aws.amazon.com/cli/

If that is installed, Virtualmin will use that rather than the built-in code for uploading to Amazon.
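As far as I understand it, the AWS CLI also manages multipart uploads itself and scales the chunk size up when a file would otherwise exceed the 10,000-part limit, so large archives should just work. If you ever want to pin the chunk size explicitly, the CLI has a config setting for that; the 64MB value here is only an example:

aws configure set default.s3.multipart_chunksize 64MB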

If you still have trouble at that point, my suggestion would be to create a new support request, and we can go over that with you there.

In later releases, the S3 chunk size option has moved to Backup and Restore -> Cloud Storage Providers.

Installing the AWS CLI did it for me.

apt-get install python-pip
pip install awscli
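If anyone else goes this route, it's worth confirming the CLI is actually on the PATH before kicking off another backup, since Virtualmin will only pick it up if it can find it:

aws --version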

It's 2020, I'm using Virtualmin version 6.10, and uploads of large files still fail.

Installing the AWS CLI also solved my problem.

Thank you andreychek