Note: I'm running the latest Virtualmin GPL version; I just ran a package update hoping a newer package would fix this. (Not Pro, but since I'm getting a Perl error message I thought I'd report it as a bug.)
My client hates Amazon for some reason, so instead of S3 we're using Google's S3 alternative, Google Cloud Storage. Because of S3's popularity, pretty much every cloud provider offers an S3-compatible storage service; Google's is Google Cloud Storage. Virtualmin's backup supports Google Cloud Storage, and other S3-compatible services, through the "S3-compatible server hostname" option. I've set this option to "storage.googleapis.com", which is what it needs to be. I've also opened that address in my browser, and it works just fine with no network problems. On the server itself, I wrote a small Perl test script to check whether Perl, the server's firewall, or something similar is at fault. The script works just fine: it returns a usable IO::Socket object, with no error in $!, so Perl should be able to connect to Google's S3 endpoint without trouble.
Perl script (with the print that produced the output below):

use strict;
use warnings;
use IO::Socket;
use Data::Dumper;

my $sock = new IO::Socket::INET (
    PeerAddr => 'www.ora.com',
    PeerPort => 80,
    Proto    => 'tcp'
);
die "$!" unless $sock;
print Dumper($sock);
Perl script output:

[root@web5 ~]# perl test.pl
$VAR1 = bless( *Symbol::GEN0, 'IO::Socket::INET' );
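One caveat with my test script: it connects to port 80 on an unrelated host, not the endpoint the backup actually uses. A closer check (hypothetical commands; assumes bash and curl are installed and the box allows outbound HTTPS) would hit storage.googleapis.com on port 443 directly:

```shell
# Raw TCP connection test to the exact host and port the backup module uses
# (bash's /dev/tcp pseudo-device, so no extra tools are required)
timeout 5 bash -c 'exec 3<>/dev/tcp/storage.googleapis.com/443' \
    && echo "TCP connect to storage.googleapis.com:443 OK"

# Full HTTPS request to the same endpoint; prints just the HTTP status code
curl -sS -o /dev/null -w 'HTTP status: %{http_code}\n' https://storage.googleapis.com/
```

If both of these succeed from the same shell the backup runs in, the failure is almost certainly inside the Perl HTTP stack rather than the network itself.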
Steps to reproduce:

1. Go to Backup and Restore -> Cloud Storage Providers.
2. Click Amazon S3.
3. For the 'S3-compatible server hostname' option, select the radio button next to the text box.
4. Enter "storage.googleapis.com" in the text box.
5. Click Save.
6. Create interoperability keys for Amazon S3-style connections if needed:
   1. In the Google Cloud web GUI, click the menu (three lines) -> Storage -> Settings.
   2. Click the Interoperability tab.
   3. Turn on interoperability if needed.
   4. Click "Create a new key" to create a new key.
7. Back in Virtualmin, click Backup and Restore -> Scheduled Backups -> Add a new backup schedule.
8. Fill in whatever backup settings you want for testing. My settings: Destination and Format = Amazon S3 bucket; Access Key and Secret Key copied and pasted from step 6; bucket and path = "btu-test-backup-%d-%m-%Y"; Delete old backups = yes, after 1 day; "Do strftime style time substitutions on file or directory name" checked; "Create destination directory?" checked; scheduled to run every day.
9. Click Save.
10. Manually test it: go to Backup and Restore -> Scheduled Backups and, under Actions, click "Backup" to the right of the schedule you just created.
11. Double-check the settings, scroll down, and click "Backup Now".
12. Get this error message:
Starting backup of 3 domains to Amazon S3 bucket btu-test-backup-%d-%m-%Y ..
HTTP/1.0 500 Perl execution failed
Server: MiniServ/1.890
Date: Thu, 9 Aug 2018 03:50:41 GMT
Content-type: text/html; Charset=iso-8859-1
Connection: close

Error - Perl execution failed
File does not exist: 500 Can't connect to storage.googleapis.com:443 (connect: Network is unreachable) at S3/ListBucketResponse.pm line 26
I know some Perl, and the error doesn't actually originate at that line: line 26 calls some function, XMLin() or something like that, so the connection failure presumably happens somewhere inside the code that line invokes, before there's ever a response to parse.
Again, there shouldn't be any problem with my server reaching Google's S3 endpoint: I've tested manually with nc, with wget, and with the quick-and-dirty Perl script above. So the problem shouldn't be a firewall or anything else preventing my server from connecting to Google's S3.
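For what it's worth, "Network is unreachable" from connect() usually points at a routing problem rather than a firewall, and one common cause is the hostname resolving to an IPv6 address on a machine with no IPv6 route (IO::Socket::INET in my test script is IPv4-only, so it wouldn't have hit that). These hypothetical commands (assuming getent and curl are available) would show which address family is failing:

```shell
# List every address the hostname resolves to (both IPv4 and IPv6 records)
getent ahosts storage.googleapis.com

# Try the endpoint over each address family separately;
# if only the -6 attempt fails with "Network is unreachable",
# the problem is a missing IPv6 route, not a firewall
curl -4 -sS -o /dev/null -w 'IPv4: HTTP %{http_code}\n' https://storage.googleapis.com/
curl -6 -sS -o /dev/null -w 'IPv6: HTTP %{http_code}\n' https://storage.googleapis.com/
```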