All HTTPS requests and subdomains match the default site

  • fakemoth
  • 12/10/07
  • Offline
Posted: Sun, 2010-04-18 00:17

Hello, I keep looking for some answers here; I found a lot of things, but to no avail. All the HTTPS websites are in fact loading the default website. That's wrong, because I reviewed them all and "SSL website enabled?" is not checked for any of them. Also, any subdomain like mail.xxxxxx.ro seems to be working :o on both HTTP and HTTPS, and loads the default domain.

And everything is indexed in Google including Webmin and Usermin authentication page...

What files do you need posted? It seems to be a recent problem for me, after a few upgrades, resets, restores and so on; BTW I've had the same Apache and BIND config for years with no problems.

EDIT: mkey... disabled SSL for the main site, and now the web server is still loading an empty page for the others. Checked the Apache virtual servers - so I have these two virtual servers trying to serve files from /var/www/html:

Virtual Server  Any     8443    Automatic   /var/www/html
Virtual Server  Any     443     Automatic   /var/www/html

Your server has been giving Internal Server Errors since yesterday; it's a nightmare to post :)

It will be an SEO disaster. How do I redirect the SSL sites from Virtualmin to their corresponding normal HTTP websites? All of them, I mean?


RE: SSL sites

  • JamieCameron
  • 10/23/08
  • Offline
  • Wed, 2010-04-21 12:22

Can you give us some example URLs?

You may be seeing a side-effect of the way SSL sites work. If you have only a single IP, you can typically only have a single SSL website. But if you have multiple non-SSL sites and try to access one of those non-SSL sites using an https URL, you will get the page for the site with SSL..
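A sketch of why that happens, with hypothetical domains and IP: before SNI, Apache has to pick the SSL virtual host from the IP and port alone, since the Host header only becomes readable after the TLS handshake. So every https:// request to the shared IP lands on the first *:443 virtual host, whatever hostname was asked for:

```apache
# First (and only usable) SSL vhost on this IP: answers ALL
# https requests to 1.2.3.4, regardless of the requested hostname.
<VirtualHost 1.2.3.4:443>
    ServerName ssl-site.example
    DocumentRoot /home/ssl-site/public_html
    SSLEngine on
</VirtualHost>

# A name-based non-SSL site: only selectable over plain http,
# where the Host header is visible to Apache.
<VirtualHost 1.2.3.4:80>
    ServerName other-site.example
    DocumentRoot /home/other-site/public_html
</VirtualHost>
```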


Thanks, I know (1IP per

  • fakemoth
  • 12/10/07
  • Offline
  • Wed, 2010-04-21 13:11

Thanks, I know (1 IP per server, BTW), but the problem is that SSL is disabled in Virtualmin for all the websites, including the default one. It just loads a blank page from /var/www/html.

As for mail.i-ware.ro, it's the same as i-ware.ro; but the problem is that all the other subdomains, like mail.xxxxx.ro, from that server in fact load i-ware.ro...

I'm pretty confused - am I missing something related to the module settings?


Re: SSL

  • JamieCameron
  • 10/23/08
  • Offline
  • Wed, 2010-04-21 13:20

Ok .. so what do you want to happen when users access your different websites in SSL mode?


Can i redirect easily and

  • fakemoth
  • 12/10/07
  • Offline
  • Thu, 2010-04-22 01:10

Can I redirect easily, and with no consequences to Webmin/Virtualmin/Usermin, all of the https websites to their corresponding http websites?


Re: SSL redirect

  • JamieCameron
  • 10/23/08
  • Offline
  • Thu, 2010-04-22 01:15

No .. there's no way to do this. Even if you somehow set up your default site to do the redirect, users would get SSL certificate errors, as it would only be able to serve an SSL cert for your default domain.
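For reference, a catch-all https-to-http redirect of the kind mentioned here would look roughly like this in the default site's configuration - a sketch only, and the certificate warning still appears before the redirect is ever sent:

```apache
# In the default (first) *:443 virtual host. Browsers get the
# cert warning first; only after accepting it do they see the 301.
RewriteEngine on
RewriteCond %{HTTPS} on
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```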

If you don't have SSL enabled, why are users linking to the https: URLs in the first place?


Thanks, so I'll leave it as

  • fakemoth
  • 12/10/07
  • Offline
  • Thu, 2010-04-22 01:18

Thanks, so I'll leave it as it is. It's not about the users, but about the search engines as stated before. Now is there something I can do about the mail.xxxxx.ro stuff?


Re: SSL

  • JamieCameron
  • 10/23/08
  • Offline
  • Thu, 2010-04-22 10:56

Search engines wouldn't crawl the SSL site unless there are links to it somewhere though, right?

As for the mail.domain.com issue, that is really the same problem .. every domain you create in Virtualmin has several DNS records like www, ftp, mail and so on. These all resolve to the domain's IP. However, only www.domain.com and domain.com have a website associated with them.
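The point can be seen in a typical Virtualmin-generated zone, sketched here with a hypothetical domain and IP - all the records resolve to the same address, but only the first two names have an Apache virtual host behind them:

```
; sketch of a Virtualmin-style zone fragment (hypothetical values)
domain.com.        IN  A   1.2.3.4
www.domain.com.    IN  A   1.2.3.4
mail.domain.com.   IN  A   1.2.3.4   ; resolves, but no vhost:
ftp.domain.com.    IN  A   1.2.3.4   ; these fall through to the
                                     ; default site on that IP
```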


robots.txt

  • fakemoth
  • 12/10/07
  • Offline
  • Fri, 2010-04-23 01:41

I made most of the websites and I didn't insert any links anywhere... OK, so since we can't fix this by directly editing Apache directives or BIND records, our only chance to remove the duplicate content from the search engines seems to be robots.txt. I will try this, with a second file for https:

  1. Make sure mod_rewrite is enabled.
  2. Create a second robots file, called robots_ssl.txt.
  3. Disallow all robots in this new robots file:

User-agent: *
Disallow: /

  4. Upload that new robots file to the root of the domain.
  5. Add the following to the .htaccess file:

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
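If .htaccess turns out to be awkward, the same port-based rewrite can also live in the SSL virtual host configuration itself - a sketch, not the exact Virtualmin-generated block:

```apache
# Inside the *:443 virtual host: serve the disallow-all file
# whenever robots.txt is requested over SSL.
RewriteEngine on
RewriteRule ^/robots\.txt$ /robots_ssl.txt [L]
```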

Is this correct? Now, how do I stop the bots from also indexing the Webmin and Usermin authentication pages on their corresponding ports? The same recipe?


Robots.txt

  • JamieCameron
  • 10/23/08
  • Offline
  • Fri, 2010-04-23 11:36

The next Webmin / Usermin releases will include a robots.txt file, to block indexing by search engines..