
Prevent Google from indexing a duplicate content subdomain without a valid security certificate

@Connie744

Posted in: #Google #Https #Indexing #Subdomain

I have an SSL for a subdomain. The non www version of the website works fine: subdomain.example.com. But the www version is a problem: www.subdomain.example.com.
The issue with the www version is that Google is indexing it, and visitors clicking on those links land on a page telling them the site cannot be trusted.

I asked my hosting provider if they could put a redirect from the www version to the non www version. This was their reply:


Before the redirect even occurs your browser is relying on being able to make a connection via https to topproperty.thevirtual.network. Because this can't be done, the redirect doesn't even get a chance to take place.


So in other words a redirect cannot be done.

If the answer is to buy an SSL covering both the www and non www versions of the subdomain, my hosting provider said that cannot be done.

I researched this further and looked at Google's own subdomain news.google.com.au. This works as it should, BUT the www version www.news.google.com.au comes up with a warning page. So it seems that a redirect won't work, and that an SSL cannot be purchased for a subdomain covering both the www and non www versions.

I think I am only left with stopping Google from indexing the www version of my subdomain. Does anyone know how I can do this?

1 Comment


@Heady270

Your host is correct that users would need to accept the security certificate before the redirect takes effect. However, the redirect would prevent Google from indexing the content there. So while there are problems with the redirect, it is still a good idea.
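
If rewrite rules can be added for the www hostname (for example in an .htaccess file on an Apache server), the redirect itself is only a few lines. This is a minimal sketch using the placeholder hostnames from the question and assuming Apache with mod_rewrite is available; your host's setup may differ:

# Send any request for the www version of the subdomain to the non www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.subdomain\.example\.com$ [NC]
RewriteRule ^(.*)$ https://subdomain.example.com/$1 [R=301,L]

As your host points out, a browser still has to complete the HTTPS connection to the www hostname before it ever sees the 301, so visitors who click an indexed www link will still see the certificate warning; the redirect mainly tells Google which URL to keep.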

The other solution would be to use rel="canonical" link tags to tell Google the preferred URL for each page. page.html might have the following in the <head> section:

<link rel="canonical" href="http://subdomain.example.com/page.html">


That way Googlebot would know that it should be indexing the version on the subdomain without the www, even if it found it could access the page at the duplicate URL with the www prefix.
