How to stop Google from showing erroneous and stale subdomain page in sitelinks

@Rambettina238

Posted in: #GoogleSearch #GoogleSearchConsole #Seo #Sitelinks #Subdomain

For our domain, example.com, we have two subdomains: a0.example.com and a1.example.com. They are meant only for product images -- images stored there are accessed like a0.example.com/var1/var2/prod_id_img_1.jpg and linked from products listed under example.com. There is no other use of these subdomains.

Before release, we had a page at (a0|a1).example.com saying 'Coming Soon', which we later forgot to remove (since it is never linked from our main site).

Now we find that when we search for our domain name, 'example', the first sitelink is 'Coming Soon' and links to a0.example.com (the other five sitelinks shown are valid).

Our questions are:

1. How do I properly get rid of this page so that it no longer appears in Google sitelinks (or any other search results)? Should I simply delete it so requests return a 404, serve a 410 instead, or do I need to do something else for a permanent removal?
2. Scanning the nginx logs, I notice that Googlebot periodically requests sitemaps under (a0|a1).example.com. Am I missing some explicit configuration that causes Googlebot to look for sitemaps there, or is this behaviour standard and nothing to worry about?
3. In Search Console, we have two properties for example.com. One is not configured, but the other is and is working fine. Do I need to do anything with the unconfigured property that may help with our (a0|a1).example.com usage?


Thank you in advance for your help.


1 Answer

@Nimeshi995

You have a few options.

1] Remove the page and allow all requests for it to return a 404 error. However, if this is the index.html in the root of each sub-domain, I would highly recommend not doing this; keep a file in place for security reasons (an index file prevents the directory root from being listed).
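Since your logs mention nginx, here is a minimal sketch of serving a 410 (Gone) instead of a 404 for the old page, which is a stronger "permanently removed" signal to crawlers. This assumes the 'Coming Soon' page is /index.html in each sub-domain's server block; the server_name values are illustrative.

server {
    server_name a0.example.com a1.example.com;

    # 410 tells crawlers the page is gone for good (vs. 404 "not found")
    location = /index.html {
        return 410;
    }
}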

2] Exclude the page via a robots.txt file in each sub-domain's web root. Note that robots.txt prevents crawling but does not by itself remove an already-indexed URL. Here is example code to exclude the index.html file within each sub-domain web root.

User-agent: *
Disallow: /index.html


3] Use a noindex meta tag in the HTML head. Here is example code to place within the HTML <head> element.

<meta name="robots" content="noindex">
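Since these sub-domains mostly serve images, which have no HTML <head> to carry a meta tag, an alternative sketch (assuming nginx) is the X-Robots-Tag response header, which applies the same noindex signal to any file type:

server {
    server_name a0.example.com a1.example.com;

    # Same effect as the meta tag, but works for images and other non-HTML files
    add_header X-Robots-Tag "noindex" always;
}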


4] Redirect any request for the page to the main site's home page. Here is example code to do just that; you can put it in an .htaccess file within each sub-domain web root (assuming Apache -- note that the redirect target must be an absolute URL, including the scheme).

Redirect permanent /index.html https://www.example.com/
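Since the question mentions nginx rather than Apache, a minimal equivalent sketch for an nginx server block would be (target URL illustrative):

location = /index.html {
    return 301 https://www.example.com/;
}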


