Request Removal of naked domain from Google Index
I have a site which was temporarily available at both example.com and www.example.com. All traffic to example.com is now redirected to www.example.com; however, during the brief period that the site was available at the naked domain, Google indexed it.
So Google now has two versions of every page indexed:
www.example.com
www.example.com/about_us
www.example.com/products/something
...
and
example.com
example.com/about_us
example.com/products/something
...
For obvious reasons, this is a bad situation, so how can I best resolve it? Should I request removal of these pages from the index? There is still content at these URLs, but they now redirect to the www subdomain equivalent.
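Before requesting any removals, it's worth confirming that every naked-domain URL really does answer with a permanent 301 (not a temporary 302). A minimal sketch using Python's requests library (an assumption; any HTTP client would do), with hypothetical paths standing in for the site's real URLs:

```python
# Sketch: confirm that naked-domain URLs 301-redirect to the www host.
# Requires "requests" (pip install requests); the paths below are
# hypothetical stand-ins for the site's actual pages.
import requests

PATHS = ["/", "/about_us", "/products/something"]

for path in PATHS:
    r = requests.get("http://example.com" + path, allow_redirects=False)
    target = r.headers.get("Location", "")
    ok = r.status_code == 301 and target.startswith("http://www.example.com")
    print(f"{path}: {r.status_code} -> {target} {'OK' if ok else 'CHECK'}")
```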
The site has many hundreds of pages, but the only way I can see to request removal is via the Remove outdated content screen in Webmaster Tools, one URL at a time.
How can I request removal of an entire domain (i.e. the naked domain) without it affecting the true site located at the www subdomain? Is this the correct strategy, given that all the naked-domain URLs now redirect to their www equivalents?
Google Webmaster Tools has a setting for it.
Sign in to Webmaster Tools
Add, verify, and select your website
Use the gear icon and select "Site Settings"
From "Preferred Domain", select "Display URLs as example.com
Your 301 redirect solution is correct, but it may take Googlebot a couple of weeks to recrawl your site fully and change the search results. I'd expect the setting in Webmaster Tools to take effect within a couple of days at most.
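For reference, here is a minimal sketch of that kind of host-level 301 redirect, written as a small Flask app (an assumption; the post doesn't say what serves the site, and most web servers can do this in their native config):

```python
# Sketch of a host-level 301 redirect from the naked domain to www.
# Flask is an assumed stack; the canonical host name is hypothetical.
from flask import Flask, redirect, request

app = Flask(__name__)

CANONICAL_HOST = "www.example.com"

@app.before_request
def redirect_naked_domain():
    # If the request arrived on the naked domain, issue a permanent
    # (301) redirect to the same path on the www subdomain.
    if request.host != CANONICAL_HOST:
        url = request.url.replace(request.host, CANONICAL_HOST, 1)
        return redirect(url, code=301)

@app.route("/")
def index():
    return "Hello from the canonical host"
```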
I also wouldn't worry too much about within-site canonicalization these days. Google and Googlebot are much better about detecting duplicate content caused by:
www vs naked domain
/ vs /index.html
directory/ vs no trailing slash
The only time I would make it a top priority to fix on-site canonicalization issues is when they prevent Googlebot from crawling the site fully, for example when session ID parameters in the URL mean Googlebot gets a different URL every time it visits a page.
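To illustrate that crawl problem: the same page is reachable through ever-changing session-ID URLs, and stripping the session parameter yields one canonical URL. A sketch using Python's standard urllib.parse; the parameter name "sessionid" is hypothetical:

```python
# The same page reached via ever-changing session-ID URLs looks like
# many different URLs to Googlebot; dropping the session parameter
# collapses them to one canonical URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url, drop_params=("sessionid",)):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in drop_params]
    return urlunsplit(parts._replace(query=urlencode(query)))

# Googlebot would otherwise treat these as two different pages:
print(canonicalize("http://www.example.com/products?sessionid=abc123"))
print(canonicalize("http://www.example.com/products?sessionid=zzz999"))
# both print: http://www.example.com/products
```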
10 years ago, Google didn't do a great job of dealing with the duplicate content caused by these issues. You might see:
Duplicate results in the SERPs
Penalties for duplicate content
PageRank split between two versions of the same page
Today, when Google finds duplicate content like this, it generally just chooses one of the two URLs and treats all links to either version as if they pointed to that URL. It doesn't cause penalties. It doesn't cause a loss of rankings. It doesn't cause any problems other than users seeing a form of the URL that you might not prefer. See: What is duplicate content and how can I avoid being penalized for it on my site?