Website restructuring: 301 redirect or complete de-index?
I am managing an e-store site, created by another colleague, that was badly built from the start.
Because of the site structure, the pagination, and the user browsing options, combined with a lack of meta robots "noindex" directives on certain pages, it ended up with over 20k pages indexed in Google.
I am working on a website restructure to apply all the SEO meta robots directives correctly, and to remove many of the browsing options, in order to eliminate tons of duplicated and useless pages.
However, according to Google's webmaster guidelines, doing bulk 301 redirects of the eliminated URLs to a single page might (most probably will) get them treated as soft 404 errors.
Also, if I remove the pages from the website's internal linking, I will have to manually de-index them one by one in Webmaster Tools, because the crawler will no longer have access to them to read the noindex directives and update the index in time.
As an example:
example.com/products - a list of all products, paginated until page 2000+
example.com/products/a - a list of products starting with the letter A, paginated to about 30+ pages
example.com/products/b - you get the point.
Also:
example.com/products/most-viewed - a list of the most viewed products (the same products), paginated to 2000+
example.com/products/top-rated - the same products, paginated to 2000+
As you can see, there is lots and lots of duplicated content.
I am trying to fix it. So I am implementing rel="next" and rel="prev" for the pagination, roughly like the sketch below.
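The pagination markup I'm adding looks something like this (a minimal sketch for a middle page of the main listing; the ?page= parameter is just an assumed URL scheme):

    <!-- in the <head> of example.com/products?page=5 (assumed URL scheme) -->
    <link rel="prev" href="https://example.com/products?page=4">
    <link rel="next" href="https://example.com/products?page=6">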
I also want to remove some useless pages, for example the alphabetical product browsing. To de-index example.com/products/a, example.com/products/b, and so on, should I:
a. Eliminate them from internal linking completely, and then manually request a URL removal for each one in Webmaster Tools?
b. Keep them on the website and use meta robots to noindex these pages? I would have to leave them in place until the crawler gets around to them and updates all the pages accordingly.
c. Remove them from the website's internal linking but add meta robots noindex,follow; and instead of manually de-indexing them, add them to the URL sitemap and submit it to Google, so that the crawler still knows about them and crawls them to read the noindex directive... even though they are no longer linked from the website. (A sketch of the tag follows these options.)
d. 301 redirect them to either the products page or the home page? But so many bulk redirects to a single page or the homepage will cause problems, and Google can actually treat them as soft 404s. More info here: moz.com/blog/save-your-website-with-redirects. If I were redirecting each one to an equivalent page that would be one thing, but I am trying to remove lots and lots of pages.
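For option (c), the tag on each alphabetical listing page would look something like this (a minimal sketch using the example URLs above):

    <!-- in the <head> of example.com/products/a, /b, etc. -->
    <!-- the page must stay crawlable so Googlebot can read the tag -->
    <meta name="robots" content="noindex, follow">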
What would be the logical approach?
2 Comments
For restructuring e-commerce websites, perhaps not on the same scale as yours, I would do what Evgeniy says above.

What I would also do, from an SEO perspective: if you have any spam/poor/dodgy links pointing to pages you are going to 404, use a 410 response instead - once Google crawls you a couple of times, it'll disregard the poor links to those pages. This has worked for me plenty of times to great effect.

Plus, you don't have to 301 every single page (/a or /b etc.) that needs a 301 (I've got a feeling that's what you are doing, forgive me if I'm wrong); use htaccess, or the rewrite equivalent of whatever technology you're using, to do it by directory - something like the sketch below.

In addition to UX and having a terrific working website, you want to make Google's crawl as easy as possible, so I'd avoid noindexing/URL removal in GWT if possible, and keep your XML sitemap as lean as possible.
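A minimal htaccess sketch of the directory-level approach (assuming Apache with mod_rewrite; the patterns use the example URLs from the question and would need adjusting to the real URL scheme):

    RewriteEngine On

    # serve 410 Gone for the alphabetical browse pages (no external links worth keeping)
    RewriteRule ^products/[a-z]/?$ - [G,L]

    # or, where a whole section has link equity worth saving,
    # 301 everything under it to the main products page in one rule
    RewriteRule ^products/most-viewed(/.*)?$ /products [R=301,L]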
"created by another colleague, that was badly built from the start"
my deepest condolences :)
The main point you should consider is whether the pages you want to get rid of have EXTERNAL links. This should be the basis for your decision to 301 or 404 them:
if they have external links, 301 them to save the link equity
if they have none, or just a few, 404 them.
The procedure should look like this:
create the new, optimized site/URL structure
don't worry about internal linking - you will see some drop in search engine visibility in any case; that is usual with such a big structure optimization. But the drop is temporary, until Google picks up your new structure.
404 all old pages which don't have external links
create counterparts for all old pages which have external links, and 301 those old pages to their new counterparts
nothing to noindex! - create the new sitemap, upload it into Search Console, and Google will pick up your new site structure like a charm (a sitemap sketch follows this list)
be patient! Google will get your new site structure
if you WANT to have such pages, like example.com/product-category1/, i.e. for users, then let the first page be indexed, and noindex the pagination.
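A minimal sketch of the sitemap from the list above (hypothetical URLs; it should list only the pages you want indexed, not the 404'd/410'd ones):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- only the new, canonical URLs belong here -->
      <url><loc>https://example.com/products</loc></url>
      <url><loc>https://example.com/products/some-product</loc></url>
    </urlset>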
Note: rel="prev" and rel="next" go into the head and have nothing to do with indexation; they are only for crawling.
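Putting the last two points together, the head of a paginated category page might look like this (a sketch, assuming a hypothetical ?page= URL scheme; the first page carries no noindex):

    <!-- head of example.com/product-category1/?page=2 -->
    <link rel="prev" href="https://example.com/product-category1/">
    <link rel="next" href="https://example.com/product-category1/?page=3">
    <!-- pages 2+ are noindexed; the first page stays indexable -->
    <meta name="robots" content="noindex, follow">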