Do too many 301 redirects harm SERP rankings?

@Connie744

Posted in: #301Redirect #Seo #Serps

In the last 15 months, our website has undergone 3 major structural changes, which forced us to create thousands of 301 redirects. We used to hold the number 1 slot in the Google SERPs, but over the past 14 months our rankings have fallen substantially and are still falling. Our average position has dropped from number 1 to number 6. Across our range of keywords, all of our competitors lag substantially behind us on SEO factors.

Just to give you a brief idea:


We are much stronger on on-page SEO factors
We are ahead on technical parameters such as AMP, page load speed, and mobile compatibility
We have a high number of backlinks, many of them from high-authority websites


We have spent a lot of time on SEO over the past year but have not been able to regain our top position. Since we appear to be stronger on every SEO factor, the only explanation we can think of is that something went wrong because of the structural changes to our website.

These are the major structural changes we made in the past 15 months.


We changed the URL structure of around 2,500 URLs, so we 301 redirected each old URL to its new URL.
We run an ecommerce site that had around 100,000 product pages, but there was very little difference between them: we had a separate product page for each city we serve, and the only thing that changed from one city to another was the price; all other product information was identical. Suspecting a duplicate-content problem, we merged these pages, bringing the count down from 100,000 to 5,000, and 301 redirected all 100,000 old URLs to the new pages.
There were around 1,000 more pages that were no longer relevant, so we 301 redirected them to the home page. (A spot-check sketch for verifying these redirects follows this list.)
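For illustration only, here is a minimal Python sketch of how a redirect map like ours could be spot-checked. It assumes a hypothetical redirect_map.csv file with old_url and new_url columns (not part of our actual setup) and flags anything that is not a single-hop 301 to the intended target, such as a redirect chain or an unexpected hop to the home page.

# Hypothetical spot check: confirm a sample of old URLs return a single-hop 301
# to the intended new URL. Assumes redirect_map.csv with columns old_url,new_url.
import csv
import random
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise HTTPError instead of following the
    # redirect, so the 301 response itself can be inspected.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

def check_redirect(old_url, expected_url):
    req = urllib.request.Request(old_url, method="HEAD")
    try:
        resp = opener.open(req, timeout=10)
        return "UNEXPECTED: %s with no redirect" % resp.getcode()
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location", "")
        if err.code == 301 and location == expected_url:
            return "OK: single-hop 301 to expected URL"
        return "CHECK: %s -> %s" % (err.code, location or "(no Location header)")

with open("redirect_map.csv", newline="") as fh:
    rows = list(csv.DictReader(fh))

# Check a random sample of 20 mappings rather than all 110,000.
for row in random.sample(rows, k=min(20, len(rows))):
    print(row["old_url"], check_redirect(row["old_url"], row["new_url"]))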


Here is the information visible in the new Google Search Console under Index Coverage > Excluded (screenshot not included):



Search Console currently shows 4,800 pages indexed, and we have around 110,000 URLs in 301 redirects. Please suggest what could be going wrong for us.


1 Comment


@Chiappetta492

This is a tricky one. I'm not sure what has caused Google to deindex so many of your pages. All I can offer are my thoughts.

301 redirects should pass 95-100% of link equity ("page juice"), so this alone shouldn't cause your website to get deindexed from Google. Google seems to think that 45,000 of your pages are duplicate/alternate content. Have you tried using Fetch and Render on some of those pages to see if Google can resolve them with a 200 status header? I would also use Fetch and Render and the Mobile-Friendly Test to see what content Google can view on those pages. Is some of your content missing from Googlebot's view, causing it to treat the page as a duplicate?
www.google.com/webmasters/tools/googlebot-fetch
https://search.google.com/test/mobile-friendly
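As a rough local complement to those tools (it only approximates the status-header portion of what Fetch and Render reports), a quick check of a few of the merged product URLs can be scripted; the sample URLs below are placeholders.

# Rough local approximation of the status check: request a sample of the merged
# product URLs and confirm each answers 200 directly rather than redirecting
# again or erroring out. The sample URLs are placeholders.
import urllib.error
import urllib.request

SAMPLE_URLS = [
    "https://example.com/product/widget",   # replace with real merged product URLs
    "https://example.com/product/gadget",
]

for url in SAMPLE_URLS:
    req = urllib.request.Request(url, method="HEAD", headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            # urlopen follows redirects, so compare the final URL to spot extra hops.
            note = " (redirected to %s)" % resp.geturl() if resp.geturl() != url else ""
            print(url, "->", resp.getcode(), note)
    except urllib.error.HTTPError as err:
        print(url, "->", err.code)
    except urllib.error.URLError as err:
        print(url, "-> request failed:", err.reason)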



The other possible issue is that a Google algorithm update has hit your site hard. This happens to webmasters sometimes: when Google changes its algorithm, some websites get absolutely destroyed in their rankings, as happened with the Panda and Penguin updates. Luckily, there is public information on the general algorithmic changes, so you can review what updates were made; if it looks like you were doing something a new update penalizes, you could potentially change it and regain your rankings. I recommend looking at the algorithm update histories on pages like the following:
moz.com/google-algorithm-change


About 16 months ago, I redirected a website with 50,000 indexed pages from example.com/url/page to example.com/url1 and, within a few months, Google indexed millions of my pages and increased my search traffic significantly. I found that the accompanying content update played a huge role in increasing my rankings: my content improved and the 301s worked.



Ultimately, I'd look closely at whether Google is able to crawl the pages properly with Fetch and Render and the Mobile-Friendly Test. I'd then try to figure out why it thinks these pages are duplicates, and which pages it thinks they duplicate. Is there a way you can add more content to those duplicate pages to encourage Google to crawl and reindex them? Then I would look closely at my own history and try to work out whether some past content or link scheme led an algorithmic update to hit me, fairly or unfairly.

I'd also look at crawl errors: is there any reason Google can't crawl your pages? www.google.com/webmasters/tools/crawl-errors?hl=en&siteUrl=
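One simple crawlability check to run alongside the crawl errors report is confirming that robots.txt does not block Googlebot from the affected paths; the domain and paths below are placeholders for illustration.

# Placeholder example: verify robots.txt allows Googlebot to fetch sample paths.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

for path in ["/product/widget", "/category/widgets"]:
    url = "https://example.com" + path
    allowed = rp.can_fetch("Googlebot", url)
    print(path, "allowed for Googlebot" if allowed else "BLOCKED by robots.txt")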
