Robots.txt destroyed my ranking?

@Gonzalez347

Posted in: #RobotsTxt

I had to make a massive URL change (categories and products) on my e-commerce site while keeping 301 redirects from the old URLs to the new ones. I changed the category URLs (approx. 800 URLs) to new, improved ones and went live with them, but before I could run an automated (scripted) 301 redirect I also needed to finish the new, improved product URLs. To avoid any 404 issues with the old category URLs, I didn't want Google to crawl the site until the new product URLs were live, so I put a robots.txt block on the entire site, thinking that would give me enough time to convert all the products to new URLs while keeping Google away.
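
The block itself was just the standard "disallow everything" rule (something like this):

User-agent: *
Disallow: /

Those two lines tell every crawler to fetch nothing at all.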

It was meant to be a temporary block. But when I set robots.txt back to allow all URLs, things started to go wrong: the site's rankings dropped across the board.

I'm confused about what to do now. Here is where things stand:


The robots.txt now allows all URLs.
301 redirects from the old URLs are in place and working (a quick spot-check is sketched after this list).
The entire site is on the new URLs.
I have submitted new sitemaps.
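
For anyone wanting to check the same thing, a handful of redirects can be spot-checked with a short Python script (a minimal sketch using the third-party requests library; the URL pairs are placeholders):

import requests  # third-party: pip install requests

# Hypothetical old -> new URL pairs; substitute the real ones.
CHECKS = {
    "https://example.com/old-category/widgets": "https://example.com/categories/widgets",
    "https://example.com/old-product/blue-widget": "https://example.com/products/blue-widget",
}

for old, new in CHECKS.items():
    # allow_redirects=False so we see the raw 301 itself, not the target page.
    resp = requests.head(old, allow_redirects=False)
    location = resp.headers.get("Location")
    status = "OK" if resp.status_code == 301 and location == new else "WRONG"
    print(f"{old} -> {resp.status_code} {location} [{status}]")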


I want to get the rankings back to normal, but Google isn't being friendly with the new sitemaps: 99.9% of the URLs in them still say "URLs blocked by robots.txt".
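
For what it's worth, the live robots.txt can be tested from a crawler's point of view with a couple of lines of Python (standard library only; example.com stands in for the real domain):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the live file

# True means the live file lets Googlebot crawl this URL; if so, the
# sitemap warnings are most likely based on Google's cached copy of
# robots.txt and should clear once it refetches the file.
print(rp.can_fetch("Googlebot", "https://example.com/categories/widgets"))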

What did I do wrong, and what's the best way to fix it?


1 Comment


@Speyer207

Never use robots.txt during maintenance

You created the problem by blocking the whole site with robots.txt. robots.txt should only ever be used to tell search engines which parts of a site they may and may not crawl. Because you blocked all URLs, Google likely dropped some of them from its index, which is why you lost rankings on those pages; that process normally takes a couple of weeks to kick in. Older pages generally rank better than newer ones for several reasons (links, age, and so on), and a 301 redirect generally passes almost all of that ranking signal across, but only if the old URLs are still indexed.

Correct methods when doing site maintenance


Use status 503 Service Unavailable (general "down for maintenance" message; see the sketch below)
Use status 307 Temporary Redirect (redirect to a maintenance page)
Use status 401 Unauthorized (temporary password protection)

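A minimal sketch of the 503 approach (Python with Flask, purely for illustration; the maintenance flag and message are placeholders):

from flask import Flask

app = Flask(__name__)
MAINTENANCE = True  # flip to False when the migration is finished

@app.before_request
def maintenance_gate():
    if MAINTENANCE:
        # 503 tells crawlers the outage is temporary, so pages are not
        # dropped from the index; Retry-After hints when to come back
        # (value in seconds).
        return "Down for maintenance", 503, {"Retry-After": "3600"}

@app.route("/")
def home():
    return "Store front"

The point of the 503 is that, unlike a robots.txt block, it signals a temporary condition, so search engines keep the existing URLs indexed and simply retry later.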

Recovering from the problem

You should start by finding out whether all of the OLD URLs have been dropped. If they have, then sadly there isn't much anyone can tell you other than to wait and see whether your rankings return. If Google dropped the old URLs, the new URLs won't rank as well as they would have if you hadn't blocked them with robots.txt; blocking in robots.txt effectively worked like 'NOINDEX' here, telling Google not to index those URLs, which was a big mistake on your part. That said, Google may restore your results depending on how long the URLs were dropped for and on your backlinks: if you have lots of backlinks pointing to the OLD URLs, then as Google discovers the new URLs it should pass that link value across, and in the meantime you just need to be patient.

Finding out if Google has dropped your URLs

Do a site:example.com search and check whether the old URLs have been dropped. You may find some new ones already listed; if so, it's possible the old ones were dropped beforehand. 301s normally take a week or more before the new URLs replace the old ones in the index.
