
Robots.txt got changed to disallow crawling by accident

@Turnbaugh106

Posted in: #GoogleIndex #RobotsTxt

Our robots.txt mistakenly got changed to:

User-agent: *
Disallow: /


This occurred about 2 weeks ago and was changed back yesterday to allow crawling.

The number of our pages listed in Google's index has decreased substantially, though it looks like Google has already started adding search results for the site back since the robots.txt was restored.

Are there long-term repercussions from this, or will everything return to normal once the site is crawled again?
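For reference, the effect of the accidental rule can be checked with Python's standard-library robots.txt parser. This is just an illustrative sketch; `example.com` and the sample paths stand in for the actual site.

```python
# Minimal sketch using Python's stdlib robots.txt parser to show what the
# accidental rule does: "Disallow: /" blocks every path for every crawler.
# example.com and the paths below are placeholders, not the real site.
from urllib import robotparser

accidental = robotparser.RobotFileParser()
accidental.parse([
    "User-agent: *",
    "Disallow: /",
])

# Googlebot (and every other crawler) is blocked from every URL.
print(accidental.can_fetch("Googlebot", "https://example.com/"))          # False
print(accidental.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```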




1 Comment


@Kevin317

If Google has started adding pages again, then you are fine. We had this happen with a few sites that were pushed from development into production without their robots.txt being updated.

In fact, unless you are trying to block pages, I recommend removing robots.txt completely. There's no point in explicitly allowing everything in robots.txt when that is the default behavior.
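That equivalence can be sketched with the stdlib parser as well: an explicit "allow everything" rule set and no rules at all (as when robots.txt is absent) behave the same way. The URL below is a placeholder.

```python
# Hedged sketch: an explicit allow-all robots.txt and an empty rule set
# (as when the file is missing) both permit every fetch, which is why
# an allow-all robots.txt is redundant. example.com is a placeholder.
from urllib import robotparser

allow_all = robotparser.RobotFileParser()
allow_all.parse([
    "User-agent: *",
    "Disallow:",      # empty Disallow value = allow everything
])

no_rules = robotparser.RobotFileParser()
no_rules.parse([])    # no rules at all, as if robots.txt did not exist

url = "https://example.com/some/page"
print(allow_all.can_fetch("Googlebot", url))  # True
print(no_rules.can_fetch("Googlebot", url))   # True
```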


