Robots.txt got changed to disallow crawling by accident
Our robots.txt mistakenly got changed to:
User-agent: *
Disallow: /
This occurred about 2 weeks ago and was changed back yesterday to allow crawling.
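To double-check both the broken and the reverted state, Python's standard-library robots.txt parser can evaluate a set of rules directly (a minimal sketch; the URL and user agent are placeholders, not our actual site):

```python
from urllib.robotparser import RobotFileParser

# The accidental rules that blocked all crawlers
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])
print(blocked.can_fetch("Googlebot", "https://example.com/any-page"))  # False

# An empty (or absent) robots.txt allows everything again
allowed = RobotFileParser()
allowed.parse([])
print(allowed.can_fetch("Googlebot", "https://example.com/any-page"))  # True
```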
The number of our pages listed in Google has decreased substantially, though it looks like Google has already started adding search results for the site back since the robots.txt was reverted.
Are there long-term repercussions of this happening, or will everything get back to normal once the site is crawled again?
1 Comment
If Google has started adding pages again, then you are fine. We had this happen with a few sites that were pushed from development into production and their robots.txt was not changed.
In fact, unless you are trying to block pages, I recommend removing robots.txt completely. There's no point in a robots.txt that allows everything when allowing everything is already the default behavior.
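For reference, these states are effectively equivalent for crawlers: no robots.txt at all, an empty file, or an explicit allow-all like the sketch below (an empty Disallow value means nothing is disallowed):

```
User-agent: *
Disallow:
```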