Google is not crawling and indexing my site after updating my robots.txt file

@Yeniel560

Posted in: #Google #GoogleSearchConsole #RobotsTxt

I have an issue with Google not being able to properly crawl my site. I have read other questions where people have had the same issue. I've tried to follow their solution of using this in my robots.txt file:

User-agent: *
Disallow:

Sitemap: www.sonjalimone.com/sitemap_index.xml
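
For reference, here is a quick way to sanity-check those rules locally; just a minimal sketch using Python's standard urllib.robotparser, and the test URLs below are only examples:

# Minimal sketch: confirm that the robots.txt rules above allow crawling.
# Standard library only; the test URLs are examples, not specific pages.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.sonjalimone.com/robots.txt")
rp.read()

# An empty "Disallow:" under "User-agent: *" allows everything,
# so both of these should print True.
print(rp.can_fetch("Googlebot", "https://www.sonjalimone.com/"))
print(rp.can_fetch("*", "https://www.sonjalimone.com/some-page/"))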

I have waited over 24 hours for Google to recrawl my site, so I assume something must be wrong in the robots.txt file. It is a WordPress site, if that makes any difference, though I don't see why it would.

Does anyone know what else might cause this issue or is there something wrong with the above?


3 Comments


@Heady270

Google will fetch the robots.txt file itself from your site every 24 hours. If you make changes to your robots.txt file, you must wait a day to ensure that Googlebot picks up your changes.

After it has the correct robots.txt file, Googlebot will start crawling and indexing your entire site properly. As a general rule, I expect to see changes to the documents that Google indexes in about two weeks. If you have a large site, the deeper pages may take as much as a month or two to get recrawled.

Use the Fetch as Google feature ("Crawl" -> "Fetch as Google" in Google Webmaster Tools) to confirm that Googlebot is able to download the pages you expect it to reach.

You can also use the Blocked URLs tool under "Crawl" -> "Blocked URLs" in Google Webmaster Tools to ensure that Google is seeing the correct version of your robots.txt file and that it can crawl any URL that you specify in that tool.
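
If you want a quick check outside of Webmaster Tools, you can also request the live robots.txt yourself with a Googlebot user agent and compare it against the file you uploaded. A minimal sketch using only Python's standard library (the domain is the one from the question):

# Minimal sketch: fetch the live robots.txt the way a crawler would,
# to confirm the server is returning the version you edited.
import urllib.request

req = urllib.request.Request(
    "https://www.sonjalimone.com/robots.txt",
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)           # expect 200
    print(resp.read().decode())  # should match the file you uploaded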


@Becky754

If you do not want to block any bots, then I recommend removing robots.txt entirely. In my experience it often causes more harm than good because of mistakes in the file. In the absence of robots.txt, most search engines will crawl your entire site.

Then use Google's webmaster tools to submit a sitemap.
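
If you do delete the file, it is worth confirming that the URL now returns a 404, since most crawlers treat a missing robots.txt as permission to crawl everything (and WordPress can serve a generated robots.txt even when no physical file exists). A minimal sketch using Python's standard library, with the domain taken from the question:

# Minimal sketch: confirm robots.txt is really gone after deleting it.
# A 404 means most crawlers will treat the whole site as crawlable.
import urllib.error
import urllib.request

try:
    with urllib.request.urlopen("https://www.sonjalimone.com/robots.txt") as resp:
        print("robots.txt is still being served, status", resp.status)
except urllib.error.HTTPError as err:
    print("robots.txt request returned", err.code)  # 404 is expected after removal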


@Kevin317

24 hours is not enough time to wait. Google will almost certainly not be crawling your site that frequently. It could take days or weeks before Google re-fetches your robots.txt and crawls your site again.

And to answer the question you will have next, no, there is nothing you can do to speed that up.
