Google Search Console: Blocked by robots (Screenshot)
I just discovered that there was a huge decrease in the number of pages blocked by robots.txt in my Search Console. It is an online shop, so I blocked many product pages and filters a while back to avoid duplicate content. I wonder why this change has occurred, and I can't really nail it down.

Can this affect my SEO? I've seen the Ahrefs visibility index go down a little. What can I do to clean it up?
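For context, the blocking is done with rules along these lines in robots.txt (the paths here are only illustrative, not my exact rules):

    User-agent: *
    # hypothetical examples of the kind of rules in place
    Disallow: /products/duplicate-colour-variants/
    Disallow: /*?filter=
    Disallow: /*?sort=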
2 Comments
When Google says "blocked by robots.txt" they actually mean "URLs we have wanted to crawl recently, but couldn't because they were blocked by robots.txt".
The decrease could be because Google already knows about those URLs and knows not to crawl them, or because it is no longer discovering new URLs that are blocked.
There is also a "Note" on the graph right where the decline starts. Google adds these notes when it changes algorithms in a way that can change how the data on the graph looks going forward. It may also be that the algorithm change means Google no longer wants to crawl many of the pages you have blocked with robots.txt.
Google Search Console lags quite a bit in its reporting. I have over 1,000 errors that have been sitting there for more than 6 months, but when I test the robots.txt in GSC everything looks normal. Have you tried the Index Status report? Do you see the same decline in indexed pages over the same time frame?
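If you want to double-check the rules outside GSC, a quick local test with Python's standard-library robots.txt parser will show which URLs are currently disallowed for Googlebot. This is just a sketch; the hostname and paths below are placeholders to swap for your own:

    # Check which URLs the live robots.txt blocks for Googlebot.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://example.com/robots.txt")  # placeholder domain
    parser.read()  # fetches and parses the live robots.txt

    for url in [
        "https://example.com/products/red-shirt",      # placeholder product page
        "https://example.com/category?filter=size-m",  # placeholder filter URL
    ]:
        blocked = not parser.can_fetch("Googlebot", url)
        print(url, "-> blocked" if blocked else "-> allowed")

If the URLs you expect to be blocked come back "allowed" here, the drop in the report may simply reflect a change in your robots.txt rather than anything on Google's side.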