Can't remove blocks on my resources in Google Webmaster Tools

@Martha676

Posted in: #Google #GoogleSearchConsole #Resources #RobotsTxt #Wordpress

I received an alert from Google Webmaster Tools a couple weeks ago that stated:

"Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings."

I looked through my robots.txt file to possibly remove some of the blocks (it's a WordPress site, if that matters):

User-agent: *
Allow: /
Disallow: /wp-admin/
Host: onelittledesigner.com
Sitemap: onelittledesigner.com/sitemap.xml

However, despite that change I'm still finding almost 400 blocked resources, including many that Google needs to properly render my page (e.g. stylesheets, scripts, images).
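For anyone debugging the same situation, one quick way to check what a given robots.txt actually blocks is to test representative URLs against it, for example with Python's built-in urllib.robotparser. This is only a rough sketch; the asset paths below are placeholders, not the actual blocked files from the report.

    from urllib.robotparser import RobotFileParser

    # Parse the live robots.txt (URL assumed from the question)
    rp = RobotFileParser()
    rp.set_url("https://onelittledesigner.com/robots.txt")
    rp.read()

    # Check a few representative asset URLs against Googlebot's rules.
    # Substitute real URLs from the "Blocked Resources" report.
    test_urls = [
        "https://onelittledesigner.com/wp-content/themes/mytheme/style.css",
        "https://onelittledesigner.com/wp-includes/js/jquery/jquery.js",
        "https://onelittledesigner.com/wp-admin/admin-ajax.php",
    ]

    for url in test_urls:
        allowed = rp.can_fetch("Googlebot", url)
        print(url, "->", "allowed" if allowed else "blocked")

If the URLs from the report all come back allowed against this file, the blocks are likely coming from somewhere else, such as resources served from another domain with its own robots.txt.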



I'm not sure what could be blocking the resources or how to unblock them. The only instruction Google gave was to fix my robots.txt file, but it appears the files are being blocked in some other way. Any ideas how to fix this?


2 Comments


@Sent6035632

You can ask Google to re-crawl your website at any time with Google Search Console. Navigate to Crawl > Fetch as Google and you should see an input field for a URL. Enter the URL and hit Fetch or Fetch and Render, depending on your needs. This sends Googlebot back to your website to recrawl that page.

I typically do this for all new articles after posting, and also whenever I update publish dates or change the on-page content of a page or post.


@Jessie594

When this sort of issue arises, it is important to note that search engines don't crawl your website in real time. Depending on the size of your site and the importance the search engine assigns to it, it may be re-indexed anywhere from every few hours (for extremely large, very high-value sites such as Stack Exchange) all the way to the other extreme of every 4-6 weeks (for very small websites). You can help the process along by re-submitting an updated sitemap to Google or by using the Fetch as Google tool, which adds the amended pages to the queue for re-indexing; as a general rule, though, there is not much you can do to force the re-index before your site's turn in the queue. In these instances all you can really do is be patient and wait.
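As a small illustration of the sitemap re-submission mentioned above, Google has long accepted a simple HTTP "ping" carrying the sitemap URL. This is a minimal sketch, assuming the sitemap lives at the address listed in the question's robots.txt; a successful response only means the ping was received, not that a re-crawl has happened.

    import urllib.parse
    import urllib.request

    # Sitemap URL assumed from the robots.txt in the question
    sitemap = "https://onelittledesigner.com/sitemap.xml"

    # Google's sitemap ping endpoint
    ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")

    with urllib.request.urlopen(ping) as resp:
        print(resp.status)  # 200 = ping accepted

Even after a successful ping, the point above stands: the actual re-crawl happens on Google's schedule, not yours.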

I noted that you mentioned the re-index and the reduction in errors after the change to your robots.txt file; however, Google rarely re-indexes a whole site at the same time. Rather, it does so in bits and pieces over a period of time. The errors you are still seeing will be historical errors that have not yet been resolved because the whole site has not yet been completely re-indexed.

Allow 4-6 weeks and you should see everything clear up then.

(Thanks to @closetnoc for the starting point)
