
"Is robots.txt blocking important pages?" in Search Console regarding static folder

@Vandalay111

Posted in: #Googlebot #RobotsTxt #Seo #WebCrawlers

This is the contents of my robots.txt file:

User-agent: *
Disallow: /static/*
Disallow: /templates/*
Disallow: /translations/*
Sitemap: xx.xxx.com/sitemap.xml

However, I don't know why I am getting this error in Google Search Console:



Maybe it's due to the static folder? So the question is: should I remove the static folder rule from robots.txt?
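For what it's worth, you can check what a given robots.txt blocks with Python's standard-library `urllib.robotparser`. This is just an illustrative check: the example.com URLs are placeholders, and the trailing `*` is dropped because `robotparser` does plain prefix matching rather than Google's wildcard syntax (for Googlebot, `/static/` and `/static/*` match the same paths anyway).

```python
from urllib import robotparser

# The rules from the question, minus the trailing "*" (urllib.robotparser
# does simple prefix matching and does not implement wildcard patterns;
# for Googlebot the two forms match the same URLs).
RULES = """\
User-agent: *
Disallow: /static/
Disallow: /templates/
Disallow: /translations/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# A stylesheet under /static/ is blocked for every crawler, Googlebot included.
print(rp.can_fetch("Googlebot", "https://example.com/static/style.css"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))        # True
```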


1 Comment


 

@Candy875

You shouldn't block Googlebot from accessing your CSS, JavaScript, or image files.


Blocked Resources Report - Search Console Help


JavaScript, CSS, and image files should be available to Googlebot so that it can see pages like an average user.

Updating our technical Webmaster Guidelines


Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.
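Following that advice, a minimal revision would be to drop the /static/ rule so Googlebot can fetch your CSS and JavaScript, while keeping the other two. This is a sketch assuming your stylesheets and scripts live under /static/; adjust to your actual layout. Also note that the Sitemap directive should be an absolute URL, including scheme and host:

```
User-agent: *
Disallow: /templates/
Disallow: /translations/

# Sitemap must be a full, absolute URL:
Sitemap: https://xx.xxx.com/sitemap.xml
```

If only some files under /static/ need hiding, you could instead keep the Disallow and add more specific Allow rules for the CSS/JS paths, since Googlebot honors the most specific matching rule.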
