: "Is robots.txt blocking important pages?" in Search Console regarding static folder This is the contents of my robots.txt file: User-agent: * Disallow: /static/* Disallow: /templates/* Disallow: /translations/*
This is the contents of my robots.txt file:
User-agent: *
Disallow: /static/*
Disallow: /templates/*
Disallow: /translations/*
Sitemap: xx.xxx.com/sitemap.xml
However, I don't know why I am getting this error in Google Search Console: "Is robots.txt blocking important pages?"
Maybe it's due to the static folder? So the question is: should I remove the static folder from robots.txt?
You shouldn't block Googlebot from accessing your CSS.
From the Search Console Help article "Blocked Resources Report":

"JavaScript, CSS, and image files should be available to Googlebot so that it can see pages like an average user."

From the Google Webmaster Central Blog post "Updating our technical Webmaster Guidelines":

"Disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings."
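Here is a minimal sketch of a revised robots.txt, assuming your CSS, JavaScript, and images live in subfolders of /static/ (the /static/css/, /static/js/, and /static/img/ paths are illustrative; adjust them to your actual layout). Google follows the most specific (longest) matching rule, so the Allow lines below override the broader Disallow:

# Hypothetical asset paths - let Googlebot fetch what it needs to render pages
User-agent: *
Allow: /static/css/
Allow: /static/js/
Allow: /static/img/
Disallow: /static/
Disallow: /templates/
Disallow: /translations/

# The Sitemap directive should be a full URL, including the scheme
Sitemap: https://xx.xxx.com/sitemap.xml

Two smaller points: the trailing /* wildcards in your original file are redundant, since Disallow: /static/ already matches everything under that folder; and if /static/ contains nothing sensitive, the simplest fix is to drop its Disallow line entirely rather than carve out Allow exceptions.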